Mar 11 11:58:32 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 11 11:58:32 crc restorecon[4763]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 11 11:58:32 crc restorecon[4763]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc 
restorecon[4763]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 11:58:32 crc 
restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 11 
11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 11:58:32 crc 
restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 11 11:58:32 crc 
restorecon[4763]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 
crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 
11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 11 11:58:32 crc 
restorecon[4763]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc 
restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc 
restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:32 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 
crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc 
restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc 
restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc 
restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 11:58:33 crc restorecon[4763]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 11:58:33 crc 
restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 11:58:33 crc restorecon[4763]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 11 11:58:33 crc restorecon[4763]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 11 11:58:33 crc restorecon[4763]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 11 11:58:33 crc kubenswrapper[4816]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 11 11:58:33 crc kubenswrapper[4816]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 11 11:58:33 crc kubenswrapper[4816]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 11 11:58:33 crc kubenswrapper[4816]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 11 11:58:33 crc kubenswrapper[4816]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 11 11:58:33 crc kubenswrapper[4816]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.871997 4816 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882317 4816 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882355 4816 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882365 4816 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882376 4816 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882386 4816 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882400 4816 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882411 4816 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882422 4816 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882432 4816 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 
11:58:33.882442 4816 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882453 4816 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882463 4816 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882475 4816 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882484 4816 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882493 4816 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882503 4816 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882513 4816 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882522 4816 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882532 4816 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882541 4816 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882550 4816 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882560 4816 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882575 4816 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882586 4816 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882598 4816 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882608 4816 feature_gate.go:330] unrecognized feature gate: Example Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882619 4816 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882663 4816 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882674 4816 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882682 4816 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882689 4816 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882698 4816 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882706 4816 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882714 4816 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882722 4816 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882729 4816 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882737 4816 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882745 4816 feature_gate.go:330] unrecognized feature 
gate: MultiArchInstallAzure Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882752 4816 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882760 4816 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882768 4816 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882775 4816 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882784 4816 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882791 4816 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882799 4816 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882806 4816 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882814 4816 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882822 4816 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882829 4816 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882837 4816 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882844 4816 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882853 4816 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882865 4816 
feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882874 4816 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882882 4816 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882890 4816 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882897 4816 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882908 4816 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882916 4816 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882926 4816 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882938 4816 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882949 4816 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882958 4816 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882969 4816 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882977 4816 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882986 4816 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.882995 4816 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.883003 4816 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.883012 4816 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.883020 4816 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.883028 4816 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884209 4816 flags.go:64] FLAG: --address="0.0.0.0" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884238 4816 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884294 4816 flags.go:64] FLAG: --anonymous-auth="true" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884306 4816 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884318 4816 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 11 11:58:33 crc 
kubenswrapper[4816]: I0311 11:58:33.884327 4816 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884339 4816 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884352 4816 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884363 4816 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884375 4816 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884390 4816 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884403 4816 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884416 4816 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884427 4816 flags.go:64] FLAG: --cgroup-root="" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884439 4816 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884451 4816 flags.go:64] FLAG: --client-ca-file="" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884462 4816 flags.go:64] FLAG: --cloud-config="" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884473 4816 flags.go:64] FLAG: --cloud-provider="" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884489 4816 flags.go:64] FLAG: --cluster-dns="[]" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884503 4816 flags.go:64] FLAG: --cluster-domain="" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884513 4816 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884522 4816 flags.go:64] FLAG: 
--config-dir="" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884531 4816 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884542 4816 flags.go:64] FLAG: --container-log-max-files="5" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884553 4816 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884563 4816 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884572 4816 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884581 4816 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884591 4816 flags.go:64] FLAG: --contention-profiling="false" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884601 4816 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884610 4816 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884620 4816 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884629 4816 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884641 4816 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884650 4816 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884660 4816 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884668 4816 flags.go:64] FLAG: --enable-load-reader="false" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884678 4816 flags.go:64] FLAG: --enable-server="true" Mar 11 11:58:33 crc 
kubenswrapper[4816]: I0311 11:58:33.884687 4816 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884701 4816 flags.go:64] FLAG: --event-burst="100" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884710 4816 flags.go:64] FLAG: --event-qps="50" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884719 4816 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884729 4816 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884738 4816 flags.go:64] FLAG: --eviction-hard="" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884749 4816 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884759 4816 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884769 4816 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884778 4816 flags.go:64] FLAG: --eviction-soft="" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884787 4816 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884796 4816 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884806 4816 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884815 4816 flags.go:64] FLAG: --experimental-mounter-path="" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884824 4816 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884834 4816 flags.go:64] FLAG: --fail-swap-on="true" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884843 4816 flags.go:64] FLAG: --feature-gates="" Mar 11 11:58:33 crc 
kubenswrapper[4816]: I0311 11:58:33.884882 4816 flags.go:64] FLAG: --file-check-frequency="20s" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884893 4816 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884903 4816 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884912 4816 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884922 4816 flags.go:64] FLAG: --healthz-port="10248" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884933 4816 flags.go:64] FLAG: --help="false" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884942 4816 flags.go:64] FLAG: --hostname-override="" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884951 4816 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884960 4816 flags.go:64] FLAG: --http-check-frequency="20s" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884970 4816 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884979 4816 flags.go:64] FLAG: --image-credential-provider-config="" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884988 4816 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.884997 4816 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885006 4816 flags.go:64] FLAG: --image-service-endpoint="" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885014 4816 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885024 4816 flags.go:64] FLAG: --kube-api-burst="100" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885033 4816 flags.go:64] FLAG: 
--kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885043 4816 flags.go:64] FLAG: --kube-api-qps="50" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885053 4816 flags.go:64] FLAG: --kube-reserved="" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885106 4816 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885115 4816 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885125 4816 flags.go:64] FLAG: --kubelet-cgroups="" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885134 4816 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885146 4816 flags.go:64] FLAG: --lock-file="" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885155 4816 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885165 4816 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885175 4816 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885200 4816 flags.go:64] FLAG: --log-json-split-stream="false" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885210 4816 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885219 4816 flags.go:64] FLAG: --log-text-split-stream="false" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885228 4816 flags.go:64] FLAG: --logging-format="text" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885238 4816 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885248 4816 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 
11:58:33.885304 4816 flags.go:64] FLAG: --manifest-url="" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885314 4816 flags.go:64] FLAG: --manifest-url-header="" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885326 4816 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885336 4816 flags.go:64] FLAG: --max-open-files="1000000" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885347 4816 flags.go:64] FLAG: --max-pods="110" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885357 4816 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885366 4816 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885378 4816 flags.go:64] FLAG: --memory-manager-policy="None" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885390 4816 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885402 4816 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885413 4816 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885425 4816 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885450 4816 flags.go:64] FLAG: --node-status-max-images="50" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885459 4816 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885469 4816 flags.go:64] FLAG: --oom-score-adj="-999" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885478 4816 flags.go:64] FLAG: --pod-cidr="" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885488 4816 flags.go:64] FLAG: 
--pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885501 4816 flags.go:64] FLAG: --pod-manifest-path="" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885510 4816 flags.go:64] FLAG: --pod-max-pids="-1" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885519 4816 flags.go:64] FLAG: --pods-per-core="0" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885529 4816 flags.go:64] FLAG: --port="10250" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885538 4816 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885547 4816 flags.go:64] FLAG: --provider-id="" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885557 4816 flags.go:64] FLAG: --qos-reserved="" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885566 4816 flags.go:64] FLAG: --read-only-port="10255" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885575 4816 flags.go:64] FLAG: --register-node="true" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885586 4816 flags.go:64] FLAG: --register-schedulable="true" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885595 4816 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885617 4816 flags.go:64] FLAG: --registry-burst="10" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885626 4816 flags.go:64] FLAG: --registry-qps="5" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885637 4816 flags.go:64] FLAG: --reserved-cpus="" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885646 4816 flags.go:64] FLAG: --reserved-memory="" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885658 4816 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 
11:58:33.885668 4816 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885678 4816 flags.go:64] FLAG: --rotate-certificates="false" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885687 4816 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885696 4816 flags.go:64] FLAG: --runonce="false" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885704 4816 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885714 4816 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885724 4816 flags.go:64] FLAG: --seccomp-default="false" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885734 4816 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885763 4816 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885773 4816 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885783 4816 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885793 4816 flags.go:64] FLAG: --storage-driver-password="root" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885802 4816 flags.go:64] FLAG: --storage-driver-secure="false" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885811 4816 flags.go:64] FLAG: --storage-driver-table="stats" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885820 4816 flags.go:64] FLAG: --storage-driver-user="root" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885830 4816 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885840 4816 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 11 
11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885850 4816 flags.go:64] FLAG: --system-cgroups="" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885859 4816 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885874 4816 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885883 4816 flags.go:64] FLAG: --tls-cert-file="" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885892 4816 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885903 4816 flags.go:64] FLAG: --tls-min-version="" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885912 4816 flags.go:64] FLAG: --tls-private-key-file="" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885921 4816 flags.go:64] FLAG: --topology-manager-policy="none" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885930 4816 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885939 4816 flags.go:64] FLAG: --topology-manager-scope="container" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885949 4816 flags.go:64] FLAG: --v="2" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885961 4816 flags.go:64] FLAG: --version="false" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885972 4816 flags.go:64] FLAG: --vmodule="" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885983 4816 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.885992 4816 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886214 4816 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886231 4816 feature_gate.go:330] unrecognized feature gate: 
AWSClusterHostedDNS Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886241 4816 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886283 4816 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886294 4816 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886303 4816 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886313 4816 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886321 4816 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886329 4816 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886339 4816 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886347 4816 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886356 4816 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886364 4816 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886376 4816 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886387 4816 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886400 4816 feature_gate.go:330] unrecognized feature gate: 
PrivateHostedZoneAWS Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886410 4816 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886420 4816 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886433 4816 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886445 4816 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886457 4816 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886470 4816 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886480 4816 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886489 4816 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886500 4816 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886510 4816 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886520 4816 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886529 4816 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886540 4816 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886550 4816 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 11 
11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886560 4816 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886570 4816 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886580 4816 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886590 4816 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886600 4816 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886610 4816 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886620 4816 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886632 4816 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886640 4816 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886648 4816 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886656 4816 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886665 4816 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886673 4816 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886681 4816 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886688 4816 feature_gate.go:330] unrecognized feature 
gate: VSphereStaticIPs Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886696 4816 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886706 4816 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886716 4816 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886726 4816 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886736 4816 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886746 4816 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886755 4816 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886765 4816 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886774 4816 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886783 4816 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886793 4816 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886803 4816 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886814 4816 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886821 4816 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 11 
11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886829 4816 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886837 4816 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886844 4816 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886856 4816 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886866 4816 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886877 4816 feature_gate.go:330] unrecognized feature gate: Example Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886886 4816 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886896 4816 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886904 4816 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886913 4816 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886921 4816 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.886929 4816 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.887755 4816 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false 
NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.901684 4816 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.901732 4816 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.901827 4816 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.901836 4816 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.901843 4816 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.901848 4816 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.901853 4816 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.901860 4816 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.901865 4816 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.901870 4816 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.901875 4816 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.901880 4816 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.901886 4816 
feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.901890 4816 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.901896 4816 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.901901 4816 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.901906 4816 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.901911 4816 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.901917 4816 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.901922 4816 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.901927 4816 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.901932 4816 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.901937 4816 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.901945 4816 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.901950 4816 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.901958 4816 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.901966 4816 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.901972 4816 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.901978 4816 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.901984 4816 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.901991 4816 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.901996 4816 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902002 4816 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902007 4816 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902012 4816 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902019 4816 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902024 4816 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902029 4816 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902034 4816 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902039 4816 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902044 4816 feature_gate.go:330] unrecognized feature 
gate: BootcNodeManagement Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902049 4816 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902054 4816 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902059 4816 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902064 4816 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902069 4816 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902075 4816 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902080 4816 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902087 4816 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902094 4816 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902099 4816 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902105 4816 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902110 4816 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902115 4816 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902120 4816 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902125 4816 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902130 4816 feature_gate.go:330] unrecognized feature gate: Example Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902135 4816 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902140 4816 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902145 4816 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902150 4816 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902157 4816 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902164 4816 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902169 4816 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902175 4816 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902180 4816 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902186 4816 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902191 4816 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902198 4816 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902204 4816 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902210 4816 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902215 4816 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902221 4816 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.902230 4816 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902412 4816 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902419 4816 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902425 4816 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902430 4816 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902436 4816 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902441 4816 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 11 11:58:33 crc kubenswrapper[4816]: 
W0311 11:58:33.902447 4816 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902452 4816 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902457 4816 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902462 4816 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902467 4816 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902472 4816 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902477 4816 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902482 4816 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902491 4816 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902497 4816 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902503 4816 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902508 4816 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902514 4816 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902519 4816 feature_gate.go:330] unrecognized feature gate: Example Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902524 4816 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902529 4816 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902534 4816 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902539 4816 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902544 4816 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902549 4816 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902554 4816 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902559 4816 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902565 4816 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902570 4816 feature_gate.go:330] unrecognized feature 
gate: SignatureStores Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902575 4816 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902580 4816 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902585 4816 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902590 4816 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902596 4816 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902601 4816 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902606 4816 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902611 4816 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902616 4816 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902621 4816 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902626 4816 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902631 4816 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902636 4816 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902641 4816 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902646 4816 
feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902652 4816 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902657 4816 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902662 4816 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902667 4816 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902672 4816 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902677 4816 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902683 4816 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902690 4816 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902697 4816 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902703 4816 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902708 4816 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902713 4816 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902719 4816 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. 
It will be removed in a future release. Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902725 4816 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902731 4816 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902736 4816 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902742 4816 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902747 4816 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902751 4816 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902757 4816 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902762 4816 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902767 4816 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902772 4816 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902777 4816 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902783 4816 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 11 11:58:33 crc kubenswrapper[4816]: W0311 11:58:33.902788 4816 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.902795 4816 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true 
DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.904975 4816 server.go:940] "Client rotation is on, will bootstrap in background" Mar 11 11:58:33 crc kubenswrapper[4816]: E0311 11:58:33.909371 4816 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.913510 4816 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.913618 4816 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.914912 4816 server.go:997] "Starting client certificate rotation" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.914932 4816 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.915124 4816 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.937495 4816 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.941323 4816 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 11 11:58:33 crc kubenswrapper[4816]: E0311 11:58:33.941959 4816 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.94:6443: connect: connection refused" logger="UnhandledError" Mar 11 11:58:33 crc kubenswrapper[4816]: I0311 11:58:33.966458 4816 log.go:25] "Validated CRI v1 runtime API" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.001833 4816 log.go:25] "Validated CRI v1 image API" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.003893 4816 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.011100 4816 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-11-11-45-36-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.011128 4816 fs.go:134] 
Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.027307 4816 manager.go:217] Machine: {Timestamp:2026-03-11 11:58:34.02254616 +0000 UTC m=+0.613810147 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:bbfa0147-7ad8-4a96-81ed-304e5bc4397b BootID:91fc6571-6a6d-490b-83e1-c64cf62c773c Filesystems:[{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex 
MacAddress:fa:16:3e:7b:9a:6f Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:7b:9a:6f Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:19:db:72 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:e2:73:6d Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:93:12:e7 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:1d:2a:d8 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:cb:dc:ed Speed:-1 Mtu:1496} {Name:eth10 MacAddress:6a:87:8b:1a:63:aa Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:7a:1d:34:3c:77:3e Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 
Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.027520 4816 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.027672 4816 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.028008 4816 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.028187 4816 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.028219 4816 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.028469 4816 topology_manager.go:138] "Creating topology manager with none policy"
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.028483 4816 container_manager_linux.go:303] "Creating device plugin manager"
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.029333 4816 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.029365 4816 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.030313 4816 state_mem.go:36] "Initialized new in-memory state store"
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.030404 4816 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.034627 4816 kubelet.go:418] "Attempting to sync node with API server"
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.034648 4816 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.034672 4816 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.034683 4816 kubelet.go:324] "Adding apiserver pod source"
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.034708 4816 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 11 11:58:34 crc kubenswrapper[4816]: W0311 11:58:34.039762 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.94:6443: connect: connection refused
Mar 11 11:58:34 crc kubenswrapper[4816]: E0311 11:58:34.039835 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.94:6443: connect: connection refused" logger="UnhandledError"
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.040061 4816 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Mar 11 11:58:34 crc kubenswrapper[4816]: W0311 11:58:34.040374 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.94:6443: connect: connection refused
Mar 11 11:58:34 crc kubenswrapper[4816]: E0311 11:58:34.040517 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.94:6443: connect: connection refused" logger="UnhandledError"
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.040731 4816 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.042062 4816 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.044087 4816 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.044110 4816 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.044121 4816 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.044130 4816 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.044143 4816 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.044151 4816 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.044159 4816 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.044173 4816 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.044184 4816 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.044195 4816 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.044280 4816 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.044294 4816 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.046405 4816 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.046861 4816 server.go:1280] "Started kubelet"
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.051805 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.94:6443: connect: connection refused
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.047628 4816 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.051869 4816 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 11 11:58:34 crc systemd[1]: Started Kubernetes Kubelet.
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.054094 4816 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.063653 4816 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.063696 4816 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.064237 4816 server.go:460] "Adding debug handlers to kubelet server"
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.064350 4816 volume_manager.go:287] "The desired_state_of_world populator starts"
Mar 11 11:58:34 crc kubenswrapper[4816]: E0311 11:58:34.064398 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.064411 4816 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.064436 4816 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 11 11:58:34 crc kubenswrapper[4816]: E0311 11:58:34.064927 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.94:6443: connect: connection refused" interval="200ms"
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.065144 4816 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.065165 4816 factory.go:55] Registering systemd factory
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.065174 4816 factory.go:221] Registration of the systemd container factory successfully
Mar 11 11:58:34 crc kubenswrapper[4816]: W0311 11:58:34.065767 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.94:6443: connect: connection refused
Mar 11 11:58:34 crc kubenswrapper[4816]: E0311 11:58:34.065919 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.94:6443: connect: connection refused" logger="UnhandledError"
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.068397 4816 factory.go:153] Registering CRI-O factory
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.068414 4816 factory.go:221] Registration of the crio container factory successfully
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.068441 4816 factory.go:103] Registering Raw factory
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.068456 4816 manager.go:1196] Started watching for new ooms in manager
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.069020 4816 manager.go:319] Starting recovery of all containers
Mar 11 11:58:34 crc kubenswrapper[4816]: E0311 11:58:34.066343 4816 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.94:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189bc7935db75286 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:34.046837382 +0000 UTC m=+0.638101349,LastTimestamp:2026-03-11 11:58:34.046837382 +0000 UTC m=+0.638101349,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.072893 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.072957 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.072989 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.073011 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.073030 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.073051 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.073072 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.073095 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.073117 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.073137 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.073161 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.073183 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.073203 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.073227 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.073281 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.073305 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.073326 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.073346 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.073367 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.073388 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.073408 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.073429 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.073448 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.073469 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.073490 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.073512 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.073540 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.073565 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.073585 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.073608 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.073628 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.073648 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.073669 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.073690 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.073742 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.073769 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.073794 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.073815 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.073835 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.073855 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.073877 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.073896 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.073919 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.073941 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.073967 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.073997 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.074022 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.074044 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.074151 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.074179 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.074200 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.074222 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.074283 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.074312 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.074337 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.074357 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.074379 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.074406 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.074437 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.074461 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.074480 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.074506 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.074534 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.074558 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.074578 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.074600 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.074620 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.074645 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.074670 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.074689 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.074731 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.074758 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.074782 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.074806 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.074836 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.074862 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.074891 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.074917 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.074937 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.074957 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.074979 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.074999 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.075019 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Mar 11 11:58:34 crc 
kubenswrapper[4816]: I0311 11:58:34.075042 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.075064 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.075085 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.075105 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.075126 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.075148 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.075175 4816 reconstruct.go:130] "Volume is marked 
as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.075201 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.075226 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.075277 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.075303 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.075325 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.075349 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.075370 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.075390 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.075411 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.075436 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.075457 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.075477 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" 
volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.075497 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.075516 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.075546 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.075570 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.075596 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.083690 4816 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" 
volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.083801 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.083842 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.083885 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.083929 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.083958 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.083997 4816 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.084031 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.084054 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.084077 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.084103 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.084124 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.084147 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" 
volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.084171 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.084277 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.084307 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.084326 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.084356 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.084376 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.084396 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.084422 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.084670 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.084829 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.084877 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.084930 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" 
seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.084959 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.084991 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.085022 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.085047 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.085082 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.085113 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.085150 4816 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.085193 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.085222 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.085306 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.085350 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.085476 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.085550 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.085585 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.085619 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.085654 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.085678 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.085795 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.085867 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.085893 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.085915 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.085944 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.085970 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.086001 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.086028 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.086051 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.086082 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.086106 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.086130 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.086149 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.086170 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.086198 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.086220 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.086273 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.086297 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.086317 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.087314 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" 
seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.087355 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.087383 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.087406 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.087431 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.087459 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.087480 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 
11:58:34.087501 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.087524 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.087546 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.087568 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.087590 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.087612 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.087634 4816 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.087668 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.087690 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.087710 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.087732 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.087753 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.087776 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.087798 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.087819 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.087839 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.087859 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.087879 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.087900 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.087921 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.087945 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.087967 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.087989 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.088013 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.088035 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" 
seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.088056 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.088077 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.088098 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.088119 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.088140 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.088160 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.088181 4816 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.088400 4816 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.088439 4816 reconstruct.go:97] "Volume reconstruction finished" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.088459 4816 reconciler.go:26] "Reconciler: start to sync state" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.095562 4816 manager.go:324] Recovery completed Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.107999 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.115420 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.116039 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.116172 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.118440 4816 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.118588 4816 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.118713 4816 state_mem.go:36] "Initialized new in-memory state store" Mar 11 11:58:34 crc 
kubenswrapper[4816]: I0311 11:58:34.126508 4816 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.129067 4816 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.129163 4816 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.129222 4816 kubelet.go:2335] "Starting kubelet main sync loop" Mar 11 11:58:34 crc kubenswrapper[4816]: E0311 11:58:34.129560 4816 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 11 11:58:34 crc kubenswrapper[4816]: W0311 11:58:34.130508 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.94:6443: connect: connection refused Mar 11 11:58:34 crc kubenswrapper[4816]: E0311 11:58:34.130648 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.94:6443: connect: connection refused" logger="UnhandledError" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.141424 4816 policy_none.go:49] "None policy: Start" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.143737 4816 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.143768 4816 state_mem.go:35] "Initializing new in-memory state store" Mar 11 11:58:34 crc kubenswrapper[4816]: E0311 11:58:34.164494 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node 
\"crc\" not found" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.206676 4816 manager.go:334] "Starting Device Plugin manager" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.206779 4816 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.206800 4816 server.go:79] "Starting device plugin registration server" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.207358 4816 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.207398 4816 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.207651 4816 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.207744 4816 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.207753 4816 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 11 11:58:34 crc kubenswrapper[4816]: E0311 11:58:34.217644 4816 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.230273 4816 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.230414 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.231896 4816 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.231934 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.231950 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.232137 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.232361 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.232436 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.233473 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.233516 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.233538 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.233773 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.234508 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.234566 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.234990 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.235028 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.235039 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.236983 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.237023 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.237037 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.238195 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.238309 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.238327 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.238591 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.238780 
4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.238834 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.239960 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.239998 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.240010 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.240374 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.240420 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.240440 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.240625 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.240818 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.240874 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.242487 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.242524 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.242539 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.242750 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.242788 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.243496 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.243536 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.243555 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.243644 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.243673 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.243685 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:58:34 crc kubenswrapper[4816]: E0311 11:58:34.265723 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.94:6443: connect: connection refused" interval="400ms" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.293146 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.293190 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.293224 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.293259 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.293296 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.293357 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.293410 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.293649 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.293776 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.293854 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.293881 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.293916 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.293952 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.293977 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 11:58:34 crc 
kubenswrapper[4816]: I0311 11:58:34.293999 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.307754 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.309203 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.309266 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.309278 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.309311 4816 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 11 11:58:34 crc kubenswrapper[4816]: E0311 11:58:34.309837 4816 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.94:6443: connect: connection refused" node="crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.394978 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.395609 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.395645 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.395670 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.395234 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.395743 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.395699 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.395755 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.395698 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.395834 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.395911 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.395953 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.395972 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.395996 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.396015 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.396014 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.396037 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.396058 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 
11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.396028 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.396079 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.396067 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.396092 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.396103 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.396099 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.396059 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.396145 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.396152 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.396164 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.396155 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.396179 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.510979 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.513755 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.513809 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.513828 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.513874 4816 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 11 11:58:34 crc kubenswrapper[4816]: E0311 11:58:34.514612 4816 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.94:6443: connect: connection refused" node="crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.573639 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.583702 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.607940 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: W0311 11:58:34.616354 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-b874cff09eb556477e96dfd8a963d358010a4a1a7761d72c9dc0e89d24be063f WatchSource:0}: Error finding container b874cff09eb556477e96dfd8a963d358010a4a1a7761d72c9dc0e89d24be063f: Status 404 returned error can't find the container with id b874cff09eb556477e96dfd8a963d358010a4a1a7761d72c9dc0e89d24be063f Mar 11 11:58:34 crc kubenswrapper[4816]: W0311 11:58:34.623369 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-2b59e887a7ce0a443d2d5f1acfcef502029b574121b2ab53d03d260bd2a7e506 WatchSource:0}: Error finding container 2b59e887a7ce0a443d2d5f1acfcef502029b574121b2ab53d03d260bd2a7e506: Status 404 returned error can't find the container with id 2b59e887a7ce0a443d2d5f1acfcef502029b574121b2ab53d03d260bd2a7e506 Mar 11 11:58:34 crc kubenswrapper[4816]: W0311 11:58:34.635178 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-0463f4f61100ae836a490830fe1159cd971bb0c679020c61d4b0fb433a7fb3f1 WatchSource:0}: Error finding container 0463f4f61100ae836a490830fe1159cd971bb0c679020c61d4b0fb433a7fb3f1: Status 404 returned error can't find the container with id 0463f4f61100ae836a490830fe1159cd971bb0c679020c61d4b0fb433a7fb3f1 Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.639466 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.646720 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 11 11:58:34 crc kubenswrapper[4816]: W0311 11:58:34.660378 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-2f1cd61e47ec20849bbaf37967fc5fceb2300bad5f64ee5680afccbd2934ef73 WatchSource:0}: Error finding container 2f1cd61e47ec20849bbaf37967fc5fceb2300bad5f64ee5680afccbd2934ef73: Status 404 returned error can't find the container with id 2f1cd61e47ec20849bbaf37967fc5fceb2300bad5f64ee5680afccbd2934ef73 Mar 11 11:58:34 crc kubenswrapper[4816]: E0311 11:58:34.667287 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.94:6443: connect: connection refused" interval="800ms" Mar 11 11:58:34 crc kubenswrapper[4816]: W0311 11:58:34.668201 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-4c3d8a183b97e8c0efe668c45b1740e05e8ecfad716c7774fb634b7d55883f98 WatchSource:0}: Error finding container 4c3d8a183b97e8c0efe668c45b1740e05e8ecfad716c7774fb634b7d55883f98: Status 404 returned error can't find the container with id 4c3d8a183b97e8c0efe668c45b1740e05e8ecfad716c7774fb634b7d55883f98 Mar 11 11:58:34 crc kubenswrapper[4816]: W0311 11:58:34.911614 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.94:6443: 
connect: connection refused Mar 11 11:58:34 crc kubenswrapper[4816]: E0311 11:58:34.911763 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.94:6443: connect: connection refused" logger="UnhandledError" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.915193 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.917133 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.917189 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.917199 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:58:34 crc kubenswrapper[4816]: I0311 11:58:34.917222 4816 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 11 11:58:34 crc kubenswrapper[4816]: E0311 11:58:34.917755 4816 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.94:6443: connect: connection refused" node="crc" Mar 11 11:58:35 crc kubenswrapper[4816]: I0311 11:58:35.052915 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.94:6443: connect: connection refused Mar 11 11:58:35 crc kubenswrapper[4816]: I0311 11:58:35.134594 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2f1cd61e47ec20849bbaf37967fc5fceb2300bad5f64ee5680afccbd2934ef73"} Mar 11 11:58:35 crc kubenswrapper[4816]: I0311 11:58:35.135556 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0463f4f61100ae836a490830fe1159cd971bb0c679020c61d4b0fb433a7fb3f1"} Mar 11 11:58:35 crc kubenswrapper[4816]: I0311 11:58:35.136546 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b874cff09eb556477e96dfd8a963d358010a4a1a7761d72c9dc0e89d24be063f"} Mar 11 11:58:35 crc kubenswrapper[4816]: I0311 11:58:35.137559 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"2b59e887a7ce0a443d2d5f1acfcef502029b574121b2ab53d03d260bd2a7e506"} Mar 11 11:58:35 crc kubenswrapper[4816]: I0311 11:58:35.138527 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"4c3d8a183b97e8c0efe668c45b1740e05e8ecfad716c7774fb634b7d55883f98"} Mar 11 11:58:35 crc kubenswrapper[4816]: W0311 11:58:35.291123 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.94:6443: connect: connection refused Mar 11 11:58:35 crc kubenswrapper[4816]: E0311 11:58:35.291231 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to 
list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.94:6443: connect: connection refused" logger="UnhandledError" Mar 11 11:58:35 crc kubenswrapper[4816]: W0311 11:58:35.357892 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.94:6443: connect: connection refused Mar 11 11:58:35 crc kubenswrapper[4816]: E0311 11:58:35.357978 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.94:6443: connect: connection refused" logger="UnhandledError" Mar 11 11:58:35 crc kubenswrapper[4816]: W0311 11:58:35.361969 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.94:6443: connect: connection refused Mar 11 11:58:35 crc kubenswrapper[4816]: E0311 11:58:35.362115 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.94:6443: connect: connection refused" logger="UnhandledError" Mar 11 11:58:35 crc kubenswrapper[4816]: E0311 11:58:35.468850 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.94:6443: connect: connection refused" interval="1.6s" Mar 11 11:58:35 
crc kubenswrapper[4816]: I0311 11:58:35.718608 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:58:35 crc kubenswrapper[4816]: I0311 11:58:35.720232 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:58:35 crc kubenswrapper[4816]: I0311 11:58:35.720273 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:58:35 crc kubenswrapper[4816]: I0311 11:58:35.720284 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:58:35 crc kubenswrapper[4816]: I0311 11:58:35.720310 4816 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 11 11:58:35 crc kubenswrapper[4816]: E0311 11:58:35.720534 4816 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.94:6443: connect: connection refused" node="crc" Mar 11 11:58:36 crc kubenswrapper[4816]: I0311 11:58:36.052487 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.94:6443: connect: connection refused Mar 11 11:58:36 crc kubenswrapper[4816]: I0311 11:58:36.090036 4816 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 11 11:58:36 crc kubenswrapper[4816]: E0311 11:58:36.091358 4816 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.94:6443: connect: connection refused" logger="UnhandledError" Mar 11 
11:58:36 crc kubenswrapper[4816]: I0311 11:58:36.143065 4816 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="f71025c079395f8d4addd4014b4f26f907160284b902b119be08da329eec418f" exitCode=0 Mar 11 11:58:36 crc kubenswrapper[4816]: I0311 11:58:36.143152 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"f71025c079395f8d4addd4014b4f26f907160284b902b119be08da329eec418f"} Mar 11 11:58:36 crc kubenswrapper[4816]: I0311 11:58:36.143270 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:58:36 crc kubenswrapper[4816]: I0311 11:58:36.144779 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:58:36 crc kubenswrapper[4816]: I0311 11:58:36.144935 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:58:36 crc kubenswrapper[4816]: I0311 11:58:36.145057 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:58:36 crc kubenswrapper[4816]: I0311 11:58:36.146764 4816 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="df758ccff81c0ef52d954903c58a6d8e01ce1498d2f4a45057799fc584c70887" exitCode=0 Mar 11 11:58:36 crc kubenswrapper[4816]: I0311 11:58:36.146882 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"df758ccff81c0ef52d954903c58a6d8e01ce1498d2f4a45057799fc584c70887"} Mar 11 11:58:36 crc kubenswrapper[4816]: I0311 11:58:36.146985 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:58:36 crc 
kubenswrapper[4816]: I0311 11:58:36.148169 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:58:36 crc kubenswrapper[4816]: I0311 11:58:36.148211 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:58:36 crc kubenswrapper[4816]: I0311 11:58:36.148226 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:58:36 crc kubenswrapper[4816]: I0311 11:58:36.149642 4816 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="5450cf860ac6062e8e65a5f6ebbd10e10aad98425b0f474ac750abdd1dfa1505" exitCode=0 Mar 11 11:58:36 crc kubenswrapper[4816]: I0311 11:58:36.149700 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"5450cf860ac6062e8e65a5f6ebbd10e10aad98425b0f474ac750abdd1dfa1505"} Mar 11 11:58:36 crc kubenswrapper[4816]: I0311 11:58:36.149824 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:58:36 crc kubenswrapper[4816]: I0311 11:58:36.151005 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:58:36 crc kubenswrapper[4816]: I0311 11:58:36.151030 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:58:36 crc kubenswrapper[4816]: I0311 11:58:36.151039 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:58:36 crc kubenswrapper[4816]: I0311 11:58:36.154731 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:58:36 crc kubenswrapper[4816]: I0311 11:58:36.154907 4816 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"946a1d6cf8ed454d4615ab005379cf71bde975ca610b807c92b5afb46c2e0342"} Mar 11 11:58:36 crc kubenswrapper[4816]: I0311 11:58:36.154936 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"92c40a0d13af85e1b387e45914de68688a5edfe41eb9e0e396346750a8793df4"} Mar 11 11:58:36 crc kubenswrapper[4816]: I0311 11:58:36.154947 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8e9eb0cfef5e2c64252348119e3fe8ab3b681b4698c01e20a4a8b63aff9fc40c"} Mar 11 11:58:36 crc kubenswrapper[4816]: I0311 11:58:36.154959 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4cc365d25b754728795b200a155fa9bd64393ac8ec89f832fb06fc0f17e72cb5"} Mar 11 11:58:36 crc kubenswrapper[4816]: I0311 11:58:36.156541 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:58:36 crc kubenswrapper[4816]: I0311 11:58:36.156567 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:58:36 crc kubenswrapper[4816]: I0311 11:58:36.156586 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:58:36 crc kubenswrapper[4816]: I0311 11:58:36.157959 4816 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="789a3fa60b21759f42c2997678010f994718ce5057a3a059491bc930652d3e38" exitCode=0 Mar 11 11:58:36 crc kubenswrapper[4816]: I0311 11:58:36.157993 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"789a3fa60b21759f42c2997678010f994718ce5057a3a059491bc930652d3e38"} Mar 11 11:58:36 crc kubenswrapper[4816]: I0311 11:58:36.158066 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:58:36 crc kubenswrapper[4816]: I0311 11:58:36.158874 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:58:36 crc kubenswrapper[4816]: I0311 11:58:36.158897 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:58:36 crc kubenswrapper[4816]: I0311 11:58:36.158907 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:58:36 crc kubenswrapper[4816]: I0311 11:58:36.162117 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:58:36 crc kubenswrapper[4816]: I0311 11:58:36.163074 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:58:36 crc kubenswrapper[4816]: I0311 11:58:36.163104 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:58:36 crc kubenswrapper[4816]: I0311 11:58:36.163115 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:58:36 crc kubenswrapper[4816]: I0311 11:58:36.409979 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 11:58:36 crc 
kubenswrapper[4816]: I0311 11:58:36.425522 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 11 11:58:36 crc kubenswrapper[4816]: W0311 11:58:36.917658 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.94:6443: connect: connection refused
Mar 11 11:58:36 crc kubenswrapper[4816]: E0311 11:58:36.917801 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.94:6443: connect: connection refused" logger="UnhandledError"
Mar 11 11:58:37 crc kubenswrapper[4816]: I0311 11:58:37.052482 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.94:6443: connect: connection refused
Mar 11 11:58:37 crc kubenswrapper[4816]: E0311 11:58:37.070122 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.94:6443: connect: connection refused" interval="3.2s"
Mar 11 11:58:37 crc kubenswrapper[4816]: I0311 11:58:37.141593 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 11 11:58:37 crc kubenswrapper[4816]: I0311 11:58:37.163430 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a88590983ebe134de5b905bd7616bcb2f9c324e5c18ec89715a0514cbd4db6c6"}
Mar 11 11:58:37 crc kubenswrapper[4816]: I0311 11:58:37.163485 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"d44001bffd4656a8d0552394bf1901b81e5f1d70cfb20ecaa61452d4ef1b5eed"}
Mar 11 11:58:37 crc kubenswrapper[4816]: I0311 11:58:37.163505 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"189a32e7a45f34d012b4a89001af9cef090710a64de0ed750adc7d99bcd4cbe1"}
Mar 11 11:58:37 crc kubenswrapper[4816]: I0311 11:58:37.163484 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 11:58:37 crc kubenswrapper[4816]: I0311 11:58:37.164425 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:58:37 crc kubenswrapper[4816]: I0311 11:58:37.164464 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:58:37 crc kubenswrapper[4816]: I0311 11:58:37.164475 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:58:37 crc kubenswrapper[4816]: I0311 11:58:37.166719 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"aadf85da9420b13a17645a1a0b0f3df80b67815ea93eb75f443c9abc547ebd12"}
Mar 11 11:58:37 crc kubenswrapper[4816]: I0311 11:58:37.166760 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f04cdf2254cd3d070567bec1a9b10d6ffff3f5da5056b637b7d006f4ded72e56"}
Mar 11 11:58:37 crc kubenswrapper[4816]: I0311 11:58:37.166775 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c5e6ee0da068d98d88f55efae8cb0cb12fe57c85e11f5638daaa5e0f8a1f8594"}
Mar 11 11:58:37 crc kubenswrapper[4816]: I0311 11:58:37.166786 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 11:58:37 crc kubenswrapper[4816]: I0311 11:58:37.166788 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6180f737a5d60df3a71764fb2eaca26d3b25306cd8653d66d0b7fab4ec7debe3"}
Mar 11 11:58:37 crc kubenswrapper[4816]: I0311 11:58:37.167063 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c2fca9f57b03035a1290e3686e7b98d15f9151ad5f5b811112ad882b47cb9e46"}
Mar 11 11:58:37 crc kubenswrapper[4816]: I0311 11:58:37.167895 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:58:37 crc kubenswrapper[4816]: I0311 11:58:37.167916 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:58:37 crc kubenswrapper[4816]: I0311 11:58:37.167925 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:58:37 crc kubenswrapper[4816]: I0311 11:58:37.169316 4816 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="8fc48525cefc43b0355c1e3d3c24c807755e603797651810c54c8453cdc88da0" exitCode=0
Mar 11 11:58:37 crc kubenswrapper[4816]: I0311 11:58:37.169349 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"8fc48525cefc43b0355c1e3d3c24c807755e603797651810c54c8453cdc88da0"}
Mar 11 11:58:37 crc kubenswrapper[4816]: I0311 11:58:37.169516 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 11:58:37 crc kubenswrapper[4816]: I0311 11:58:37.171068 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 11:58:37 crc kubenswrapper[4816]: I0311 11:58:37.171696 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 11:58:37 crc kubenswrapper[4816]: I0311 11:58:37.172027 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"adf21a7ab5fa8cc53f0b72a0a78d73e04bbd62a213f193124a0a00b4512b022c"}
Mar 11 11:58:37 crc kubenswrapper[4816]: I0311 11:58:37.172183 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:58:37 crc kubenswrapper[4816]: I0311 11:58:37.172210 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:58:37 crc kubenswrapper[4816]: I0311 11:58:37.172222 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:58:37 crc kubenswrapper[4816]: I0311 11:58:37.172961 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:58:37 crc kubenswrapper[4816]: I0311 11:58:37.172986 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:58:37 crc kubenswrapper[4816]: I0311 11:58:37.172997 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:58:37 crc kubenswrapper[4816]: I0311 11:58:37.173031 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:58:37 crc kubenswrapper[4816]: I0311 11:58:37.173072 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:58:37 crc kubenswrapper[4816]: I0311 11:58:37.173092 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:58:37 crc kubenswrapper[4816]: I0311 11:58:37.320843 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 11:58:37 crc kubenswrapper[4816]: I0311 11:58:37.324956 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:58:37 crc kubenswrapper[4816]: I0311 11:58:37.324990 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:58:37 crc kubenswrapper[4816]: I0311 11:58:37.325003 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:58:37 crc kubenswrapper[4816]: I0311 11:58:37.325045 4816 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 11 11:58:37 crc kubenswrapper[4816]: E0311 11:58:37.325765 4816 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.94:6443: connect: connection refused" node="crc"
Mar 11 11:58:38 crc kubenswrapper[4816]: I0311 11:58:38.178083 4816 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="036ecc68973976bcbb1f4df6a3558e9c2606a519d39fb654b86061e0ef78d5a0" exitCode=0
Mar 11 11:58:38 crc kubenswrapper[4816]: I0311 11:58:38.178196 4816 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 11 11:58:38 crc kubenswrapper[4816]: I0311 11:58:38.178228 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 11:58:38 crc kubenswrapper[4816]: I0311 11:58:38.178332 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 11:58:38 crc kubenswrapper[4816]: I0311 11:58:38.178362 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"036ecc68973976bcbb1f4df6a3558e9c2606a519d39fb654b86061e0ef78d5a0"}
Mar 11 11:58:38 crc kubenswrapper[4816]: I0311 11:58:38.178564 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 11:58:38 crc kubenswrapper[4816]: I0311 11:58:38.178750 4816 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 11 11:58:38 crc kubenswrapper[4816]: I0311 11:58:38.178771 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 11:58:38 crc kubenswrapper[4816]: I0311 11:58:38.178332 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 11:58:38 crc kubenswrapper[4816]: I0311 11:58:38.179941 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:58:38 crc kubenswrapper[4816]: I0311 11:58:38.179966 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:58:38 crc kubenswrapper[4816]: I0311 11:58:38.179975 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:58:38 crc kubenswrapper[4816]: I0311 11:58:38.180744 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:58:38 crc kubenswrapper[4816]: I0311 11:58:38.180760 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:58:38 crc kubenswrapper[4816]: I0311 11:58:38.180767 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:58:38 crc kubenswrapper[4816]: I0311 11:58:38.180840 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:58:38 crc kubenswrapper[4816]: I0311 11:58:38.180840 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:58:38 crc kubenswrapper[4816]: I0311 11:58:38.180901 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:58:38 crc kubenswrapper[4816]: I0311 11:58:38.180917 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:58:38 crc kubenswrapper[4816]: I0311 11:58:38.180936 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:58:38 crc kubenswrapper[4816]: I0311 11:58:38.180920 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:58:38 crc kubenswrapper[4816]: I0311 11:58:38.181292 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:58:38 crc kubenswrapper[4816]: I0311 11:58:38.181311 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:58:38 crc kubenswrapper[4816]: I0311 11:58:38.181323 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:58:39 crc kubenswrapper[4816]: I0311 11:58:39.189060 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d6cfddc1aebbcb43615aafa5620bc9fa877464b85e77a3df5367a6e93c3aa066"}
Mar 11 11:58:39 crc kubenswrapper[4816]: I0311 11:58:39.189135 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a349dc3366c99110d521b2e0c35464b1744fada92c1c29aa973b2f60e65cc5d4"}
Mar 11 11:58:39 crc kubenswrapper[4816]: I0311 11:58:39.189166 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d63fd6507da13452b1667cf5fe5f86b72ef07a8c9a57ff43d74684276a8d0633"}
Mar 11 11:58:39 crc kubenswrapper[4816]: I0311 11:58:39.189192 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1ee54943851f65b50e55ee3f0da95763307da8b51fefd2a8b83985ec43c4e15e"}
Mar 11 11:58:39 crc kubenswrapper[4816]: I0311 11:58:39.410995 4816 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 11 11:58:39 crc kubenswrapper[4816]: I0311 11:58:39.411118 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 11 11:58:40 crc kubenswrapper[4816]: I0311 11:58:40.197467 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"768d90b24f08a77bfeb4a1c0540c799d91922b7e872b166c8be18799bd274aea"}
Mar 11 11:58:40 crc kubenswrapper[4816]: I0311 11:58:40.197651 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 11:58:40 crc kubenswrapper[4816]: I0311 11:58:40.199079 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:58:40 crc kubenswrapper[4816]: I0311 11:58:40.199140 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:58:40 crc kubenswrapper[4816]: I0311 11:58:40.199162 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:58:40 crc kubenswrapper[4816]: I0311 11:58:40.224153 4816 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 11 11:58:40 crc kubenswrapper[4816]: I0311 11:58:40.526731 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 11:58:40 crc kubenswrapper[4816]: I0311 11:58:40.528383 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:58:40 crc kubenswrapper[4816]: I0311 11:58:40.528441 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:58:40 crc kubenswrapper[4816]: I0311 11:58:40.528463 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:58:40 crc kubenswrapper[4816]: I0311 11:58:40.528502 4816 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 11 11:58:41 crc kubenswrapper[4816]: I0311 11:58:41.200924 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 11:58:41 crc kubenswrapper[4816]: I0311 11:58:41.202734 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:58:41 crc kubenswrapper[4816]: I0311 11:58:41.202799 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:58:41 crc kubenswrapper[4816]: I0311 11:58:41.202819 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:58:41 crc kubenswrapper[4816]: I0311 11:58:41.279225 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 11 11:58:41 crc kubenswrapper[4816]: I0311 11:58:41.279760 4816 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 11 11:58:41 crc kubenswrapper[4816]: I0311 11:58:41.279937 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 11:58:41 crc kubenswrapper[4816]: I0311 11:58:41.281664 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:58:41 crc kubenswrapper[4816]: I0311 11:58:41.281717 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:58:41 crc kubenswrapper[4816]: I0311 11:58:41.281737 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:58:41 crc kubenswrapper[4816]: I0311 11:58:41.964924 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 11 11:58:42 crc kubenswrapper[4816]: I0311 11:58:42.203871 4816 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 11 11:58:42 crc kubenswrapper[4816]: I0311 11:58:42.203974 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 11:58:42 crc kubenswrapper[4816]: I0311 11:58:42.205566 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:58:42 crc kubenswrapper[4816]: I0311 11:58:42.205625 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:58:42 crc kubenswrapper[4816]: I0311 11:58:42.205646 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:58:43 crc kubenswrapper[4816]: I0311 11:58:43.191768 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Mar 11 11:58:43 crc kubenswrapper[4816]: I0311 11:58:43.192079 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 11:58:43 crc kubenswrapper[4816]: I0311 11:58:43.193951 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:58:43 crc kubenswrapper[4816]: I0311 11:58:43.194007 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:58:43 crc kubenswrapper[4816]: I0311 11:58:43.194029 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:58:43 crc kubenswrapper[4816]: I0311 11:58:43.571512 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 11 11:58:43 crc kubenswrapper[4816]: I0311 11:58:43.571793 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 11:58:43 crc kubenswrapper[4816]: I0311 11:58:43.573575 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:58:43 crc kubenswrapper[4816]: I0311 11:58:43.573620 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:58:43 crc kubenswrapper[4816]: I0311 11:58:43.573638 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:58:43 crc kubenswrapper[4816]: I0311 11:58:43.587455 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 11 11:58:43 crc kubenswrapper[4816]: I0311 11:58:43.587669 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 11:58:43 crc kubenswrapper[4816]: I0311 11:58:43.589316 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:58:43 crc kubenswrapper[4816]: I0311 11:58:43.589381 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:58:43 crc kubenswrapper[4816]: I0311 11:58:43.589406 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:58:44 crc kubenswrapper[4816]: E0311 11:58:44.217970 4816 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 11 11:58:44 crc kubenswrapper[4816]: I0311 11:58:44.461726 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 11 11:58:44 crc kubenswrapper[4816]: I0311 11:58:44.462083 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 11:58:44 crc kubenswrapper[4816]: I0311 11:58:44.464331 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:58:44 crc kubenswrapper[4816]: I0311 11:58:44.464430 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:58:44 crc kubenswrapper[4816]: I0311 11:58:44.464492 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:58:44 crc kubenswrapper[4816]: I0311 11:58:44.473778 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 11 11:58:44 crc kubenswrapper[4816]: I0311 11:58:44.960401 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Mar 11 11:58:44 crc kubenswrapper[4816]: I0311 11:58:44.961657 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 11:58:44 crc kubenswrapper[4816]: I0311 11:58:44.963576 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:58:44 crc kubenswrapper[4816]: I0311 11:58:44.963652 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:58:44 crc kubenswrapper[4816]: I0311 11:58:44.963679 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:58:45 crc kubenswrapper[4816]: I0311 11:58:45.215438 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 11:58:45 crc kubenswrapper[4816]: I0311 11:58:45.217400 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:58:45 crc kubenswrapper[4816]: I0311 11:58:45.217484 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:58:45 crc kubenswrapper[4816]: I0311 11:58:45.217506 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:58:45 crc kubenswrapper[4816]: I0311 11:58:45.222770 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 11 11:58:46 crc kubenswrapper[4816]: I0311 11:58:46.219096 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 11:58:46 crc kubenswrapper[4816]: I0311 11:58:46.220944 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:58:46 crc kubenswrapper[4816]: I0311 11:58:46.221031 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:58:46 crc kubenswrapper[4816]: I0311 11:58:46.221061 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:58:48 crc kubenswrapper[4816]: W0311 11:58:48.016902 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout
Mar 11 11:58:48 crc kubenswrapper[4816]: I0311 11:58:48.017041 4816 trace.go:236] Trace[1064924096]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (11-Mar-2026 11:58:38.015) (total time: 10001ms):
Mar 11 11:58:48 crc kubenswrapper[4816]: Trace[1064924096]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (11:58:48.016)
Mar 11 11:58:48 crc kubenswrapper[4816]: Trace[1064924096]: [10.001229493s] [10.001229493s] END
Mar 11 11:58:48 crc kubenswrapper[4816]: E0311 11:58:48.017072 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Mar 11 11:58:48 crc kubenswrapper[4816]: I0311 11:58:48.054174 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Mar 11 11:58:48 crc kubenswrapper[4816]: W0311 11:58:48.098077 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout
Mar 11 11:58:48 crc kubenswrapper[4816]: I0311 11:58:48.098216 4816 trace.go:236] Trace[951351368]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (11-Mar-2026 11:58:38.096) (total time: 10001ms):
Mar 11 11:58:48 crc kubenswrapper[4816]: Trace[951351368]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (11:58:48.098)
Mar 11 11:58:48 crc kubenswrapper[4816]: Trace[951351368]: [10.001902436s] [10.001902436s] END
Mar 11 11:58:48 crc kubenswrapper[4816]: E0311 11:58:48.098278 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Mar 11 11:58:48 crc kubenswrapper[4816]: I0311 11:58:48.226150 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 11 11:58:48 crc kubenswrapper[4816]: I0311 11:58:48.228375 4816 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="aadf85da9420b13a17645a1a0b0f3df80b67815ea93eb75f443c9abc547ebd12" exitCode=255
Mar 11 11:58:48 crc kubenswrapper[4816]: I0311 11:58:48.228459 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"aadf85da9420b13a17645a1a0b0f3df80b67815ea93eb75f443c9abc547ebd12"}
Mar 11 11:58:48 crc kubenswrapper[4816]: I0311 11:58:48.228670 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 11:58:48 crc kubenswrapper[4816]: I0311 11:58:48.229451 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:58:48 crc kubenswrapper[4816]: I0311 11:58:48.229480 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:58:48 crc kubenswrapper[4816]: I0311 11:58:48.229490 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:58:48 crc kubenswrapper[4816]: I0311 11:58:48.229977 4816 scope.go:117] "RemoveContainer" containerID="aadf85da9420b13a17645a1a0b0f3df80b67815ea93eb75f443c9abc547ebd12"
Mar 11 11:58:48 crc kubenswrapper[4816]: W0311 11:58:48.263744 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout
Mar 11 11:58:48 crc kubenswrapper[4816]: I0311 11:58:48.263851 4816 trace.go:236] Trace[1718663093]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (11-Mar-2026 11:58:38.261) (total time: 10001ms):
Mar 11 11:58:48 crc kubenswrapper[4816]: Trace[1718663093]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (11:58:48.263)
Mar 11 11:58:48 crc kubenswrapper[4816]: Trace[1718663093]: [10.001870355s] [10.001870355s] END
Mar 11 11:58:48 crc kubenswrapper[4816]: E0311 11:58:48.263878 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Mar 11 11:58:48 crc kubenswrapper[4816]: E0311 11:58:48.268309 4816 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T11:58:48Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 11 11:58:48 crc kubenswrapper[4816]: E0311 11:58:48.270537 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T11:58:48Z is after 2026-02-23T05:33:13Z" interval="6.4s"
Mar 11 11:58:48 crc kubenswrapper[4816]: E0311 11:58:48.275949 4816 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T11:58:48Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 11 11:58:48 crc kubenswrapper[4816]: I0311 11:58:48.276685 4816 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 11 11:58:48 crc kubenswrapper[4816]: I0311 11:58:48.276735 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Mar 11 11:58:48 crc kubenswrapper[4816]: W0311 11:58:48.277783 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T11:58:48Z is after 2026-02-23T05:33:13Z
Mar 11 11:58:48 crc kubenswrapper[4816]: E0311 11:58:48.277836 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T11:58:48Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 11 11:58:48 crc kubenswrapper[4816]: E0311 11:58:48.279046 4816 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T11:58:48Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189bc7935db75286 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:34.046837382 +0000 UTC m=+0.638101349,LastTimestamp:2026-03-11 11:58:34.046837382 +0000 UTC m=+0.638101349,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 11 11:58:48 crc kubenswrapper[4816]: I0311 11:58:48.281083 4816 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 11 11:58:48 crc kubenswrapper[4816]: I0311 11:58:48.281153 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Mar 11 11:58:48 crc kubenswrapper[4816]: I0311 11:58:48.370155 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 11 11:58:49 crc kubenswrapper[4816]: I0311 11:58:49.057360 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T11:58:49Z is after 2026-02-23T05:33:13Z
Mar 11 11:58:49 crc kubenswrapper[4816]: I0311 11:58:49.233717 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 11 11:58:49 crc kubenswrapper[4816]: I0311 11:58:49.234343 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 11 11:58:49 crc kubenswrapper[4816]: I0311 11:58:49.237018 4816 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a2d9a3cb72a07d0729cf0547bd3c77f1b4b47da54ab64802497189af73f6f7c0" exitCode=255
Mar 11 11:58:49 crc kubenswrapper[4816]: I0311 11:58:49.237058 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a2d9a3cb72a07d0729cf0547bd3c77f1b4b47da54ab64802497189af73f6f7c0"}
Mar 11 11:58:49 crc kubenswrapper[4816]: I0311 11:58:49.237109 4816 scope.go:117] "RemoveContainer" containerID="aadf85da9420b13a17645a1a0b0f3df80b67815ea93eb75f443c9abc547ebd12"
Mar 11 11:58:49 crc kubenswrapper[4816]: I0311 11:58:49.237149 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 11:58:49 crc kubenswrapper[4816]: I0311 11:58:49.238087 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:58:49 crc kubenswrapper[4816]: I0311 11:58:49.238121 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar
11 11:58:49 crc kubenswrapper[4816]: I0311 11:58:49.238131 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:58:49 crc kubenswrapper[4816]: I0311 11:58:49.238622 4816 scope.go:117] "RemoveContainer" containerID="a2d9a3cb72a07d0729cf0547bd3c77f1b4b47da54ab64802497189af73f6f7c0" Mar 11 11:58:49 crc kubenswrapper[4816]: E0311 11:58:49.238767 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 11:58:49 crc kubenswrapper[4816]: I0311 11:58:49.411046 4816 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 11 11:58:49 crc kubenswrapper[4816]: I0311 11:58:49.411112 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 11 11:58:50 crc kubenswrapper[4816]: I0311 11:58:50.057658 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is 
not yet valid: current time 2026-03-11T11:58:50Z is after 2026-02-23T05:33:13Z Mar 11 11:58:50 crc kubenswrapper[4816]: I0311 11:58:50.242511 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 11 11:58:50 crc kubenswrapper[4816]: I0311 11:58:50.245155 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:58:50 crc kubenswrapper[4816]: I0311 11:58:50.246296 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:58:50 crc kubenswrapper[4816]: I0311 11:58:50.246506 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:58:50 crc kubenswrapper[4816]: I0311 11:58:50.246641 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:58:50 crc kubenswrapper[4816]: I0311 11:58:50.247551 4816 scope.go:117] "RemoveContainer" containerID="a2d9a3cb72a07d0729cf0547bd3c77f1b4b47da54ab64802497189af73f6f7c0" Mar 11 11:58:50 crc kubenswrapper[4816]: E0311 11:58:50.248005 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 11:58:51 crc kubenswrapper[4816]: I0311 11:58:51.057653 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-11T11:58:51Z is after 2026-02-23T05:33:13Z Mar 11 11:58:51 crc kubenswrapper[4816]: I0311 11:58:51.288668 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 11:58:51 crc kubenswrapper[4816]: I0311 11:58:51.290076 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:58:51 crc kubenswrapper[4816]: I0311 11:58:51.291923 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:58:51 crc kubenswrapper[4816]: I0311 11:58:51.291986 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:58:51 crc kubenswrapper[4816]: I0311 11:58:51.292006 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:58:51 crc kubenswrapper[4816]: I0311 11:58:51.293174 4816 scope.go:117] "RemoveContainer" containerID="a2d9a3cb72a07d0729cf0547bd3c77f1b4b47da54ab64802497189af73f6f7c0" Mar 11 11:58:51 crc kubenswrapper[4816]: E0311 11:58:51.293531 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 11:58:51 crc kubenswrapper[4816]: I0311 11:58:51.296788 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 11:58:51 crc kubenswrapper[4816]: W0311 11:58:51.544843 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T11:58:51Z is after 2026-02-23T05:33:13Z Mar 11 11:58:51 crc kubenswrapper[4816]: E0311 11:58:51.545008 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T11:58:51Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 11 11:58:52 crc kubenswrapper[4816]: I0311 11:58:52.057422 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T11:58:52Z is after 2026-02-23T05:33:13Z Mar 11 11:58:52 crc kubenswrapper[4816]: I0311 11:58:52.250244 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:58:52 crc kubenswrapper[4816]: I0311 11:58:52.251164 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:58:52 crc kubenswrapper[4816]: I0311 11:58:52.251273 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:58:52 crc kubenswrapper[4816]: I0311 11:58:52.251337 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:58:52 crc kubenswrapper[4816]: I0311 11:58:52.251919 4816 scope.go:117] "RemoveContainer" containerID="a2d9a3cb72a07d0729cf0547bd3c77f1b4b47da54ab64802497189af73f6f7c0" 
Mar 11 11:58:52 crc kubenswrapper[4816]: E0311 11:58:52.252374 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 11:58:53 crc kubenswrapper[4816]: I0311 11:58:53.058157 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T11:58:53Z is after 2026-02-23T05:33:13Z Mar 11 11:58:53 crc kubenswrapper[4816]: W0311 11:58:53.551890 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T11:58:53Z is after 2026-02-23T05:33:13Z Mar 11 11:58:53 crc kubenswrapper[4816]: E0311 11:58:53.552021 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T11:58:53Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 11 11:58:53 crc kubenswrapper[4816]: I0311 11:58:53.588129 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 11:58:53 crc kubenswrapper[4816]: 
I0311 11:58:53.588439 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:58:53 crc kubenswrapper[4816]: I0311 11:58:53.590305 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:58:53 crc kubenswrapper[4816]: I0311 11:58:53.590375 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:58:53 crc kubenswrapper[4816]: I0311 11:58:53.590396 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:58:53 crc kubenswrapper[4816]: I0311 11:58:53.591214 4816 scope.go:117] "RemoveContainer" containerID="a2d9a3cb72a07d0729cf0547bd3c77f1b4b47da54ab64802497189af73f6f7c0" Mar 11 11:58:53 crc kubenswrapper[4816]: E0311 11:58:53.591552 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 11:58:53 crc kubenswrapper[4816]: W0311 11:58:53.863435 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T11:58:53Z is after 2026-02-23T05:33:13Z Mar 11 11:58:53 crc kubenswrapper[4816]: E0311 11:58:53.863569 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get 
\"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T11:58:53Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 11 11:58:54 crc kubenswrapper[4816]: I0311 11:58:54.058114 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T11:58:54Z is after 2026-02-23T05:33:13Z Mar 11 11:58:54 crc kubenswrapper[4816]: E0311 11:58:54.218391 4816 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 11 11:58:54 crc kubenswrapper[4816]: I0311 11:58:54.676482 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:58:54 crc kubenswrapper[4816]: E0311 11:58:54.676660 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T11:58:54Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 11 11:58:54 crc kubenswrapper[4816]: I0311 11:58:54.677957 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:58:54 crc kubenswrapper[4816]: I0311 11:58:54.677989 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:58:54 crc kubenswrapper[4816]: I0311 11:58:54.677999 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 
11:58:54 crc kubenswrapper[4816]: I0311 11:58:54.678018 4816 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 11 11:58:54 crc kubenswrapper[4816]: E0311 11:58:54.682616 4816 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T11:58:54Z is after 2026-02-23T05:33:13Z" node="crc" Mar 11 11:58:54 crc kubenswrapper[4816]: I0311 11:58:54.998972 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 11 11:58:55 crc kubenswrapper[4816]: I0311 11:58:55.004179 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:58:55 crc kubenswrapper[4816]: I0311 11:58:55.007038 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:58:55 crc kubenswrapper[4816]: I0311 11:58:55.007312 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:58:55 crc kubenswrapper[4816]: I0311 11:58:55.007413 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:58:55 crc kubenswrapper[4816]: I0311 11:58:55.025101 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 11 11:58:55 crc kubenswrapper[4816]: I0311 11:58:55.056073 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T11:58:55Z is after 2026-02-23T05:33:13Z Mar 11 11:58:55 crc kubenswrapper[4816]: I0311 11:58:55.258183 4816 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Mar 11 11:58:55 crc kubenswrapper[4816]: I0311 11:58:55.259609 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:58:55 crc kubenswrapper[4816]: I0311 11:58:55.259660 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:58:55 crc kubenswrapper[4816]: I0311 11:58:55.259671 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:58:56 crc kubenswrapper[4816]: I0311 11:58:56.058181 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T11:58:56Z is after 2026-02-23T05:33:13Z Mar 11 11:58:56 crc kubenswrapper[4816]: I0311 11:58:56.792870 4816 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 11 11:58:56 crc kubenswrapper[4816]: E0311 11:58:56.798894 4816 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T11:58:56Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 11 11:58:57 crc kubenswrapper[4816]: I0311 11:58:57.058131 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T11:58:57Z is after 2026-02-23T05:33:13Z Mar 11 11:58:58 crc kubenswrapper[4816]: I0311 11:58:58.057597 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T11:58:58Z is after 2026-02-23T05:33:13Z Mar 11 11:58:58 crc kubenswrapper[4816]: E0311 11:58:58.285166 4816 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T11:58:58Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189bc7935db75286 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:34.046837382 +0000 UTC m=+0.638101349,LastTimestamp:2026-03-11 11:58:34.046837382 +0000 UTC m=+0.638101349,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:58:58 crc kubenswrapper[4816]: I0311 11:58:58.370350 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 11:58:58 crc kubenswrapper[4816]: I0311 11:58:58.370631 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:58:58 crc kubenswrapper[4816]: I0311 11:58:58.372482 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:58:58 crc 
kubenswrapper[4816]: I0311 11:58:58.372574 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:58:58 crc kubenswrapper[4816]: I0311 11:58:58.372594 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:58:58 crc kubenswrapper[4816]: I0311 11:58:58.373508 4816 scope.go:117] "RemoveContainer" containerID="a2d9a3cb72a07d0729cf0547bd3c77f1b4b47da54ab64802497189af73f6f7c0" Mar 11 11:58:58 crc kubenswrapper[4816]: E0311 11:58:58.373829 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 11:58:59 crc kubenswrapper[4816]: I0311 11:58:59.057502 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T11:58:59Z is after 2026-02-23T05:33:13Z Mar 11 11:58:59 crc kubenswrapper[4816]: I0311 11:58:59.410933 4816 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 11 11:58:59 crc kubenswrapper[4816]: I0311 11:58:59.411055 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 11 11:58:59 crc kubenswrapper[4816]: I0311 11:58:59.411147 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 11:58:59 crc kubenswrapper[4816]: I0311 11:58:59.411412 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:58:59 crc kubenswrapper[4816]: I0311 11:58:59.413541 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:58:59 crc kubenswrapper[4816]: I0311 11:58:59.413620 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:58:59 crc kubenswrapper[4816]: I0311 11:58:59.413642 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:58:59 crc kubenswrapper[4816]: I0311 11:58:59.414427 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"8e9eb0cfef5e2c64252348119e3fe8ab3b681b4698c01e20a4a8b63aff9fc40c"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 11 11:58:59 crc kubenswrapper[4816]: I0311 11:58:59.414735 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://8e9eb0cfef5e2c64252348119e3fe8ab3b681b4698c01e20a4a8b63aff9fc40c" gracePeriod=30 Mar 11 
11:58:59 crc kubenswrapper[4816]: W0311 11:58:59.975794 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T11:58:59Z is after 2026-02-23T05:33:13Z Mar 11 11:58:59 crc kubenswrapper[4816]: E0311 11:58:59.975910 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T11:58:59Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 11 11:59:00 crc kubenswrapper[4816]: I0311 11:59:00.056565 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T11:59:00Z is after 2026-02-23T05:33:13Z Mar 11 11:59:00 crc kubenswrapper[4816]: I0311 11:59:00.278815 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 11 11:59:00 crc kubenswrapper[4816]: I0311 11:59:00.279480 4816 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="8e9eb0cfef5e2c64252348119e3fe8ab3b681b4698c01e20a4a8b63aff9fc40c" exitCode=255 Mar 11 11:59:00 crc kubenswrapper[4816]: I0311 11:59:00.279571 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"8e9eb0cfef5e2c64252348119e3fe8ab3b681b4698c01e20a4a8b63aff9fc40c"} Mar 11 11:59:00 crc kubenswrapper[4816]: I0311 11:59:00.279657 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9f272527dcd9fc3245a557c4f0ed6ef8e764beb75628022fa8552c24893f4005"} Mar 11 11:59:00 crc kubenswrapper[4816]: I0311 11:59:00.279817 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:59:00 crc kubenswrapper[4816]: I0311 11:59:00.281191 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:00 crc kubenswrapper[4816]: I0311 11:59:00.281234 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:00 crc kubenswrapper[4816]: I0311 11:59:00.281267 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:00 crc kubenswrapper[4816]: W0311 11:59:00.307449 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T11:59:00Z is after 2026-02-23T05:33:13Z Mar 11 11:59:00 crc kubenswrapper[4816]: E0311 11:59:00.307534 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not 
yet valid: current time 2026-03-11T11:59:00Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 11 11:59:00 crc kubenswrapper[4816]: W0311 11:59:00.534099 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T11:59:00Z is after 2026-02-23T05:33:13Z Mar 11 11:59:00 crc kubenswrapper[4816]: E0311 11:59:00.534242 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T11:59:00Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 11 11:59:01 crc kubenswrapper[4816]: I0311 11:59:01.058225 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T11:59:01Z is after 2026-02-23T05:33:13Z Mar 11 11:59:01 crc kubenswrapper[4816]: E0311 11:59:01.681890 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T11:59:01Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 11 11:59:01 crc kubenswrapper[4816]: I0311 11:59:01.683100 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 
11 11:59:01 crc kubenswrapper[4816]: I0311 11:59:01.685175 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:01 crc kubenswrapper[4816]: I0311 11:59:01.685229 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:01 crc kubenswrapper[4816]: I0311 11:59:01.685266 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:01 crc kubenswrapper[4816]: I0311 11:59:01.685302 4816 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 11 11:59:01 crc kubenswrapper[4816]: E0311 11:59:01.693279 4816 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T11:59:01Z is after 2026-02-23T05:33:13Z" node="crc" Mar 11 11:59:02 crc kubenswrapper[4816]: I0311 11:59:02.057548 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T11:59:02Z is after 2026-02-23T05:33:13Z Mar 11 11:59:03 crc kubenswrapper[4816]: I0311 11:59:03.058520 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T11:59:03Z is after 2026-02-23T05:33:13Z Mar 11 11:59:04 crc kubenswrapper[4816]: I0311 11:59:04.057571 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T11:59:04Z is after 2026-02-23T05:33:13Z Mar 11 11:59:04 crc kubenswrapper[4816]: E0311 11:59:04.219560 4816 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 11 11:59:05 crc kubenswrapper[4816]: I0311 11:59:05.055991 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T11:59:05Z is after 2026-02-23T05:33:13Z Mar 11 11:59:05 crc kubenswrapper[4816]: W0311 11:59:05.213505 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T11:59:05Z is after 2026-02-23T05:33:13Z Mar 11 11:59:05 crc kubenswrapper[4816]: E0311 11:59:05.213672 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T11:59:05Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 11 11:59:06 crc kubenswrapper[4816]: I0311 11:59:06.060829 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" 
in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:06 crc kubenswrapper[4816]: I0311 11:59:06.410430 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 11:59:06 crc kubenswrapper[4816]: I0311 11:59:06.410710 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:59:06 crc kubenswrapper[4816]: I0311 11:59:06.412277 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:06 crc kubenswrapper[4816]: I0311 11:59:06.412329 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:06 crc kubenswrapper[4816]: I0311 11:59:06.412339 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:06 crc kubenswrapper[4816]: I0311 11:59:06.425864 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 11:59:07 crc kubenswrapper[4816]: I0311 11:59:07.061531 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:07 crc kubenswrapper[4816]: I0311 11:59:07.299805 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:59:07 crc kubenswrapper[4816]: I0311 11:59:07.301582 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:07 crc kubenswrapper[4816]: I0311 11:59:07.301674 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:07 crc kubenswrapper[4816]: I0311 
11:59:07.301700 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:08 crc kubenswrapper[4816]: I0311 11:59:08.061551 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.293453 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bc7935db75286 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:34.046837382 +0000 UTC m=+0.638101349,LastTimestamp:2026-03-11 11:58:34.046837382 +0000 UTC m=+0.638101349,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.302221 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bc79361d70a2a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:34.116024874 +0000 UTC m=+0.707288881,LastTimestamp:2026-03-11 
11:58:34.116024874 +0000 UTC m=+0.707288881,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.309077 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bc79361d92b2c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:34.116164396 +0000 UTC m=+0.707428383,LastTimestamp:2026-03-11 11:58:34.116164396 +0000 UTC m=+0.707428383,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.316355 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bc79361db055b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:34.116285787 +0000 UTC m=+0.707549784,LastTimestamp:2026-03-11 11:58:34.116285787 +0000 UTC m=+0.707549784,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc 
kubenswrapper[4816]: E0311 11:59:08.324000 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bc793676c91cd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:34.209710541 +0000 UTC m=+0.800974518,LastTimestamp:2026-03-11 11:58:34.209710541 +0000 UTC m=+0.800974518,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.332407 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bc79361d70a2a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bc79361d70a2a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:34.116024874 +0000 UTC m=+0.707288881,LastTimestamp:2026-03-11 11:58:34.231920569 +0000 UTC m=+0.823184546,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.339288 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bc79361d92b2c\" is forbidden: User \"system:anonymous\" cannot 
patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bc79361d92b2c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:34.116164396 +0000 UTC m=+0.707428383,LastTimestamp:2026-03-11 11:58:34.231944869 +0000 UTC m=+0.823208846,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.346287 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bc79361db055b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bc79361db055b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:34.116285787 +0000 UTC m=+0.707549784,LastTimestamp:2026-03-11 11:58:34.231956969 +0000 UTC m=+0.823220946,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.353716 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bc79361d70a2a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bc79361d70a2a default 0 0001-01-01 00:00:00 +0000 UTC map[] 
map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:34.116024874 +0000 UTC m=+0.707288881,LastTimestamp:2026-03-11 11:58:34.233505287 +0000 UTC m=+0.824769274,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.360441 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bc79361d92b2c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bc79361d92b2c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:34.116164396 +0000 UTC m=+0.707428383,LastTimestamp:2026-03-11 11:58:34.233527907 +0000 UTC m=+0.824791894,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.367538 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bc79361db055b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bc79361db055b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:34.116285787 +0000 UTC m=+0.707549784,LastTimestamp:2026-03-11 11:58:34.233546347 +0000 UTC m=+0.824810324,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.374987 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bc79361d70a2a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bc79361d70a2a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:34.116024874 +0000 UTC m=+0.707288881,LastTimestamp:2026-03-11 11:58:34.235012553 +0000 UTC m=+0.826276520,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.383523 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bc79361d92b2c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bc79361d92b2c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc 
status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:34.116164396 +0000 UTC m=+0.707428383,LastTimestamp:2026-03-11 11:58:34.235034624 +0000 UTC m=+0.826298581,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.387018 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bc79361db055b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bc79361db055b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:34.116285787 +0000 UTC m=+0.707549784,LastTimestamp:2026-03-11 11:58:34.235046414 +0000 UTC m=+0.826310381,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.389001 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bc79361d70a2a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bc79361d70a2a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:34.116024874 +0000 UTC 
m=+0.707288881,LastTimestamp:2026-03-11 11:58:34.237005336 +0000 UTC m=+0.828269313,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.395614 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bc79361d92b2c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bc79361d92b2c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:34.116164396 +0000 UTC m=+0.707428383,LastTimestamp:2026-03-11 11:58:34.237032066 +0000 UTC m=+0.828296043,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.402110 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bc79361db055b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bc79361db055b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:34.116285787 +0000 UTC m=+0.707549784,LastTimestamp:2026-03-11 11:58:34.237044626 +0000 UTC m=+0.828308603,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.409315 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bc79361d70a2a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bc79361d70a2a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:34.116024874 +0000 UTC m=+0.707288881,LastTimestamp:2026-03-11 11:58:34.238228849 +0000 UTC m=+0.829492826,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.417406 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bc79361d92b2c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bc79361d92b2c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:34.116164396 +0000 UTC m=+0.707428383,LastTimestamp:2026-03-11 11:58:34.2383206 +0000 UTC m=+0.829584577,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.424692 4816 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bc79361db055b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bc79361db055b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:34.116285787 +0000 UTC m=+0.707549784,LastTimestamp:2026-03-11 11:58:34.238336351 +0000 UTC m=+0.829600338,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.431615 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bc79361d70a2a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bc79361d70a2a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:34.116024874 +0000 UTC m=+0.707288881,LastTimestamp:2026-03-11 11:58:34.239984209 +0000 UTC m=+0.831248186,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.438277 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bc79361d92b2c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" 
in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bc79361d92b2c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:34.116164396 +0000 UTC m=+0.707428383,LastTimestamp:2026-03-11 11:58:34.240005179 +0000 UTC m=+0.831269156,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.443872 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bc79361db055b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bc79361db055b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:34.116285787 +0000 UTC m=+0.707549784,LastTimestamp:2026-03-11 11:58:34.240017219 +0000 UTC m=+0.831281196,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.450861 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bc79361d70a2a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bc79361d70a2a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:34.116024874 +0000 UTC m=+0.707288881,LastTimestamp:2026-03-11 11:58:34.240402304 +0000 UTC m=+0.831666291,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.458337 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bc79361d92b2c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bc79361d92b2c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:34.116164396 +0000 UTC m=+0.707428383,LastTimestamp:2026-03-11 11:58:34.240428954 +0000 UTC m=+0.831692931,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.466487 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bc7937ffebb11 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:34.621942545 +0000 UTC m=+1.213206512,LastTimestamp:2026-03-11 11:58:34.621942545 +0000 UTC m=+1.213206512,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.471733 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189bc793803df309 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:34.626085641 +0000 UTC m=+1.217349608,LastTimestamp:2026-03-11 11:58:34.626085641 +0000 UTC m=+1.217349608,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.477481 4816 event.go:359] "Server rejected event (will 
not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bc79381119f74 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:34.639957876 +0000 UTC m=+1.231221853,LastTimestamp:2026-03-11 11:58:34.639957876 +0000 UTC m=+1.231221853,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.482645 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bc793826b7483 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:34.662622339 +0000 
UTC m=+1.253886336,LastTimestamp:2026-03-11 11:58:34.662622339 +0000 UTC m=+1.253886336,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.488576 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189bc79382efdb58 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:34.671299416 +0000 UTC m=+1.262563383,LastTimestamp:2026-03-11 11:58:34.671299416 +0000 UTC m=+1.262563383,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.490928 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189bc793a2cc85b7 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:35.205854647 +0000 UTC m=+1.797118614,LastTimestamp:2026-03-11 11:58:35.205854647 +0000 UTC m=+1.797118614,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.496949 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bc793a2e12288 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:35.207205512 +0000 UTC m=+1.798469479,LastTimestamp:2026-03-11 11:58:35.207205512 +0000 UTC m=+1.798469479,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.502661 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bc793a2ec6448 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:35.20794324 +0000 UTC m=+1.799207207,LastTimestamp:2026-03-11 11:58:35.20794324 +0000 UTC m=+1.799207207,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.508729 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189bc793a2ed548f openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:35.208004751 +0000 UTC m=+1.799268718,LastTimestamp:2026-03-11 11:58:35.208004751 +0000 UTC m=+1.799268718,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.513841 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bc793a2f6a063 
openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:35.208613987 +0000 UTC m=+1.799877954,LastTimestamp:2026-03-11 11:58:35.208613987 +0000 UTC m=+1.799877954,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.520024 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189bc793a3dd54da openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:35.223733466 +0000 UTC m=+1.814997433,LastTimestamp:2026-03-11 11:58:35.223733466 +0000 UTC m=+1.814997433,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.524762 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in 
the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bc793a3ee258d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:35.224835469 +0000 UTC m=+1.816099436,LastTimestamp:2026-03-11 11:58:35.224835469 +0000 UTC m=+1.816099436,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.530895 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bc793a3eee7cf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:35.224885199 +0000 UTC m=+1.816149176,LastTimestamp:2026-03-11 11:58:35.224885199 +0000 UTC m=+1.816149176,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.536669 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189bc793a3f45795 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:35.225241493 +0000 UTC m=+1.816505470,LastTimestamp:2026-03-11 11:58:35.225241493 +0000 UTC m=+1.816505470,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.542692 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bc793a3f5bd20 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:35.225333024 +0000 UTC m=+1.816597001,LastTimestamp:2026-03-11 11:58:35.225333024 +0000 UTC m=+1.816597001,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.546698 4816 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bc793a41afbb9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:35.227773881 +0000 UTC m=+1.819037848,LastTimestamp:2026-03-11 11:58:35.227773881 +0000 UTC m=+1.819037848,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.553588 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bc793b692fe14 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:35.537628692 +0000 UTC m=+2.128892659,LastTimestamp:2026-03-11 11:58:35.537628692 +0000 UTC 
m=+2.128892659,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.560044 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bc793b7587e19 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:35.550572057 +0000 UTC m=+2.141836014,LastTimestamp:2026-03-11 11:58:35.550572057 +0000 UTC m=+2.141836014,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.564355 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bc793b76afc2c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:35.55178398 +0000 UTC m=+2.143047947,LastTimestamp:2026-03-11 11:58:35.55178398 +0000 UTC m=+2.143047947,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.570542 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bc793c174a902 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:35.72019021 +0000 UTC m=+2.311454187,LastTimestamp:2026-03-11 11:58:35.72019021 +0000 UTC m=+2.311454187,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.578600 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bc793c22598fd openshift-kube-controller-manager 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:35.731785981 +0000 UTC m=+2.323049968,LastTimestamp:2026-03-11 11:58:35.731785981 +0000 UTC m=+2.323049968,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.583287 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bc793c238296a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:35.733002602 +0000 UTC m=+2.324266569,LastTimestamp:2026-03-11 11:58:35.733002602 +0000 UTC m=+2.324266569,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 
11:59:08.589522 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bc793cc1e67df openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:35.899086815 +0000 UTC m=+2.490350782,LastTimestamp:2026-03-11 11:58:35.899086815 +0000 UTC m=+2.490350782,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.593927 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bc793ce12a501 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:35.931870465 +0000 UTC m=+2.523134452,LastTimestamp:2026-03-11 
11:58:35.931870465 +0000 UTC m=+2.523134452,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.598508 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bc793dae41843 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:36.146923587 +0000 UTC m=+2.738187574,LastTimestamp:2026-03-11 11:58:36.146923587 +0000 UTC m=+2.738187574,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.603300 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189bc793db1ef7b5 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container 
image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:36.150781877 +0000 UTC m=+2.742045864,LastTimestamp:2026-03-11 11:58:36.150781877 +0000 UTC m=+2.742045864,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.605653 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189bc793db34bcc8 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:36.152208584 +0000 UTC m=+2.743472551,LastTimestamp:2026-03-11 11:58:36.152208584 +0000 UTC m=+2.743472551,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.612287 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bc793dbc9eaf3 
openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:36.161985267 +0000 UTC m=+2.753249234,LastTimestamp:2026-03-11 11:58:36.161985267 +0000 UTC m=+2.753249234,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.619549 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189bc793e98c6dec openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:36.392836588 +0000 UTC m=+2.984100555,LastTimestamp:2026-03-11 11:58:36.392836588 +0000 UTC m=+2.984100555,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.624748 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bc793e9ab6501 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:36.394865921 +0000 UTC m=+2.986129888,LastTimestamp:2026-03-11 11:58:36.394865921 +0000 UTC m=+2.986129888,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.629350 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bc793e9c7c22b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:36.396724779 +0000 UTC m=+2.987988756,LastTimestamp:2026-03-11 11:58:36.396724779 +0000 UTC m=+2.987988756,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.634902 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189bc793e9d4e183 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:36.397584771 +0000 UTC m=+2.988848738,LastTimestamp:2026-03-11 11:58:36.397584771 +0000 UTC m=+2.988848738,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.640647 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189bc793ea17f0c1 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:36.401979585 +0000 UTC m=+2.993243552,LastTimestamp:2026-03-11 11:58:36.401979585 +0000 UTC m=+2.993243552,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 
11:59:08.647346 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189bc793ea279730 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:36.403005232 +0000 UTC m=+2.994269199,LastTimestamp:2026-03-11 11:58:36.403005232 +0000 UTC m=+2.994269199,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.652219 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bc793eacced84 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:36.413840772 +0000 UTC m=+3.005104739,LastTimestamp:2026-03-11 11:58:36.413840772 +0000 UTC 
m=+3.005104739,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.657409 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bc793ead94553 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:36.414649683 +0000 UTC m=+3.005913660,LastTimestamp:2026-03-11 11:58:36.414649683 +0000 UTC m=+3.005913660,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.662684 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bc793eb4dcbbf openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container 
etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:36.422286271 +0000 UTC m=+3.013550238,LastTimestamp:2026-03-11 11:58:36.422286271 +0000 UTC m=+3.013550238,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.668216 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189bc793eb90dedb openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:36.426682075 +0000 UTC m=+3.017946042,LastTimestamp:2026-03-11 11:58:36.426682075 +0000 UTC m=+3.017946042,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.674710 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189bc793f48ec812 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:36.577540114 +0000 UTC m=+3.168804081,LastTimestamp:2026-03-11 11:58:36.577540114 +0000 UTC m=+3.168804081,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.680434 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bc793f49cd898 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:36.578461848 +0000 UTC m=+3.169725815,LastTimestamp:2026-03-11 11:58:36.578461848 +0000 UTC m=+3.169725815,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.686321 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" 
interval="7s" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.687393 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bc793f55cccf7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:36.591041783 +0000 UTC m=+3.182305750,LastTimestamp:2026-03-11 11:58:36.591041783 +0000 UTC m=+3.182305750,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.692541 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189bc793f57bab0c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:36.593064716 +0000 UTC m=+3.184328683,LastTimestamp:2026-03-11 11:58:36.593064716 +0000 UTC 
m=+3.184328683,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: I0311 11:59:08.694071 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.696682 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bc793f5b0daf2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:36.596550386 +0000 UTC m=+3.187814353,LastTimestamp:2026-03-11 11:58:36.596550386 +0000 UTC m=+3.187814353,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: I0311 11:59:08.697582 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:08 crc kubenswrapper[4816]: I0311 11:59:08.697623 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:08 crc kubenswrapper[4816]: I0311 11:59:08.697635 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 11 11:59:08 crc kubenswrapper[4816]: I0311 11:59:08.697669 4816 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.701661 4816 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.701839 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189bc793f5b68d8a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:36.596923786 +0000 UTC m=+3.188187753,LastTimestamp:2026-03-11 11:58:36.596923786 +0000 UTC m=+3.188187753,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.706576 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189bc794005e9d10 openshift-kube-scheduler 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:36.775709968 +0000 UTC m=+3.366973925,LastTimestamp:2026-03-11 11:58:36.775709968 +0000 UTC m=+3.366973925,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.711174 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bc7940072db12 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:36.777036562 +0000 UTC m=+3.368300529,LastTimestamp:2026-03-11 11:58:36.777036562 +0000 UTC m=+3.368300529,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.716275 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" 
in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189bc7940160cace openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:36.792629966 +0000 UTC m=+3.383893933,LastTimestamp:2026-03-11 11:58:36.792629966 +0000 UTC m=+3.383893933,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.720407 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bc79401a1e753 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:36.796897107 +0000 UTC m=+3.388161064,LastTimestamp:2026-03-11 11:58:36.796897107 +0000 UTC m=+3.388161064,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.724413 4816 event.go:359] 
"Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bc79401b0f68b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:36.797884043 +0000 UTC m=+3.389148010,LastTimestamp:2026-03-11 11:58:36.797884043 +0000 UTC m=+3.389148010,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.730783 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bc7940b7cb12b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:36.962230571 +0000 UTC m=+3.553494538,LastTimestamp:2026-03-11 11:58:36.962230571 +0000 UTC 
m=+3.553494538,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.735395 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bc7940c3c1714 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:36.974774036 +0000 UTC m=+3.566038003,LastTimestamp:2026-03-11 11:58:36.974774036 +0000 UTC m=+3.566038003,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.740861 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bc7940c4d57c1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:36.975904705 +0000 UTC m=+3.567168672,LastTimestamp:2026-03-11 11:58:36.975904705 +0000 UTC m=+3.567168672,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.745239 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bc79415c8fb26 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:37.135002406 +0000 UTC m=+3.726266393,LastTimestamp:2026-03-11 11:58:37.135002406 +0000 UTC m=+3.726266393,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.750813 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bc7941669747d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:37.145519229 +0000 UTC m=+3.736783196,LastTimestamp:2026-03-11 11:58:37.145519229 +0000 UTC m=+3.736783196,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.759159 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bc7941814a766 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:37.173516134 +0000 UTC m=+3.764780101,LastTimestamp:2026-03-11 11:58:37.173516134 +0000 UTC m=+3.764780101,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.765976 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bc794238de22b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:37.366010411 +0000 UTC m=+3.957274378,LastTimestamp:2026-03-11 11:58:37.366010411 +0000 UTC m=+3.957274378,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.772062 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bc794246fa453 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:37.380805715 +0000 UTC m=+3.972069682,LastTimestamp:2026-03-11 11:58:37.380805715 +0000 UTC m=+3.972069682,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.778003 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189bc7945438d70a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:38.182520586 +0000 UTC m=+4.773784593,LastTimestamp:2026-03-11 11:58:38.182520586 +0000 UTC m=+4.773784593,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.785283 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bc79461e1b49c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:38.411691164 +0000 UTC m=+5.002955171,LastTimestamp:2026-03-11 11:58:38.411691164 +0000 UTC m=+5.002955171,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.792891 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group 
\"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bc794629ddef5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:38.424022773 +0000 UTC m=+5.015286770,LastTimestamp:2026-03-11 11:58:38.424022773 +0000 UTC m=+5.015286770,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.797664 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bc79462b1eb3d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:38.425336637 +0000 UTC m=+5.016600644,LastTimestamp:2026-03-11 11:58:38.425336637 +0000 UTC m=+5.016600644,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.800909 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot 
create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bc7946ee93344 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:38.630286148 +0000 UTC m=+5.221550115,LastTimestamp:2026-03-11 11:58:38.630286148 +0000 UTC m=+5.221550115,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.805369 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bc7946fba318c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:38.643982732 +0000 UTC m=+5.235246709,LastTimestamp:2026-03-11 11:58:38.643982732 +0000 UTC m=+5.235246709,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.806521 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189bc7946fccb2dc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:38.645195484 +0000 UTC m=+5.236459461,LastTimestamp:2026-03-11 11:58:38.645195484 +0000 UTC m=+5.236459461,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.810541 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bc7947952496f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:38.804945263 +0000 UTC m=+5.396209230,LastTimestamp:2026-03-11 11:58:38.804945263 +0000 UTC m=+5.396209230,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.814112 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" 
in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bc79479d55465 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:38.813533285 +0000 UTC m=+5.404797252,LastTimestamp:2026-03-11 11:58:38.813533285 +0000 UTC m=+5.404797252,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.818132 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bc79479e926bf openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:38.814832319 +0000 UTC m=+5.406096286,LastTimestamp:2026-03-11 11:58:38.814832319 +0000 UTC m=+5.406096286,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.822866 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bc79485e672cb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:39.015981771 +0000 UTC m=+5.607245738,LastTimestamp:2026-03-11 11:58:39.015981771 +0000 UTC m=+5.607245738,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.828056 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bc79486843ff9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:39.026323449 +0000 UTC m=+5.617587436,LastTimestamp:2026-03-11 11:58:39.026323449 +0000 UTC m=+5.617587436,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.832233 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bc79486972107 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:39.027560711 +0000 UTC m=+5.618824688,LastTimestamp:2026-03-11 11:58:39.027560711 +0000 UTC m=+5.618824688,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.835992 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bc794944ec06d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:39.257698413 +0000 UTC m=+5.848962420,LastTimestamp:2026-03-11 11:58:39.257698413 +0000 UTC m=+5.848962420,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.839692 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bc79495664386 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:39.276016518 +0000 UTC m=+5.867280525,LastTimestamp:2026-03-11 11:58:39.276016518 +0000 UTC m=+5.867280525,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.844022 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 11 11:59:08 crc kubenswrapper[4816]: &Event{ObjectMeta:{kube-controller-manager-crc.189bc7949d734768 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 11 11:59:08 crc kubenswrapper[4816]: body: Mar 11 11:59:08 crc kubenswrapper[4816]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:39.411087208 +0000 UTC m=+6.002351215,LastTimestamp:2026-03-11 11:58:39.411087208 +0000 UTC 
m=+6.002351215,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 11 11:59:08 crc kubenswrapper[4816]: > Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.848167 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bc7949d74c202 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:39.41118413 +0000 UTC m=+6.002448147,LastTimestamp:2026-03-11 11:58:39.41118413 +0000 UTC m=+6.002448147,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.856610 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189bc7940c4d57c1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bc7940c4d57c1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:36.975904705 +0000 UTC m=+3.567168672,LastTimestamp:2026-03-11 11:58:48.230863738 +0000 UTC m=+14.822127705,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.861187 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 11 11:59:08 crc kubenswrapper[4816]: &Event{ObjectMeta:{kube-apiserver-crc.189bc796ade212da openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 11 11:59:08 crc kubenswrapper[4816]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 11 11:59:08 crc kubenswrapper[4816]: Mar 11 11:59:08 crc kubenswrapper[4816]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:48.276718298 +0000 UTC m=+14.867982265,LastTimestamp:2026-03-11 
11:58:48.276718298 +0000 UTC m=+14.867982265,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 11 11:59:08 crc kubenswrapper[4816]: > Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.865566 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bc796ade29af3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:48.276753139 +0000 UTC m=+14.868017106,LastTimestamp:2026-03-11 11:58:48.276753139 +0000 UTC m=+14.868017106,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.869898 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189bc796ade212da\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 11 11:59:08 crc kubenswrapper[4816]: &Event{ObjectMeta:{kube-apiserver-crc.189bc796ade212da openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 11 11:59:08 crc kubenswrapper[4816]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 11 11:59:08 crc kubenswrapper[4816]: Mar 11 11:59:08 crc kubenswrapper[4816]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:48.276718298 +0000 UTC m=+14.867982265,LastTimestamp:2026-03-11 11:58:48.281131904 +0000 UTC m=+14.872395871,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 11 11:59:08 crc kubenswrapper[4816]: > Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.873768 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189bc796ade29af3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bc796ade29af3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:48.276753139 +0000 UTC m=+14.868017106,LastTimestamp:2026-03-11 11:58:48.281179745 +0000 UTC m=+14.872443712,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.877637 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189bc79415c8fb26\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bc79415c8fb26 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:37.135002406 +0000 UTC m=+3.726266393,LastTimestamp:2026-03-11 11:58:48.448588537 +0000 UTC m=+15.039852504,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.882052 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189bc7941669747d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bc7941669747d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container 
kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:37.145519229 +0000 UTC m=+3.736783196,LastTimestamp:2026-03-11 11:58:48.460456216 +0000 UTC m=+15.051720183,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.886463 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189bc7949d734768\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 11 11:59:08 crc kubenswrapper[4816]: &Event{ObjectMeta:{kube-controller-manager-crc.189bc7949d734768 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 11 11:59:08 crc kubenswrapper[4816]: body: Mar 11 11:59:08 crc kubenswrapper[4816]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:39.411087208 +0000 UTC m=+6.002351215,LastTimestamp:2026-03-11 11:58:49.411094772 +0000 UTC m=+16.002358739,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 11 11:59:08 crc kubenswrapper[4816]: > Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.890409 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189bc7949d74c202\" is forbidden: User 
\"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bc7949d74c202 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:39.41118413 +0000 UTC m=+6.002448147,LastTimestamp:2026-03-11 11:58:49.411145233 +0000 UTC m=+16.002409200,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.898832 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189bc7949d734768\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 11 11:59:08 crc kubenswrapper[4816]: &Event{ObjectMeta:{kube-controller-manager-crc.189bc7949d734768 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout 
exceeded while awaiting headers) Mar 11 11:59:08 crc kubenswrapper[4816]: body: Mar 11 11:59:08 crc kubenswrapper[4816]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:39.411087208 +0000 UTC m=+6.002351215,LastTimestamp:2026-03-11 11:58:59.411002151 +0000 UTC m=+26.002266168,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 11 11:59:08 crc kubenswrapper[4816]: > Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.905131 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189bc7949d74c202\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bc7949d74c202 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:39.41118413 +0000 UTC m=+6.002448147,LastTimestamp:2026-03-11 11:58:59.411101904 +0000 UTC m=+26.002365901,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.908867 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bc79945c23947 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:59.414702407 +0000 UTC m=+26.005966414,LastTimestamp:2026-03-11 11:58:59.414702407 +0000 UTC m=+26.005966414,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.910735 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189bc793a41afbb9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bc793a41afbb9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:35.227773881 +0000 UTC m=+1.819037848,LastTimestamp:2026-03-11 11:58:59.547968573 +0000 UTC 
m=+26.139232580,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.915099 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189bc793b692fe14\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bc793b692fe14 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:35.537628692 +0000 UTC m=+2.128892659,LastTimestamp:2026-03-11 11:58:59.809172545 +0000 UTC m=+26.400436512,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.921945 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189bc793b7587e19\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bc793b7587e19 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:35.550572057 +0000 UTC m=+2.141836014,LastTimestamp:2026-03-11 11:58:59.823575016 +0000 UTC m=+26.414838983,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:09 crc kubenswrapper[4816]: I0311 11:59:09.060789 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:09 crc kubenswrapper[4816]: I0311 11:59:09.130404 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:59:09 crc kubenswrapper[4816]: I0311 11:59:09.132018 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:09 crc kubenswrapper[4816]: I0311 11:59:09.132086 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:09 crc kubenswrapper[4816]: I0311 11:59:09.132108 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:09 crc kubenswrapper[4816]: I0311 11:59:09.132967 4816 scope.go:117] "RemoveContainer" containerID="a2d9a3cb72a07d0729cf0547bd3c77f1b4b47da54ab64802497189af73f6f7c0" Mar 11 11:59:09 crc kubenswrapper[4816]: I0311 11:59:09.410761 4816 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller 
namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 11 11:59:09 crc kubenswrapper[4816]: I0311 11:59:09.410879 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 11 11:59:09 crc kubenswrapper[4816]: E0311 11:59:09.419831 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189bc7949d734768\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 11 11:59:09 crc kubenswrapper[4816]: &Event{ObjectMeta:{kube-controller-manager-crc.189bc7949d734768 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 11 11:59:09 crc kubenswrapper[4816]: body: Mar 11 11:59:09 crc kubenswrapper[4816]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:39.411087208 +0000 UTC m=+6.002351215,LastTimestamp:2026-03-11 11:59:09.410843548 +0000 UTC m=+36.002107545,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 
+0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 11 11:59:09 crc kubenswrapper[4816]: > Mar 11 11:59:09 crc kubenswrapper[4816]: E0311 11:59:09.425690 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189bc7949d74c202\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bc7949d74c202 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:39.41118413 +0000 UTC m=+6.002448147,LastTimestamp:2026-03-11 11:59:09.41091313 +0000 UTC m=+36.002177127,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:10 crc kubenswrapper[4816]: I0311 11:59:10.056708 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:10 crc kubenswrapper[4816]: I0311 11:59:10.310505 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 11 11:59:10 crc kubenswrapper[4816]: I0311 
11:59:10.311418 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 11 11:59:10 crc kubenswrapper[4816]: I0311 11:59:10.313108 4816 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="10676b0f39f00de057afcab21eeab6daa9b8e1f4d47611dc59ede1bad64c2773" exitCode=255 Mar 11 11:59:10 crc kubenswrapper[4816]: I0311 11:59:10.313143 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"10676b0f39f00de057afcab21eeab6daa9b8e1f4d47611dc59ede1bad64c2773"} Mar 11 11:59:10 crc kubenswrapper[4816]: I0311 11:59:10.313175 4816 scope.go:117] "RemoveContainer" containerID="a2d9a3cb72a07d0729cf0547bd3c77f1b4b47da54ab64802497189af73f6f7c0" Mar 11 11:59:10 crc kubenswrapper[4816]: I0311 11:59:10.313337 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:59:10 crc kubenswrapper[4816]: I0311 11:59:10.314232 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:10 crc kubenswrapper[4816]: I0311 11:59:10.314275 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:10 crc kubenswrapper[4816]: I0311 11:59:10.314285 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:10 crc kubenswrapper[4816]: I0311 11:59:10.314807 4816 scope.go:117] "RemoveContainer" containerID="10676b0f39f00de057afcab21eeab6daa9b8e1f4d47611dc59ede1bad64c2773" Mar 11 11:59:10 crc kubenswrapper[4816]: E0311 11:59:10.314988 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with 
CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 11:59:11 crc kubenswrapper[4816]: I0311 11:59:11.061237 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:11 crc kubenswrapper[4816]: I0311 11:59:11.319774 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 11 11:59:12 crc kubenswrapper[4816]: I0311 11:59:12.059271 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:13 crc kubenswrapper[4816]: I0311 11:59:13.061049 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:13 crc kubenswrapper[4816]: I0311 11:59:13.211235 4816 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 11 11:59:13 crc kubenswrapper[4816]: I0311 11:59:13.225020 4816 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 11 11:59:13 crc kubenswrapper[4816]: I0311 11:59:13.587480 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 11:59:13 crc kubenswrapper[4816]: 
I0311 11:59:13.587671 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:59:13 crc kubenswrapper[4816]: I0311 11:59:13.588692 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:13 crc kubenswrapper[4816]: I0311 11:59:13.588746 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:13 crc kubenswrapper[4816]: I0311 11:59:13.588759 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:13 crc kubenswrapper[4816]: I0311 11:59:13.589204 4816 scope.go:117] "RemoveContainer" containerID="10676b0f39f00de057afcab21eeab6daa9b8e1f4d47611dc59ede1bad64c2773" Mar 11 11:59:13 crc kubenswrapper[4816]: E0311 11:59:13.589407 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 11:59:14 crc kubenswrapper[4816]: I0311 11:59:14.056310 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:14 crc kubenswrapper[4816]: E0311 11:59:14.220519 4816 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 11 11:59:15 crc kubenswrapper[4816]: I0311 11:59:15.058725 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User 
"system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:15 crc kubenswrapper[4816]: E0311 11:59:15.692859 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 11 11:59:15 crc kubenswrapper[4816]: I0311 11:59:15.701986 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:59:15 crc kubenswrapper[4816]: I0311 11:59:15.703738 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:15 crc kubenswrapper[4816]: I0311 11:59:15.703809 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:15 crc kubenswrapper[4816]: I0311 11:59:15.703835 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:15 crc kubenswrapper[4816]: I0311 11:59:15.703882 4816 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 11 11:59:15 crc kubenswrapper[4816]: E0311 11:59:15.710698 4816 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 11 11:59:16 crc kubenswrapper[4816]: I0311 11:59:16.055984 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:16 crc kubenswrapper[4816]: I0311 11:59:16.417717 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 11:59:16 crc kubenswrapper[4816]: I0311 11:59:16.417958 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:59:16 crc kubenswrapper[4816]: I0311 11:59:16.419650 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:16 crc kubenswrapper[4816]: I0311 11:59:16.419698 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:16 crc kubenswrapper[4816]: I0311 11:59:16.419713 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:16 crc kubenswrapper[4816]: I0311 11:59:16.424384 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 11:59:17 crc kubenswrapper[4816]: I0311 11:59:17.056340 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:17 crc kubenswrapper[4816]: I0311 11:59:17.337738 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:59:17 crc kubenswrapper[4816]: I0311 11:59:17.339081 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:17 crc kubenswrapper[4816]: I0311 11:59:17.339270 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:17 crc kubenswrapper[4816]: I0311 11:59:17.339389 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:18 crc kubenswrapper[4816]: I0311 
11:59:18.059562 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:18 crc kubenswrapper[4816]: I0311 11:59:18.370352 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 11:59:18 crc kubenswrapper[4816]: I0311 11:59:18.370558 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:59:18 crc kubenswrapper[4816]: I0311 11:59:18.371605 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:18 crc kubenswrapper[4816]: I0311 11:59:18.371660 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:18 crc kubenswrapper[4816]: I0311 11:59:18.371675 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:18 crc kubenswrapper[4816]: I0311 11:59:18.372176 4816 scope.go:117] "RemoveContainer" containerID="10676b0f39f00de057afcab21eeab6daa9b8e1f4d47611dc59ede1bad64c2773" Mar 11 11:59:18 crc kubenswrapper[4816]: E0311 11:59:18.372358 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 11:59:18 crc kubenswrapper[4816]: W0311 11:59:18.789777 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource 
"services" in API group "" at the cluster scope Mar 11 11:59:18 crc kubenswrapper[4816]: E0311 11:59:18.789843 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 11 11:59:19 crc kubenswrapper[4816]: I0311 11:59:19.057278 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:20 crc kubenswrapper[4816]: I0311 11:59:20.056737 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:20 crc kubenswrapper[4816]: W0311 11:59:20.727312 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:20 crc kubenswrapper[4816]: E0311 11:59:20.727367 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 11 11:59:21 crc kubenswrapper[4816]: I0311 11:59:21.055695 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the 
cluster scope Mar 11 11:59:22 crc kubenswrapper[4816]: I0311 11:59:22.060316 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:22 crc kubenswrapper[4816]: E0311 11:59:22.696993 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 11 11:59:22 crc kubenswrapper[4816]: I0311 11:59:22.711072 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:59:22 crc kubenswrapper[4816]: I0311 11:59:22.712375 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:22 crc kubenswrapper[4816]: I0311 11:59:22.712440 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:22 crc kubenswrapper[4816]: I0311 11:59:22.712460 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:22 crc kubenswrapper[4816]: I0311 11:59:22.712501 4816 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 11 11:59:22 crc kubenswrapper[4816]: E0311 11:59:22.716671 4816 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 11 11:59:23 crc kubenswrapper[4816]: I0311 11:59:23.056644 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API 
group "storage.k8s.io" at the cluster scope Mar 11 11:59:23 crc kubenswrapper[4816]: I0311 11:59:23.576349 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 11 11:59:23 crc kubenswrapper[4816]: I0311 11:59:23.577194 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:59:23 crc kubenswrapper[4816]: I0311 11:59:23.578573 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:23 crc kubenswrapper[4816]: I0311 11:59:23.578608 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:23 crc kubenswrapper[4816]: I0311 11:59:23.578619 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:24 crc kubenswrapper[4816]: I0311 11:59:24.058825 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:24 crc kubenswrapper[4816]: E0311 11:59:24.221353 4816 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 11 11:59:25 crc kubenswrapper[4816]: I0311 11:59:25.057153 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:25 crc kubenswrapper[4816]: W0311 11:59:25.213735 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 11 
11:59:25 crc kubenswrapper[4816]: E0311 11:59:25.213792 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 11 11:59:25 crc kubenswrapper[4816]: W0311 11:59:25.878611 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 11 11:59:25 crc kubenswrapper[4816]: E0311 11:59:25.878669 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 11 11:59:26 crc kubenswrapper[4816]: I0311 11:59:26.057644 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:27 crc kubenswrapper[4816]: I0311 11:59:27.057707 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:28 crc kubenswrapper[4816]: I0311 11:59:28.057690 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:29 crc 
kubenswrapper[4816]: I0311 11:59:29.058189 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:29 crc kubenswrapper[4816]: I0311 11:59:29.129767 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:59:29 crc kubenswrapper[4816]: I0311 11:59:29.131452 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:29 crc kubenswrapper[4816]: I0311 11:59:29.131511 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:29 crc kubenswrapper[4816]: I0311 11:59:29.131532 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:29 crc kubenswrapper[4816]: I0311 11:59:29.132377 4816 scope.go:117] "RemoveContainer" containerID="10676b0f39f00de057afcab21eeab6daa9b8e1f4d47611dc59ede1bad64c2773" Mar 11 11:59:29 crc kubenswrapper[4816]: E0311 11:59:29.132663 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 11:59:29 crc kubenswrapper[4816]: E0311 11:59:29.706181 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 11 11:59:29 crc kubenswrapper[4816]: I0311 
11:59:29.717292 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:59:29 crc kubenswrapper[4816]: I0311 11:59:29.719435 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:29 crc kubenswrapper[4816]: I0311 11:59:29.719495 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:29 crc kubenswrapper[4816]: I0311 11:59:29.719522 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:29 crc kubenswrapper[4816]: I0311 11:59:29.719595 4816 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 11 11:59:29 crc kubenswrapper[4816]: E0311 11:59:29.728339 4816 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 11 11:59:30 crc kubenswrapper[4816]: I0311 11:59:30.058217 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:31 crc kubenswrapper[4816]: I0311 11:59:31.059482 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:32 crc kubenswrapper[4816]: I0311 11:59:32.060144 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:33 crc kubenswrapper[4816]: I0311 
11:59:33.059166 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:34 crc kubenswrapper[4816]: I0311 11:59:34.059119 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:34 crc kubenswrapper[4816]: E0311 11:59:34.221812 4816 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 11 11:59:35 crc kubenswrapper[4816]: I0311 11:59:35.059360 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:36 crc kubenswrapper[4816]: I0311 11:59:36.058620 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:36 crc kubenswrapper[4816]: E0311 11:59:36.714383 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 11 11:59:36 crc kubenswrapper[4816]: I0311 11:59:36.730547 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:59:36 crc kubenswrapper[4816]: I0311 11:59:36.732709 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 11 11:59:36 crc kubenswrapper[4816]: I0311 11:59:36.732755 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:36 crc kubenswrapper[4816]: I0311 11:59:36.732768 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:36 crc kubenswrapper[4816]: I0311 11:59:36.732800 4816 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 11 11:59:36 crc kubenswrapper[4816]: E0311 11:59:36.739051 4816 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 11 11:59:37 crc kubenswrapper[4816]: I0311 11:59:37.058659 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:38 crc kubenswrapper[4816]: I0311 11:59:38.059482 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:38 crc kubenswrapper[4816]: I0311 11:59:38.567749 4816 csr.go:261] certificate signing request csr-j57dc is approved, waiting to be issued Mar 11 11:59:38 crc kubenswrapper[4816]: I0311 11:59:38.578002 4816 csr.go:257] certificate signing request csr-j57dc is issued Mar 11 11:59:38 crc kubenswrapper[4816]: I0311 11:59:38.660848 4816 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 11 11:59:38 crc kubenswrapper[4816]: I0311 11:59:38.915607 4816 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 11 
11:59:39 crc kubenswrapper[4816]: I0311 11:59:39.579732 4816 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-02 00:43:05.175896057 +0000 UTC Mar 11 11:59:39 crc kubenswrapper[4816]: I0311 11:59:39.579811 4816 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6372h43m25.596089728s for next certificate rotation Mar 11 11:59:40 crc kubenswrapper[4816]: I0311 11:59:40.130092 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:59:40 crc kubenswrapper[4816]: I0311 11:59:40.131912 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:40 crc kubenswrapper[4816]: I0311 11:59:40.131982 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:40 crc kubenswrapper[4816]: I0311 11:59:40.132007 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:40 crc kubenswrapper[4816]: I0311 11:59:40.133046 4816 scope.go:117] "RemoveContainer" containerID="10676b0f39f00de057afcab21eeab6daa9b8e1f4d47611dc59ede1bad64c2773" Mar 11 11:59:40 crc kubenswrapper[4816]: I0311 11:59:40.402708 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 11 11:59:40 crc kubenswrapper[4816]: I0311 11:59:40.406586 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd"} Mar 11 11:59:40 crc kubenswrapper[4816]: I0311 11:59:40.406941 4816 kubelet_node_status.go:401] "Setting node annotation to 
enable volume controller attach/detach" Mar 11 11:59:40 crc kubenswrapper[4816]: I0311 11:59:40.408599 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:40 crc kubenswrapper[4816]: I0311 11:59:40.408630 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:40 crc kubenswrapper[4816]: I0311 11:59:40.408641 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:41 crc kubenswrapper[4816]: I0311 11:59:41.410848 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 11 11:59:41 crc kubenswrapper[4816]: I0311 11:59:41.411383 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 11 11:59:41 crc kubenswrapper[4816]: I0311 11:59:41.412889 4816 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd" exitCode=255 Mar 11 11:59:41 crc kubenswrapper[4816]: I0311 11:59:41.412923 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd"} Mar 11 11:59:41 crc kubenswrapper[4816]: I0311 11:59:41.412956 4816 scope.go:117] "RemoveContainer" containerID="10676b0f39f00de057afcab21eeab6daa9b8e1f4d47611dc59ede1bad64c2773" Mar 11 11:59:41 crc kubenswrapper[4816]: I0311 11:59:41.413085 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:59:41 crc kubenswrapper[4816]: I0311 
11:59:41.414036 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:41 crc kubenswrapper[4816]: I0311 11:59:41.414055 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:41 crc kubenswrapper[4816]: I0311 11:59:41.414072 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:41 crc kubenswrapper[4816]: I0311 11:59:41.414523 4816 scope.go:117] "RemoveContainer" containerID="eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd" Mar 11 11:59:41 crc kubenswrapper[4816]: E0311 11:59:41.414705 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 11:59:42 crc kubenswrapper[4816]: I0311 11:59:42.417019 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.588334 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.588690 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.590267 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.590306 4816 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.590315 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.590933 4816 scope.go:117] "RemoveContainer" containerID="eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd" Mar 11 11:59:43 crc kubenswrapper[4816]: E0311 11:59:43.591098 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.739826 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.741435 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.741487 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.741504 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.741633 4816 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.751705 4816 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.751952 4816 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 11 11:59:43 crc 
kubenswrapper[4816]: E0311 11:59:43.751976 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.755928 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.755959 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.755969 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.755985 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.755995 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:43Z","lastTransitionTime":"2026-03-11T11:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 11:59:43 crc kubenswrapper[4816]: E0311 11:59:43.768983 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"91fc6571-6a6d-490b-83e1-c64cf62c773c\\\",\\\"systemUUID\\\":\\\"bbfa0147-7ad8-4a96-81ed-304e5bc4397b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.775671 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.775728 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.775743 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.775762 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.775775 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:43Z","lastTransitionTime":"2026-03-11T11:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 11:59:43 crc kubenswrapper[4816]: E0311 11:59:43.787099 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"91fc6571-6a6d-490b-83e1-c64cf62c773c\\\",\\\"systemUUID\\\":\\\"bbfa0147-7ad8-4a96-81ed-304e5bc4397b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.796606 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.796661 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.796694 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.796716 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.796725 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:43Z","lastTransitionTime":"2026-03-11T11:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 11:59:43 crc kubenswrapper[4816]: E0311 11:59:43.807809 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"91fc6571-6a6d-490b-83e1-c64cf62c773c\\\",\\\"systemUUID\\\":\\\"bbfa0147-7ad8-4a96-81ed-304e5bc4397b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.816824 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.816886 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.816909 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.816935 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.816953 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:43Z","lastTransitionTime":"2026-03-11T11:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 11:59:43 crc kubenswrapper[4816]: E0311 11:59:43.834794 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"91fc6571-6a6d-490b-83e1-c64cf62c773c\\\",\\\"systemUUID\\\":\\\"bbfa0147-7ad8-4a96-81ed-304e5bc4397b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 11:59:43 crc kubenswrapper[4816]: E0311 11:59:43.835023 4816 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 11 11:59:43 crc kubenswrapper[4816]: E0311 11:59:43.835077 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:43 crc kubenswrapper[4816]: E0311 11:59:43.936238 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:44 crc kubenswrapper[4816]: E0311 11:59:44.036736 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:44 crc kubenswrapper[4816]: E0311 11:59:44.136984 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:44 crc kubenswrapper[4816]: E0311 11:59:44.222829 4816 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 11 11:59:44 crc kubenswrapper[4816]: E0311 11:59:44.238116 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:44 crc kubenswrapper[4816]: E0311 11:59:44.338554 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:44 crc kubenswrapper[4816]: E0311 11:59:44.439162 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:44 crc kubenswrapper[4816]: E0311 11:59:44.540411 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:44 crc kubenswrapper[4816]: 
E0311 11:59:44.641290 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:44 crc kubenswrapper[4816]: E0311 11:59:44.742448 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:44 crc kubenswrapper[4816]: E0311 11:59:44.842642 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:44 crc kubenswrapper[4816]: E0311 11:59:44.942911 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:45 crc kubenswrapper[4816]: E0311 11:59:45.044001 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:45 crc kubenswrapper[4816]: E0311 11:59:45.145101 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:45 crc kubenswrapper[4816]: E0311 11:59:45.246171 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:45 crc kubenswrapper[4816]: E0311 11:59:45.346887 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:45 crc kubenswrapper[4816]: E0311 11:59:45.447504 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:45 crc kubenswrapper[4816]: E0311 11:59:45.547646 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:45 crc kubenswrapper[4816]: E0311 11:59:45.648585 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:45 crc kubenswrapper[4816]: E0311 11:59:45.749000 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" 
Mar 11 11:59:45 crc kubenswrapper[4816]: E0311 11:59:45.849418 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:45 crc kubenswrapper[4816]: E0311 11:59:45.950444 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:46 crc kubenswrapper[4816]: E0311 11:59:46.051380 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:46 crc kubenswrapper[4816]: E0311 11:59:46.151834 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:46 crc kubenswrapper[4816]: E0311 11:59:46.251919 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:46 crc kubenswrapper[4816]: E0311 11:59:46.352838 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:46 crc kubenswrapper[4816]: E0311 11:59:46.453935 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:46 crc kubenswrapper[4816]: E0311 11:59:46.554369 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:46 crc kubenswrapper[4816]: E0311 11:59:46.654739 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:46 crc kubenswrapper[4816]: E0311 11:59:46.755677 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:46 crc kubenswrapper[4816]: E0311 11:59:46.855833 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:46 crc kubenswrapper[4816]: E0311 11:59:46.956156 4816 kubelet_node_status.go:503] "Error getting the current node 
from lister" err="node \"crc\" not found" Mar 11 11:59:47 crc kubenswrapper[4816]: E0311 11:59:47.056338 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:47 crc kubenswrapper[4816]: E0311 11:59:47.157041 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:47 crc kubenswrapper[4816]: E0311 11:59:47.257977 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:47 crc kubenswrapper[4816]: E0311 11:59:47.358091 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:47 crc kubenswrapper[4816]: E0311 11:59:47.459209 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:47 crc kubenswrapper[4816]: E0311 11:59:47.559648 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:47 crc kubenswrapper[4816]: E0311 11:59:47.660661 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:47 crc kubenswrapper[4816]: E0311 11:59:47.761769 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:47 crc kubenswrapper[4816]: E0311 11:59:47.862919 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:47 crc kubenswrapper[4816]: E0311 11:59:47.963316 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:48 crc kubenswrapper[4816]: E0311 11:59:48.063772 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:48 crc kubenswrapper[4816]: E0311 11:59:48.164745 4816 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:48 crc kubenswrapper[4816]: E0311 11:59:48.265230 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:48 crc kubenswrapper[4816]: E0311 11:59:48.366047 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:48 crc kubenswrapper[4816]: I0311 11:59:48.370338 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 11:59:48 crc kubenswrapper[4816]: I0311 11:59:48.370522 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:59:48 crc kubenswrapper[4816]: I0311 11:59:48.371969 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:48 crc kubenswrapper[4816]: I0311 11:59:48.371997 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:48 crc kubenswrapper[4816]: I0311 11:59:48.372005 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:48 crc kubenswrapper[4816]: I0311 11:59:48.372499 4816 scope.go:117] "RemoveContainer" containerID="eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd" Mar 11 11:59:48 crc kubenswrapper[4816]: E0311 11:59:48.372724 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 11:59:48 crc kubenswrapper[4816]: 
E0311 11:59:48.466602 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:48 crc kubenswrapper[4816]: E0311 11:59:48.566989 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:48 crc kubenswrapper[4816]: E0311 11:59:48.667691 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:48 crc kubenswrapper[4816]: E0311 11:59:48.768696 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:48 crc kubenswrapper[4816]: E0311 11:59:48.869949 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:48 crc kubenswrapper[4816]: E0311 11:59:48.970400 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:49 crc kubenswrapper[4816]: E0311 11:59:49.070917 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:49 crc kubenswrapper[4816]: E0311 11:59:49.171502 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:49 crc kubenswrapper[4816]: E0311 11:59:49.271938 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:49 crc kubenswrapper[4816]: E0311 11:59:49.372061 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:49 crc kubenswrapper[4816]: E0311 11:59:49.473144 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:49 crc kubenswrapper[4816]: E0311 11:59:49.573541 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:49 crc kubenswrapper[4816]: E0311 11:59:49.674595 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:49 crc kubenswrapper[4816]: E0311 11:59:49.775820 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:49 crc kubenswrapper[4816]: E0311 11:59:49.875934 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:49 crc kubenswrapper[4816]: E0311 11:59:49.976808 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:50 crc kubenswrapper[4816]: E0311 11:59:50.077846 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:50 crc kubenswrapper[4816]: E0311 11:59:50.178450 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:50 crc kubenswrapper[4816]: E0311 11:59:50.278653 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:50 crc kubenswrapper[4816]: E0311 11:59:50.379760 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:50 crc kubenswrapper[4816]: E0311 11:59:50.480625 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:50 crc kubenswrapper[4816]: E0311 11:59:50.581533 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:50 crc kubenswrapper[4816]: E0311 11:59:50.682508 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:50 crc kubenswrapper[4816]: E0311 11:59:50.783414 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:50 crc kubenswrapper[4816]: E0311 11:59:50.884323 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:50 crc kubenswrapper[4816]: E0311 11:59:50.984726 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:51 crc kubenswrapper[4816]: E0311 11:59:51.085609 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:51 crc kubenswrapper[4816]: E0311 11:59:51.185772 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:51 crc kubenswrapper[4816]: E0311 11:59:51.286420 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:51 crc kubenswrapper[4816]: E0311 11:59:51.386730 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:51 crc kubenswrapper[4816]: E0311 11:59:51.487585 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:51 crc kubenswrapper[4816]: E0311 11:59:51.587781 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:51 crc kubenswrapper[4816]: E0311 11:59:51.688968 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:51 crc kubenswrapper[4816]: E0311 11:59:51.789560 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:51 crc kubenswrapper[4816]: I0311 11:59:51.857895 4816 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 11 11:59:51 crc kubenswrapper[4816]: E0311 11:59:51.890188 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:51 crc kubenswrapper[4816]: E0311 11:59:51.990841 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:52 crc kubenswrapper[4816]: E0311 11:59:52.091711 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:52 crc kubenswrapper[4816]: E0311 11:59:52.192206 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:52 crc kubenswrapper[4816]: E0311 11:59:52.292931 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:52 crc kubenswrapper[4816]: E0311 11:59:52.393449 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:52 crc kubenswrapper[4816]: E0311 11:59:52.493880 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:52 crc kubenswrapper[4816]: E0311 11:59:52.594820 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:52 crc kubenswrapper[4816]: E0311 11:59:52.695877 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:52 crc kubenswrapper[4816]: E0311 11:59:52.796968 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:52 crc kubenswrapper[4816]: E0311 11:59:52.898077 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:52 crc kubenswrapper[4816]: E0311 11:59:52.999291 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:53 crc kubenswrapper[4816]: E0311 11:59:53.099520 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:53 crc kubenswrapper[4816]: E0311 11:59:53.199721 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:53 crc kubenswrapper[4816]: E0311 11:59:53.299871 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:53 crc kubenswrapper[4816]: E0311 11:59:53.401127 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:53 crc kubenswrapper[4816]: E0311 11:59:53.501394 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:53 crc kubenswrapper[4816]: E0311 11:59:53.602273 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:53 crc kubenswrapper[4816]: E0311 11:59:53.703377 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:53 crc kubenswrapper[4816]: E0311 11:59:53.803986 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:53 crc kubenswrapper[4816]: E0311 11:59:53.905219 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:54 crc kubenswrapper[4816]: E0311 11:59:54.006438 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:54 crc kubenswrapper[4816]: E0311 11:59:54.034997 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Mar 11 11:59:54 crc kubenswrapper[4816]: I0311 11:59:54.041123 4816 kubelet_node_status.go:724] "Recording event message for
node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:59:54 crc kubenswrapper[4816]: I0311 11:59:54.041178 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:59:54 crc kubenswrapper[4816]: I0311 11:59:54.041204 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:59:54 crc kubenswrapper[4816]: I0311 11:59:54.041238 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 11:59:54 crc kubenswrapper[4816]: I0311 11:59:54.041307 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:54Z","lastTransitionTime":"2026-03-11T11:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 11:59:54 crc kubenswrapper[4816]: E0311 11:59:54.059330 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"91fc6571-6a6d-490b-83e1-c64cf62c773c\\\",\\\"systemUUID\\\":\\\"bbfa0147-7ad8-4a96-81ed-304e5bc4397b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 11 11:59:54 crc kubenswrapper[4816]: I0311 11:59:54.065228 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:59:54 crc kubenswrapper[4816]: I0311 11:59:54.065284 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:59:54 crc kubenswrapper[4816]: I0311 11:59:54.065307 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:59:54 crc kubenswrapper[4816]: I0311 11:59:54.065331 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 11:59:54 crc kubenswrapper[4816]: I0311 11:59:54.065352 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:54Z","lastTransitionTime":"2026-03-11T11:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 11:59:54 crc kubenswrapper[4816]: E0311 11:59:54.082306 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"91fc6571-6a6d-490b-83e1-c64cf62c773c\\\",\\\"systemUUID\\\":\\\"bbfa0147-7ad8-4a96-81ed-304e5bc4397b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 11:59:54 crc kubenswrapper[4816]: I0311 11:59:54.087185 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:54 crc kubenswrapper[4816]: I0311 11:59:54.087230 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:54 crc kubenswrapper[4816]: I0311 11:59:54.087249 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:54 crc kubenswrapper[4816]: I0311 11:59:54.087292 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:54 crc kubenswrapper[4816]: I0311 11:59:54.087311 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:54Z","lastTransitionTime":"2026-03-11T11:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 11:59:54 crc kubenswrapper[4816]: E0311 11:59:54.103474 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"91fc6571-6a6d-490b-83e1-c64cf62c773c\\\",\\\"systemUUID\\\":\\\"bbfa0147-7ad8-4a96-81ed-304e5bc4397b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 11:59:54 crc kubenswrapper[4816]: I0311 11:59:54.107697 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:54 crc kubenswrapper[4816]: I0311 11:59:54.107732 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:54 crc kubenswrapper[4816]: I0311 11:59:54.107743 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:54 crc kubenswrapper[4816]: I0311 11:59:54.107760 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:54 crc kubenswrapper[4816]: I0311 11:59:54.107773 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:54Z","lastTransitionTime":"2026-03-11T11:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 11:59:54 crc kubenswrapper[4816]: E0311 11:59:54.124296 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"91fc6571-6a6d-490b-83e1-c64cf62c773c\\\",\\\"systemUUID\\\":\\\"bbfa0147-7ad8-4a96-81ed-304e5bc4397b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 11:59:54 crc kubenswrapper[4816]: E0311 11:59:54.124558 4816 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 11 11:59:54 crc kubenswrapper[4816]: E0311 11:59:54.124610 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:54 crc kubenswrapper[4816]: E0311 11:59:54.223486 4816 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 11 11:59:54 crc kubenswrapper[4816]: E0311 11:59:54.225677 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:54 crc kubenswrapper[4816]: E0311 11:59:54.326450 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:54 crc kubenswrapper[4816]: E0311 11:59:54.426550 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:54 crc kubenswrapper[4816]: E0311 11:59:54.526717 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:54 crc kubenswrapper[4816]: E0311 11:59:54.626939 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:54 crc kubenswrapper[4816]: E0311 11:59:54.728122 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:54 crc kubenswrapper[4816]: E0311 11:59:54.828914 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:54 crc kubenswrapper[4816]: 
E0311 11:59:54.929928 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:55 crc kubenswrapper[4816]: E0311 11:59:55.030216 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:55 crc kubenswrapper[4816]: E0311 11:59:55.130914 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:55 crc kubenswrapper[4816]: E0311 11:59:55.232156 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:55 crc kubenswrapper[4816]: E0311 11:59:55.332378 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:55 crc kubenswrapper[4816]: E0311 11:59:55.432564 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:55 crc kubenswrapper[4816]: E0311 11:59:55.533597 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:55 crc kubenswrapper[4816]: E0311 11:59:55.633754 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:55 crc kubenswrapper[4816]: E0311 11:59:55.734325 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:55 crc kubenswrapper[4816]: E0311 11:59:55.835456 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:55 crc kubenswrapper[4816]: E0311 11:59:55.936016 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:56 crc kubenswrapper[4816]: E0311 11:59:56.036322 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:56 crc kubenswrapper[4816]: E0311 11:59:56.137194 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:56 crc kubenswrapper[4816]: E0311 11:59:56.238118 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.273780 4816 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.340943 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.340977 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.340987 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.341003 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.341011 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:56Z","lastTransitionTime":"2026-03-11T11:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.443914 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.443967 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.443978 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.443995 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.444007 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:56Z","lastTransitionTime":"2026-03-11T11:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.545703 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.545739 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.545749 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.545762 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.545771 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:56Z","lastTransitionTime":"2026-03-11T11:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.648286 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.648324 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.648332 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.648345 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.648355 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:56Z","lastTransitionTime":"2026-03-11T11:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.751907 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.751948 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.751957 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.751973 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.751983 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:56Z","lastTransitionTime":"2026-03-11T11:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.855614 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.855735 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.855759 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.855783 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.855803 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:56Z","lastTransitionTime":"2026-03-11T11:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.959054 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.959118 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.959136 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.959160 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.959176 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:56Z","lastTransitionTime":"2026-03-11T11:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.062273 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.062356 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.062387 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.062417 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.062441 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:57Z","lastTransitionTime":"2026-03-11T11:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.079332 4816 apiserver.go:52] "Watching apiserver"
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.087172 4816 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.087699 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c"]
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.088433 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.088585 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.088800 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.088892 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.088963 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.089768 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.089789 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.090040 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.090235 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.093212 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.093372 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.093397 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.093520 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.093540 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.094241 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.094306 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.094696 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.096584 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.135343 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.151603 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.164718 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused"
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.165472 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.165511 4816 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.165523 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.165656 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.165682 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.165699 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:57Z","lastTransitionTime":"2026-03-11T11:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.175980 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.176037 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.176073 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.176103 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.176140 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.176201 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.176290 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.176341 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.176388 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.176432 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.176483 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.176530 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.176554 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.176573 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.176617 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.176631 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.176663 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.176715 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.176759 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.176803 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.176849 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.176856 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.176903 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.176950 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.176998 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.177041 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.177087 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.177134 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.177161 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.177179 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.177224 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.177307 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 11 
11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.177327 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.177353 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.177422 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.177359 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.177490 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.177511 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.177531 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.177442 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.177547 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.177637 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.177664 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.177719 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.177693 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.178075 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.179147 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.179203 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.179413 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.179494 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.179757 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.179902 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.178395 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.178747 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.178882 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.178926 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.178946 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.179138 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.179633 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.179784 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.179910 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.179937 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.180208 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.180232 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.180270 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.180289 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.180309 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.180305 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.180325 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.180498 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.180560 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.180611 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod 
\"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.180665 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.180715 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.180769 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.180818 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.180873 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.180921 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.180923 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.180971 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.181027 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.180392 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") 
pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.181079 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.181135 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.181187 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.181232 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.181324 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 11 11:59:57 crc 
kubenswrapper[4816]: I0311 11:59:57.181378 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.181436 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.181485 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.181536 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.181585 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.181638 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.181687 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.181733 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.181782 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.181829 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.181874 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 
11:59:57.181925 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.181971 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.182021 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.182443 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.182498 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.182551 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: 
\"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.182610 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.182659 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.182706 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.182754 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.182872 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.182945 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.183001 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.183057 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.183106 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.183154 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.183203 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.183286 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.183341 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.183390 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.183439 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.183489 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 
11:59:57.183580 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.183638 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.183690 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.183742 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.183792 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.183843 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: 
\"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.183891 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.183939 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.183991 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.184043 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.184092 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 11 
11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.184139 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.184187 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.184246 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.184333 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.184381 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.184437 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.184486 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.184533 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.184585 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.184637 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.184689 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 11 11:59:57 
crc kubenswrapper[4816]: I0311 11:59:57.184740 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.184790 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.184844 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.184892 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.184948 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.184998 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: 
\"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.185050 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.185103 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.185155 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.185207 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.185300 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod 
\"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.185362 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.185416 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.185464 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.185515 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.185563 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.185613 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.185666 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.185718 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.185768 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.185821 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.185873 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.185930 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.185982 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.186033 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.186083 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.186336 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.186396 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.186446 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.186499 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.186553 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.186604 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.188165 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: 
\"44663579-783b-4372-86d6-acf235a62d72\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.188226 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.188342 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.188396 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.188458 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.188520 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.188575 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.188625 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.188695 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.188746 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.188800 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.188857 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: 
\"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.188914 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.180577 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.180573 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.180846 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.181069 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.181413 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.181449 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.181708 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.181722 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.189093 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.182111 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.182230 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.182298 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.182451 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.183018 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.183718 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.186949 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.187151 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.187752 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.187828 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.188025 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.188382 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.188836 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.188936 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.189525 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.189751 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.189646 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.188968 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.191482 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192129 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192160 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192270 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192298 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" 
(UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192319 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192339 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192361 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192394 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192416 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" 
(UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192435 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192456 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192475 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192497 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192516 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192536 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: 
\"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192553 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192572 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192592 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192614 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192638 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 11 
11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192662 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192684 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192709 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192733 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192756 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192785 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192804 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192822 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192867 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192892 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192912 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192933 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192953 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192972 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192994 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193018 4816 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193036 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193056 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193082 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193106 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193146 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193166 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193233 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193249 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193290 4816 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193302 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" 
DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193313 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193324 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193336 4816 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193345 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193355 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193365 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193374 4816 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193384 4816 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193393 4816 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193403 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193413 4816 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193423 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193433 4816 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193443 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193452 4816 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193462 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193471 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193484 4816 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193494 4816 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193505 4816 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193515 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193524 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc 
kubenswrapper[4816]: I0311 11:59:57.193534 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193543 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193553 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193563 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193572 4816 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193581 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193593 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 
11:59:57.193603 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193613 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193622 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193632 4816 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193641 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193651 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193661 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193670 4816 reconciler_common.go:293] "Volume detached for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193681 4816 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193692 4816 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193702 4816 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193711 4816 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193721 4816 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193730 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.191398 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" 
(OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193764 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192022 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192052 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192646 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192727 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193823 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193126 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.194407 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.194583 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.194724 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.194806 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 11:59:57.694769863 +0000 UTC m=+84.286033930 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.195115 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.195245 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.196613 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.194817 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.194833 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.194867 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.200511 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.200735 4816 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.200829 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 11:59:57.700802036 +0000 UTC m=+84.292066123 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.200889 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.201819 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.201985 4816 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.202057 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 11:59:57.702034541 +0000 UTC m=+84.293298518 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.202235 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.202425 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.202570 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.202949 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.211568 4816 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.213948 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.214487 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.214543 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.216622 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.216757 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.217036 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.217851 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.219229 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.219313 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.219341 4816 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.219598 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-11 11:59:57.719561663 +0000 UTC m=+84.310825690 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.222033 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.222239 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.222302 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.222322 4816 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.222393 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-03-11 11:59:57.722372373 +0000 UTC m=+84.313636380 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.223392 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.224146 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.224644 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.224687 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.224733 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.225305 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.225560 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.225706 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.226104 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.226613 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.228517 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.229869 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.230707 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.230978 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.230985 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.231472 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.231823 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.232065 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.232484 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.232660 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.232794 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.233213 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.233600 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.234665 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.236177 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.236586 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.237583 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.237753 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.238386 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.239320 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.239347 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.239563 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.239824 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.240105 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.240849 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.240908 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.240859 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.242630 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.242855 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.243180 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.243198 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.243711 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.244926 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.245813 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.246504 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.246555 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.247402 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.247671 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.247703 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.248005 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.248019 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.248413 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.248444 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.248772 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.249187 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.249193 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.248931 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.249582 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.249682 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.250772 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.250818 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.251604 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.252118 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.252462 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.252688 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.252718 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.252912 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.252921 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.253587 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.253723 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.253958 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.254478 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.254575 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.254903 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.254899 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.255589 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.254389 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.256500 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.256773 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.256881 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.257072 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.257145 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.257290 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.257319 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.257362 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.257435 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.257493 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.257579 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.257602 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.258003 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.258223 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.259150 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.259453 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.259779 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.259785 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.259905 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.259906 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.259923 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.260250 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.260308 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.260401 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.260508 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.260965 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.260512 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.260832 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.260853 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.260865 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.260908 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.261313 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.261563 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.261633 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.261790 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.262151 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.262166 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.262575 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.262617 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.262632 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.262730 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.262860 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.262949 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.262962 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.270048 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.270512 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.270558 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.270568 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.270586 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.270603 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:57Z","lastTransitionTime":"2026-03-11T11:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.275911 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.283926 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.284238 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295022 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295090 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295164 4816 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295177 4816 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295189 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295198 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: 
I0311 11:59:57.295208 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295217 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295226 4816 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295235 4816 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295261 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295271 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295314 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc 
kubenswrapper[4816]: I0311 11:59:57.295325 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295335 4816 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295347 4816 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295358 4816 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295370 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295378 4816 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295387 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295396 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295405 4816 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295414 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295424 4816 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295434 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295442 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295452 4816 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295462 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 
11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295472 4816 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295482 4816 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295491 4816 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295501 4816 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295511 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295520 4816 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295529 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295538 4816 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295547 4816 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295555 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295565 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295574 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295584 4816 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295592 4816 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295601 4816 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295611 4816 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295619 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295629 4816 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295638 4816 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295665 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295676 4816 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295687 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" 
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295696 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295706 4816 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295715 4816 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295724 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295734 4816 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295742 4816 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295750 4816 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 
11:59:57.295759 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295767 4816 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295777 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295786 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295796 4816 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295805 4816 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295816 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295827 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") 
on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295839 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295851 4816 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295859 4816 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295868 4816 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295876 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295887 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295926 4816 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: 
I0311 11:59:57.295935 4816 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295944 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295952 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295961 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295973 4816 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296002 4816 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296016 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296029 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296042 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296055 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296206 4816 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296226 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296237 4816 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296276 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296289 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" 
DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296301 4816 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296312 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296324 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296355 4816 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296368 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296382 4816 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296376 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296394 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296460 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296515 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296541 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296554 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296567 4816 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296613 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296631 4816 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296650 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296709 4816 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296726 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296773 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296797 4816 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296813 4816 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" 
DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296861 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296879 4816 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296895 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296942 4816 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296960 4816 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296649 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296976 4816 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297086 4816 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297112 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297132 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297154 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296459 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297173 4816 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath 
\"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297191 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297209 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297225 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297242 4816 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297313 4816 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297331 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297351 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297370 4816 
reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297389 4816 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297407 4816 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297425 4816 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297443 4816 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297462 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297479 4816 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297496 4816 reconciler_common.go:293] "Volume detached for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297513 4816 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297530 4816 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297548 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297565 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297582 4816 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297600 4816 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297617 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: 
\"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297634 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297652 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297671 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297691 4816 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297707 4816 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297728 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297750 4816 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node 
\"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297773 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.373646 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.373686 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.373698 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.373714 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.373725 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:57Z","lastTransitionTime":"2026-03-11T11:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.398819 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.412082 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.425421 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.431194 4816 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 11 11:59:57 crc kubenswrapper[4816]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 11 11:59:57 crc kubenswrapper[4816]: set -o allexport Mar 11 11:59:57 crc kubenswrapper[4816]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 11 11:59:57 crc kubenswrapper[4816]: source /etc/kubernetes/apiserver-url.env Mar 11 11:59:57 crc kubenswrapper[4816]: else Mar 11 11:59:57 crc kubenswrapper[4816]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 11 11:59:57 crc kubenswrapper[4816]: exit 1 Mar 11 11:59:57 crc kubenswrapper[4816]: fi Mar 11 11:59:57 crc kubenswrapper[4816]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 11 11:59:57 crc kubenswrapper[4816]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 11 11:59:57 crc kubenswrapper[4816]: > logger="UnhandledError" Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.432396 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.438124 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 11 11:59:57 crc kubenswrapper[4816]: W0311 11:59:57.439738 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-0ae7d384c6c6aee635dc8d8b14f1b73fb77069547d903b12453a02c7ef21aa57 WatchSource:0}: Error finding container 0ae7d384c6c6aee635dc8d8b14f1b73fb77069547d903b12453a02c7ef21aa57: Status 404 returned error can't find the container with id 0ae7d384c6c6aee635dc8d8b14f1b73fb77069547d903b12453a02c7ef21aa57 Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.443582 4816 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 11 11:59:57 crc kubenswrapper[4816]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 11 11:59:57 crc kubenswrapper[4816]: if [[ -f "/env/_master" ]]; then Mar 11 11:59:57 crc kubenswrapper[4816]: set -o allexport Mar 11 11:59:57 crc kubenswrapper[4816]: source "/env/_master" Mar 11 11:59:57 crc kubenswrapper[4816]: set +o allexport Mar 11 11:59:57 crc kubenswrapper[4816]: fi Mar 11 11:59:57 crc kubenswrapper[4816]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 11 11:59:57 crc kubenswrapper[4816]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 11 11:59:57 crc kubenswrapper[4816]: ho_enable="--enable-hybrid-overlay" Mar 11 11:59:57 crc kubenswrapper[4816]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 11 11:59:57 crc kubenswrapper[4816]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 11 11:59:57 crc kubenswrapper[4816]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 11 11:59:57 crc kubenswrapper[4816]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 11 11:59:57 crc kubenswrapper[4816]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 11 11:59:57 crc kubenswrapper[4816]: --webhook-host=127.0.0.1 \ Mar 11 11:59:57 crc kubenswrapper[4816]: --webhook-port=9743 \ Mar 11 11:59:57 crc kubenswrapper[4816]: ${ho_enable} \ Mar 11 11:59:57 crc kubenswrapper[4816]: --enable-interconnect \ Mar 11 11:59:57 crc kubenswrapper[4816]: --disable-approver \ Mar 11 11:59:57 crc kubenswrapper[4816]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 11 11:59:57 crc kubenswrapper[4816]: --wait-for-kubernetes-api=200s \ Mar 11 11:59:57 crc kubenswrapper[4816]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 11 11:59:57 crc kubenswrapper[4816]: --loglevel="${LOGLEVEL}" Mar 11 11:59:57 crc kubenswrapper[4816]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 11 11:59:57 crc kubenswrapper[4816]: > logger="UnhandledError" Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.446861 4816 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 11 11:59:57 crc kubenswrapper[4816]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 11 11:59:57 crc 
kubenswrapper[4816]: if [[ -f "/env/_master" ]]; then Mar 11 11:59:57 crc kubenswrapper[4816]: set -o allexport Mar 11 11:59:57 crc kubenswrapper[4816]: source "/env/_master" Mar 11 11:59:57 crc kubenswrapper[4816]: set +o allexport Mar 11 11:59:57 crc kubenswrapper[4816]: fi Mar 11 11:59:57 crc kubenswrapper[4816]: Mar 11 11:59:57 crc kubenswrapper[4816]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 11 11:59:57 crc kubenswrapper[4816]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 11 11:59:57 crc kubenswrapper[4816]: --disable-webhook \ Mar 11 11:59:57 crc kubenswrapper[4816]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 11 11:59:57 crc kubenswrapper[4816]: --loglevel="${LOGLEVEL}" Mar 11 11:59:57 crc kubenswrapper[4816]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 11 11:59:57 crc kubenswrapper[4816]: > logger="UnhandledError" Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.448077 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 11 11:59:57 crc kubenswrapper[4816]: W0311 11:59:57.448651 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-92a1a9e28f924bb0f7181cc7ecce3df3c84d39c2f581ac287f4f874561bc443c WatchSource:0}: Error finding container 92a1a9e28f924bb0f7181cc7ecce3df3c84d39c2f581ac287f4f874561bc443c: Status 404 returned error can't find the container with id 92a1a9e28f924bb0f7181cc7ecce3df3c84d39c2f581ac287f4f874561bc443c Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.450584 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.452272 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.459919 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"6ff8b55d8351f8419526fa7b016e8b28378459404a831b8bc005d60f9b1785d3"} Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.460897 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"92a1a9e28f924bb0f7181cc7ecce3df3c84d39c2f581ac287f4f874561bc443c"} Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.461103 4816 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 11 11:59:57 crc kubenswrapper[4816]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 11 11:59:57 crc kubenswrapper[4816]: set -o allexport Mar 11 11:59:57 crc kubenswrapper[4816]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 11 11:59:57 crc kubenswrapper[4816]: source /etc/kubernetes/apiserver-url.env Mar 11 11:59:57 crc kubenswrapper[4816]: else Mar 11 11:59:57 crc kubenswrapper[4816]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 11 11:59:57 crc kubenswrapper[4816]: exit 1 Mar 11 11:59:57 crc kubenswrapper[4816]: fi Mar 11 11:59:57 crc kubenswrapper[4816]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 11 11:59:57 crc kubenswrapper[4816]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 11 11:59:57 crc kubenswrapper[4816]: > logger="UnhandledError" Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.461800 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services 
have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.462213 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.463295 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.464132 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"0ae7d384c6c6aee635dc8d8b14f1b73fb77069547d903b12453a02c7ef21aa57"} Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.466327 4816 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 11 11:59:57 crc kubenswrapper[4816]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 11 11:59:57 crc kubenswrapper[4816]: if [[ -f "/env/_master" ]]; then Mar 11 11:59:57 crc kubenswrapper[4816]: set -o allexport Mar 11 11:59:57 crc kubenswrapper[4816]: source "/env/_master" Mar 11 11:59:57 crc kubenswrapper[4816]: set +o allexport Mar 11 11:59:57 crc kubenswrapper[4816]: fi Mar 11 11:59:57 crc kubenswrapper[4816]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not 
enabled. Mar 11 11:59:57 crc kubenswrapper[4816]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 11 11:59:57 crc kubenswrapper[4816]: ho_enable="--enable-hybrid-overlay" Mar 11 11:59:57 crc kubenswrapper[4816]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 11 11:59:57 crc kubenswrapper[4816]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 11 11:59:57 crc kubenswrapper[4816]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 11 11:59:57 crc kubenswrapper[4816]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 11 11:59:57 crc kubenswrapper[4816]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 11 11:59:57 crc kubenswrapper[4816]: --webhook-host=127.0.0.1 \ Mar 11 11:59:57 crc kubenswrapper[4816]: --webhook-port=9743 \ Mar 11 11:59:57 crc kubenswrapper[4816]: ${ho_enable} \ Mar 11 11:59:57 crc kubenswrapper[4816]: --enable-interconnect \ Mar 11 11:59:57 crc kubenswrapper[4816]: --disable-approver \ Mar 11 11:59:57 crc kubenswrapper[4816]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 11 11:59:57 crc kubenswrapper[4816]: --wait-for-kubernetes-api=200s \ Mar 11 11:59:57 crc kubenswrapper[4816]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 11 11:59:57 crc kubenswrapper[4816]: --loglevel="${LOGLEVEL}" Mar 11 11:59:57 crc kubenswrapper[4816]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m 
DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 11 11:59:57 crc kubenswrapper[4816]: > logger="UnhandledError" Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.468116 4816 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 11 11:59:57 crc kubenswrapper[4816]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe 
Mar 11 11:59:57 crc kubenswrapper[4816]: if [[ -f "/env/_master" ]]; then Mar 11 11:59:57 crc kubenswrapper[4816]: set -o allexport Mar 11 11:59:57 crc kubenswrapper[4816]: source "/env/_master" Mar 11 11:59:57 crc kubenswrapper[4816]: set +o allexport Mar 11 11:59:57 crc kubenswrapper[4816]: fi Mar 11 11:59:57 crc kubenswrapper[4816]: Mar 11 11:59:57 crc kubenswrapper[4816]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 11 11:59:57 crc kubenswrapper[4816]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 11 11:59:57 crc kubenswrapper[4816]: --disable-webhook \ Mar 11 11:59:57 crc kubenswrapper[4816]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 11 11:59:57 crc kubenswrapper[4816]: --loglevel="${LOGLEVEL}" Mar 11 11:59:57 crc kubenswrapper[4816]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 11 11:59:57 crc kubenswrapper[4816]: > logger="UnhandledError" Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.469239 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.473760 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.475603 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.475658 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.475677 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.475704 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.475724 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:57Z","lastTransitionTime":"2026-03-11T11:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.484461 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.494523 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.506027 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.520192 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.528570 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.537506 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.544777 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.552918 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.561567 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.568809 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.576293 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.577497 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.577519 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.577527 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 
11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.577540 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.577566 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:57Z","lastTransitionTime":"2026-03-11T11:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.679516 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.679550 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.679558 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.679571 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.679582 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:57Z","lastTransitionTime":"2026-03-11T11:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.701343 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.701427 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.701521 4816 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.701604 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 11:59:58.701572465 +0000 UTC m=+85.292836442 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.701674 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 11:59:58.701659957 +0000 UTC m=+85.292923924 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.782220 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.782280 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.782289 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.782301 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.782310 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:57Z","lastTransitionTime":"2026-03-11T11:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.802768 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.802828 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.802849 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.802926 4816 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.802996 4816 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 11:59:58.802983127 +0000 UTC m=+85.394247084 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.802946 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.803053 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.803063 4816 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.803115 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.803179 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 
11:59:57.803205 4816 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.803112 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-11 11:59:58.8030947 +0000 UTC m=+85.394358667 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.803349 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-11 11:59:58.803315756 +0000 UTC m=+85.394579763 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.884599 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.884643 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.884651 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.884664 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.884672 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:57Z","lastTransitionTime":"2026-03-11T11:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.987379 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.987423 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.987435 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.987452 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.987466 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:57Z","lastTransitionTime":"2026-03-11T11:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.090412 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.090470 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.090491 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.090559 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.090577 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:58Z","lastTransitionTime":"2026-03-11T11:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.134798 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.135396 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.136757 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.137359 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.138264 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.138721 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.139329 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.140246 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" 
path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.141023 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.141990 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.142467 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.143643 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.144181 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.144757 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.145625 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.146110 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.147043 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.147447 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.147980 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.149068 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.149506 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.150411 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.150868 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.151870 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" 
path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.152359 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.152917 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.153927 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.154386 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.155316 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.155726 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.156529 4816 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.156624 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.158213 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.159053 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.159451 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.160847 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.161592 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.162394 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.162973 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.163944 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.164384 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.165285 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.165839 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.166811 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.167270 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.168091 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.168591 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.169611 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.170071 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.170952 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.171415 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.172218 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.172751 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.173176 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.192537 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.192677 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:58 crc 
kubenswrapper[4816]: I0311 11:59:58.192758 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.192860 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.192946 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:58Z","lastTransitionTime":"2026-03-11T11:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.296145 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.296395 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.296510 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.296604 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.296680 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:58Z","lastTransitionTime":"2026-03-11T11:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.399473 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.399523 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.399535 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.399555 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.399569 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:58Z","lastTransitionTime":"2026-03-11T11:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.502549 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.502586 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.502597 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.502613 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.502624 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:58Z","lastTransitionTime":"2026-03-11T11:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.606453 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.606522 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.606551 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.606581 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.606602 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:58Z","lastTransitionTime":"2026-03-11T11:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.710104 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.710165 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.710186 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.710213 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.710234 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:58Z","lastTransitionTime":"2026-03-11T11:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.713375 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 11:59:58 crc kubenswrapper[4816]: E0311 11:59:58.713521 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-11 12:00:00.71349412 +0000 UTC m=+87.304758127 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.713577 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 11:59:58 crc kubenswrapper[4816]: E0311 11:59:58.713742 4816 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 11:59:58 crc kubenswrapper[4816]: E0311 11:59:58.713809 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 12:00:00.713795079 +0000 UTC m=+87.305059076 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.813041 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.813116 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.813153 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.813185 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.813212 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:58Z","lastTransitionTime":"2026-03-11T11:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.814039 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.814131 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.814187 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 11:59:58 crc kubenswrapper[4816]: E0311 11:59:58.814330 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 11:59:58 crc kubenswrapper[4816]: E0311 11:59:58.814352 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 11:59:58 crc kubenswrapper[4816]: E0311 11:59:58.814361 4816 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Mar 11 11:59:58 crc kubenswrapper[4816]: E0311 11:59:58.814394 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 11:59:58 crc kubenswrapper[4816]: E0311 11:59:58.814417 4816 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 11:59:58 crc kubenswrapper[4816]: E0311 11:59:58.814455 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 12:00:00.814427728 +0000 UTC m=+87.405691735 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 11:59:58 crc kubenswrapper[4816]: E0311 11:59:58.814373 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 11:59:58 crc kubenswrapper[4816]: E0311 11:59:58.814493 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-03-11 12:00:00.81446882 +0000 UTC m=+87.405732817 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 11:59:58 crc kubenswrapper[4816]: E0311 11:59:58.814500 4816 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 11:59:58 crc kubenswrapper[4816]: E0311 11:59:58.814582 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-11 12:00:00.814562312 +0000 UTC m=+87.405826319 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.915643 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.915693 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.915709 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.915731 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.915749 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:58Z","lastTransitionTime":"2026-03-11T11:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.018218 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.018330 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.018354 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.018386 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.018409 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:59Z","lastTransitionTime":"2026-03-11T11:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.120703 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.120742 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.120752 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.120769 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.120779 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:59Z","lastTransitionTime":"2026-03-11T11:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.129929 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.130041 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.129994 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 11:59:59 crc kubenswrapper[4816]: E0311 11:59:59.130598 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 11:59:59 crc kubenswrapper[4816]: E0311 11:59:59.130868 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 11:59:59 crc kubenswrapper[4816]: E0311 11:59:59.130971 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.153162 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.154582 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.158799 4816 scope.go:117] "RemoveContainer" containerID="eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd" Mar 11 11:59:59 crc kubenswrapper[4816]: E0311 11:59:59.160175 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.223058 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.223141 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.223166 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.223196 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.223219 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:59Z","lastTransitionTime":"2026-03-11T11:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.326479 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.326518 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.326530 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.326547 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.326558 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:59Z","lastTransitionTime":"2026-03-11T11:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.429647 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.429715 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.429731 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.429754 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.429772 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:59Z","lastTransitionTime":"2026-03-11T11:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.469865 4816 scope.go:117] "RemoveContainer" containerID="eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd" Mar 11 11:59:59 crc kubenswrapper[4816]: E0311 11:59:59.470067 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.531577 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.531658 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.531680 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.531711 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.531735 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:59Z","lastTransitionTime":"2026-03-11T11:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.634164 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.634472 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.634559 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.634632 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.634695 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:59Z","lastTransitionTime":"2026-03-11T11:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.738433 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.738481 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.738495 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.738516 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.738531 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:59Z","lastTransitionTime":"2026-03-11T11:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.841399 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.841483 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.841508 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.841543 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.841568 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:59Z","lastTransitionTime":"2026-03-11T11:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.944490 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.944532 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.944544 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.944562 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.944573 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:59Z","lastTransitionTime":"2026-03-11T11:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.047768 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.047884 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.047906 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.047930 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.047948 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:00Z","lastTransitionTime":"2026-03-11T12:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.151561 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.151619 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.151636 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.151661 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.151677 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:00Z","lastTransitionTime":"2026-03-11T12:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.258799 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.258874 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.258888 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.258905 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.259442 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:00Z","lastTransitionTime":"2026-03-11T12:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.362418 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.362458 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.362471 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.362528 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.362542 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:00Z","lastTransitionTime":"2026-03-11T12:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.464796 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.464844 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.464861 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.464879 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.464889 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:00Z","lastTransitionTime":"2026-03-11T12:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.568226 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.568313 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.568330 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.568353 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.568375 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:00Z","lastTransitionTime":"2026-03-11T12:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.671178 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.671279 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.671299 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.671334 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.671354 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:00Z","lastTransitionTime":"2026-03-11T12:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.735552 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.735673 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 12:00:00 crc kubenswrapper[4816]: E0311 12:00:00.735850 4816 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 12:00:00 crc kubenswrapper[4816]: E0311 12:00:00.735969 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:04.735924319 +0000 UTC m=+91.327188326 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:00 crc kubenswrapper[4816]: E0311 12:00:00.736061 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 12:00:04.736036852 +0000 UTC m=+91.327300859 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.773709 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.773754 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.773764 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.773780 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.773794 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:00Z","lastTransitionTime":"2026-03-11T12:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.838653 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.838784 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.838853 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 12:00:00 crc kubenswrapper[4816]: E0311 12:00:00.838988 4816 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 12:00:00 crc kubenswrapper[4816]: E0311 12:00:00.839028 4816 projected.go:288] 
Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 12:00:00 crc kubenswrapper[4816]: E0311 12:00:00.839033 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 12:00:00 crc kubenswrapper[4816]: E0311 12:00:00.839059 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 12:00:00 crc kubenswrapper[4816]: E0311 12:00:00.839087 4816 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 12:00:00 crc kubenswrapper[4816]: E0311 12:00:00.839087 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 12:00:00 crc kubenswrapper[4816]: E0311 12:00:00.839107 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 12:00:04.839080171 +0000 UTC m=+91.430344248 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 12:00:00 crc kubenswrapper[4816]: E0311 12:00:00.839122 4816 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 12:00:00 crc kubenswrapper[4816]: E0311 12:00:00.839158 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-11 12:00:04.839135462 +0000 UTC m=+91.430399459 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 12:00:00 crc kubenswrapper[4816]: E0311 12:00:00.839195 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-11 12:00:04.839178513 +0000 UTC m=+91.430442620 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.877852 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.877909 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.877918 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.877934 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.877943 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:00Z","lastTransitionTime":"2026-03-11T12:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.979919 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.979993 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.980016 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.980063 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.980091 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:00Z","lastTransitionTime":"2026-03-11T12:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.082836 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.082898 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.082917 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.082944 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.082962 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:01Z","lastTransitionTime":"2026-03-11T12:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.129966 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.130046 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 12:00:01 crc kubenswrapper[4816]: E0311 12:00:01.130113 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 12:00:01 crc kubenswrapper[4816]: E0311 12:00:01.130237 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.130343 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 12:00:01 crc kubenswrapper[4816]: E0311 12:00:01.130542 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.185907 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.185965 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.185990 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.186018 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.186037 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:01Z","lastTransitionTime":"2026-03-11T12:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.288906 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.288968 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.288987 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.289014 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.289033 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:01Z","lastTransitionTime":"2026-03-11T12:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.392394 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.392463 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.392481 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.392505 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.392523 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:01Z","lastTransitionTime":"2026-03-11T12:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.494361 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.494398 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.494408 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.494424 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.494435 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:01Z","lastTransitionTime":"2026-03-11T12:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.596826 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.596857 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.596869 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.596884 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.596896 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:01Z","lastTransitionTime":"2026-03-11T12:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.700368 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.700438 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.700466 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.700493 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.700513 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:01Z","lastTransitionTime":"2026-03-11T12:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.803734 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.803778 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.803787 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.803802 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.803811 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:01Z","lastTransitionTime":"2026-03-11T12:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.906610 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.906688 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.906712 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.906743 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.906763 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:01Z","lastTransitionTime":"2026-03-11T12:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.009321 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.009393 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.009415 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.009441 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.009460 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:02Z","lastTransitionTime":"2026-03-11T12:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.111621 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.111662 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.111676 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.111692 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.111703 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:02Z","lastTransitionTime":"2026-03-11T12:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.214517 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.214555 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.214564 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.214581 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.214590 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:02Z","lastTransitionTime":"2026-03-11T12:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.317355 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.317393 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.317402 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.317415 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.317426 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:02Z","lastTransitionTime":"2026-03-11T12:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.419108 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.419166 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.419183 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.419206 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.419226 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:02Z","lastTransitionTime":"2026-03-11T12:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.521153 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.521220 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.521240 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.521294 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.521314 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:02Z","lastTransitionTime":"2026-03-11T12:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.624333 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.624393 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.624410 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.624438 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.624456 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:02Z","lastTransitionTime":"2026-03-11T12:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.727490 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.727548 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.727556 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.727569 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.727636 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:02Z","lastTransitionTime":"2026-03-11T12:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.831015 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.831095 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.831120 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.831152 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.831173 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:02Z","lastTransitionTime":"2026-03-11T12:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.860191 4816 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.933797 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.933859 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.933876 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.933899 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.933916 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:02Z","lastTransitionTime":"2026-03-11T12:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.036651 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.036686 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.036694 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.036709 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.036747 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:03Z","lastTransitionTime":"2026-03-11T12:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.130333 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.130410 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.130411 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 12:00:03 crc kubenswrapper[4816]: E0311 12:00:03.130660 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 12:00:03 crc kubenswrapper[4816]: E0311 12:00:03.130772 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 12:00:03 crc kubenswrapper[4816]: E0311 12:00:03.130834 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.139478 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.139536 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.139556 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.139580 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.139597 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:03Z","lastTransitionTime":"2026-03-11T12:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.142053 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.241513 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.241559 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.241576 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.241598 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.241614 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:03Z","lastTransitionTime":"2026-03-11T12:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.344331 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.344389 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.344412 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.344441 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.344464 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:03Z","lastTransitionTime":"2026-03-11T12:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.447216 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.447333 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.447977 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.448023 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.448040 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:03Z","lastTransitionTime":"2026-03-11T12:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.549848 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.549891 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.549901 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.549916 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.549929 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:03Z","lastTransitionTime":"2026-03-11T12:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.652661 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.652705 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.652722 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.652747 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.652763 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:03Z","lastTransitionTime":"2026-03-11T12:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.755434 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.755509 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.755532 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.755560 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.755581 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:03Z","lastTransitionTime":"2026-03-11T12:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.858691 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.858743 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.858768 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.858796 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.858817 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:03Z","lastTransitionTime":"2026-03-11T12:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.962789 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.962852 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.962869 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.962897 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.962914 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:03Z","lastTransitionTime":"2026-03-11T12:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.066020 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.066080 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.066106 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.066137 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.066159 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:04Z","lastTransitionTime":"2026-03-11T12:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.144483 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.144588 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.144608 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.144637 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.144656 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:04Z","lastTransitionTime":"2026-03-11T12:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.150861 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:04 crc kubenswrapper[4816]: E0311 12:00:04.160673 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"91fc6571-6a6d-490b-83e1-c64cf62c773c\\\",\\\"systemUUID\\\":\\\"bbfa0147-7ad8-4a96-81ed-304e5bc4397b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.166741 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.166814 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.166830 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.166855 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.167060 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:04Z","lastTransitionTime":"2026-03-11T12:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.168343 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:04 crc kubenswrapper[4816]: E0311 12:00:04.180984 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"91fc6571-6a6d-490b-83e1-c64cf62c773c\\\",\\\"systemUUID\\\":\\\"bbfa0147-7ad8-4a96-81ed-304e5bc4397b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.184102 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.185414 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.185480 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.185503 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 
12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.185531 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.185550 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:04Z","lastTransitionTime":"2026-03-11T12:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.202297 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8dcc321-50db-4df1-b303-54dd8f895f54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d63fd6507da13452b1667cf5fe5f86b72ef07a8c9a57ff43d74684276a8d0633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a349dc3366c99110d521b2e0c35464b1744fada92c1c29aa973b2f60e65cc5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6cfddc1aebbcb43615aafa5620bc9fa877464b85e77a3df5367a6e93c3aa066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768d90b24f08a77bfeb4a1c0540c799d91922b7e872b166c8be18799bd274aea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ee54943851f65b50e55ee3f0da95763307da8b51fefd2a8b83985ec43c4e15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/
var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f71025c079395f8d4addd4014b4f26f907160284b902b119be08da329eec418f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f71025c079395f8d4addd4014b4f26f907160284b902b119be08da329eec418f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fc48525cefc43b0355c1e3d3c24c807755e603797651810c54c8453cdc88da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fc48525cefc43b0355c1e3d3c24c807755e603797651810c54c8453cdc88da0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://036ecc68973976bcbb1f4df6a3558e9c2606a519d39fb654b86061e0ef78d5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://036ecc68973976bcbb1f4df6a3558e9c2606a519d39fb654b86061e0ef78d5a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:04 crc kubenswrapper[4816]: E0311 12:00:04.202466 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"91fc6571-6a6d-490b-83e1-c64cf62c773c\\\",\\\"systemUUID\\\":\\\"bbfa0147-7ad8-4a96-81ed-304e5bc4397b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.206835 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.206877 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.206888 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.206906 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.206920 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:04Z","lastTransitionTime":"2026-03-11T12:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.220417 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:04 crc kubenswrapper[4816]: E0311 12:00:04.221642 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"91fc6571-6a6d-490b-83e1-c64cf62c773c\\\",\\\"systemUUID\\\":\\\"bbfa0147-7ad8-4a96-81ed-304e5bc4397b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.226520 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.226583 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.226602 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.226633 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.226650 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:04Z","lastTransitionTime":"2026-03-11T12:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.234423 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d80c03e-d5ef-48a2-9464-5e64b20f225e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf21a7ab5fa8cc53f0b72a0a78d73e04bbd62a213f193124a0a00b4512b022c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df758ccff81c0ef52d954903c58a6d8e01ce1498d2f4a45057799fc584c70887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df758ccff81c0ef52d954903c58a6d8e01ce1498d2f4a45057799fc584c70887\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:04 crc kubenswrapper[4816]: E0311 12:00:04.249764 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"91fc6571-6a6d-490b-83e1-c64cf62c773c\\\",\\\"systemUUID\\\":\\\"bbfa0147-7ad8-4a96-81ed-304e5bc4397b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:04 crc kubenswrapper[4816]: E0311 12:00:04.250216 4816 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.251945 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.251982 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.251991 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.252006 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.252015 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:04Z","lastTransitionTime":"2026-03-11T12:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.262448 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8570a1-8304-4344-ac73-7346c594a222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2fca9f57b03035a1290e3686e7b98d15f9151ad5f5b811112ad882b47cb9e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5e6ee0da068d98d88f55efae8cb0cb12fe57c85e11f5638daaa5e0f8a1f8594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6180f737a5d60df3a71764fb2eaca26d3b25306cd8653d66d0b7fab4ec7debe3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T11:59:41Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 11:59:40.721716 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 11:59:40.721954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 11:59:40.723455 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-319748104/tls.crt::/tmp/serving-cert-319748104/tls.key\\\\\\\"\\\\nI0311 11:59:40.983979 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 11:59:40.987746 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 11:59:40.987769 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 11:59:40.987794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 11:59:40.987801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 11:59:40.995483 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 11:59:40.995533 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 11:59:40.995543 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 11:59:40.995556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0311 11:59:40.995552 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 11:59:40.995565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 11:59:40.995592 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 11:59:40.995600 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 11:59:40.996695 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T11:59:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04cdf2254cd3d070567bec1a9b10d6ffff3f5da5056b637b7d006f4ded72e56\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://789a3fa60b21759f42c2997678010f994718ce5057a3a059491bc930652d3e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://789a3fa60b21759f42c2997678010f994718ce5057a3a059491bc930652d3e38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.274984 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.285243 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.355136 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:04 crc 
kubenswrapper[4816]: I0311 12:00:04.355177 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.355187 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.355204 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.355218 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:04Z","lastTransitionTime":"2026-03-11T12:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.457566 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.457598 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.457607 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.457622 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.457631 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:04Z","lastTransitionTime":"2026-03-11T12:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.559974 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.560005 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.560015 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.560029 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.560038 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:04Z","lastTransitionTime":"2026-03-11T12:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.663498 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.663561 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.663580 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.663609 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.663632 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:04Z","lastTransitionTime":"2026-03-11T12:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.766453 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.766491 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.766499 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.766514 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.766523 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:04Z","lastTransitionTime":"2026-03-11T12:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.778143 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.778305 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 12:00:04 crc kubenswrapper[4816]: E0311 12:00:04.778477 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:12.77842292 +0000 UTC m=+99.369686927 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:04 crc kubenswrapper[4816]: E0311 12:00:04.778535 4816 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 12:00:04 crc kubenswrapper[4816]: E0311 12:00:04.778682 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 12:00:12.778647766 +0000 UTC m=+99.369911773 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.870845 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.870925 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.870949 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.870982 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.871002 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:04Z","lastTransitionTime":"2026-03-11T12:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.879397 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.879483 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.879522 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 12:00:04 crc kubenswrapper[4816]: E0311 12:00:04.879650 4816 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 12:00:04 crc kubenswrapper[4816]: E0311 12:00:04.879728 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 12:00:12.879702508 +0000 UTC m=+99.470966515 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 12:00:04 crc kubenswrapper[4816]: E0311 12:00:04.879876 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 12:00:04 crc kubenswrapper[4816]: E0311 12:00:04.879924 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 12:00:04 crc kubenswrapper[4816]: E0311 12:00:04.879943 4816 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 12:00:04 crc kubenswrapper[4816]: E0311 12:00:04.879990 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-11 12:00:12.879975575 +0000 UTC m=+99.471239572 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 12:00:04 crc kubenswrapper[4816]: E0311 12:00:04.880485 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 12:00:04 crc kubenswrapper[4816]: E0311 12:00:04.880534 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 12:00:04 crc kubenswrapper[4816]: E0311 12:00:04.880556 4816 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 12:00:04 crc kubenswrapper[4816]: E0311 12:00:04.880625 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-11 12:00:12.880605573 +0000 UTC m=+99.471869570 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.975145 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.976146 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.976421 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.977063 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.977118 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:04Z","lastTransitionTime":"2026-03-11T12:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.080104 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.080158 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.080168 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.080183 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.080192 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:05Z","lastTransitionTime":"2026-03-11T12:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.129610 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 12:00:05 crc kubenswrapper[4816]: E0311 12:00:05.129760 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.129820 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.129991 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 12:00:05 crc kubenswrapper[4816]: E0311 12:00:05.130005 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 12:00:05 crc kubenswrapper[4816]: E0311 12:00:05.130200 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.182334 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.182379 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.182391 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.182457 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.182471 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:05Z","lastTransitionTime":"2026-03-11T12:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.285215 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.285302 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.285320 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.285347 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.285364 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:05Z","lastTransitionTime":"2026-03-11T12:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.387711 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.387766 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.387774 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.387786 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.387796 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:05Z","lastTransitionTime":"2026-03-11T12:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.489237 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.489292 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.489303 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.489318 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.489330 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:05Z","lastTransitionTime":"2026-03-11T12:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.591673 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.591726 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.591738 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.591756 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.591769 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:05Z","lastTransitionTime":"2026-03-11T12:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.693583 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.693621 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.693630 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.693643 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.693656 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:05Z","lastTransitionTime":"2026-03-11T12:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.795613 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.795663 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.795676 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.795692 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.795703 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:05Z","lastTransitionTime":"2026-03-11T12:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.897627 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.897671 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.897681 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.897702 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.897713 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:05Z","lastTransitionTime":"2026-03-11T12:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.999903 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.999944 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.999955 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.999969 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.999979 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:05Z","lastTransitionTime":"2026-03-11T12:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.102317 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.102791 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.102934 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.103057 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.103174 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:06Z","lastTransitionTime":"2026-03-11T12:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.205968 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.206019 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.206030 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.206047 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.206061 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:06Z","lastTransitionTime":"2026-03-11T12:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.309351 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.309398 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.309408 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.309424 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.309438 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:06Z","lastTransitionTime":"2026-03-11T12:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.412545 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.412585 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.412595 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.412609 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.412619 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:06Z","lastTransitionTime":"2026-03-11T12:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.515238 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.515323 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.515334 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.515346 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.515355 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:06Z","lastTransitionTime":"2026-03-11T12:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.525101 4816 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.618004 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.618041 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.618053 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.618069 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.618080 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:06Z","lastTransitionTime":"2026-03-11T12:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.721111 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.721180 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.721205 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.721235 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.721306 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:06Z","lastTransitionTime":"2026-03-11T12:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.823366 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.823406 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.823415 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.823428 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.823438 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:06Z","lastTransitionTime":"2026-03-11T12:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.926932 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.926970 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.926980 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.927012 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.927023 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:06Z","lastTransitionTime":"2026-03-11T12:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.029586 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.029663 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.029681 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.029705 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.029726 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:07Z","lastTransitionTime":"2026-03-11T12:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.129720 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.129767 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.129739 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 12:00:07 crc kubenswrapper[4816]: E0311 12:00:07.129852 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 12:00:07 crc kubenswrapper[4816]: E0311 12:00:07.130062 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 12:00:07 crc kubenswrapper[4816]: E0311 12:00:07.130110 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.132673 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.132713 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.132726 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.132743 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.132755 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:07Z","lastTransitionTime":"2026-03-11T12:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.235833 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.235902 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.235926 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.235956 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.235976 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:07Z","lastTransitionTime":"2026-03-11T12:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.339214 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.339308 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.339328 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.339839 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.339899 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:07Z","lastTransitionTime":"2026-03-11T12:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.442402 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.442624 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.442767 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.442856 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.442934 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:07Z","lastTransitionTime":"2026-03-11T12:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.546370 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.546483 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.546549 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.546574 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.546596 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:07Z","lastTransitionTime":"2026-03-11T12:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.649935 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.650009 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.650034 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.650069 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.650095 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:07Z","lastTransitionTime":"2026-03-11T12:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.753276 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.753355 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.753378 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.753409 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.753431 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:07Z","lastTransitionTime":"2026-03-11T12:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.856402 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.856435 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.856444 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.856458 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.856468 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:07Z","lastTransitionTime":"2026-03-11T12:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.958536 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.958569 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.958579 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.958591 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.958600 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:07Z","lastTransitionTime":"2026-03-11T12:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.061078 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.061108 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.061117 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.061152 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.061166 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:08Z","lastTransitionTime":"2026-03-11T12:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.163602 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.163665 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.163688 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.163716 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.163737 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:08Z","lastTransitionTime":"2026-03-11T12:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.266914 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.266962 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.266972 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.266987 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.266997 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:08Z","lastTransitionTime":"2026-03-11T12:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.370039 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.370103 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.370122 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.370150 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.370175 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:08Z","lastTransitionTime":"2026-03-11T12:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.472901 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.473048 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.473082 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.473118 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.473144 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:08Z","lastTransitionTime":"2026-03-11T12:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.576152 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.576215 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.576233 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.576290 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.576314 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:08Z","lastTransitionTime":"2026-03-11T12:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.678697 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.678737 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.678747 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.678760 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.678769 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:08Z","lastTransitionTime":"2026-03-11T12:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.782496 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.782548 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.782562 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.782579 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.782590 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:08Z","lastTransitionTime":"2026-03-11T12:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.811778 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-x2vtk"] Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.812438 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-x2vtk" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.814282 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.814283 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.815195 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.830446 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.841890 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.851367 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d80c03e-d5ef-48a2-9464-5e64b20f225e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf21a7ab5fa8cc53f0b72a0a78d73e04bbd62a213f193124a0a00b4512b022c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df758ccff81c0ef52d954903c58a6d8e01ce1498d2f4a45057799fc584c70887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c427
45f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df758ccff81c0ef52d954903c58a6d8e01ce1498d2f4a45057799fc584c70887\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.865860 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8570a1-8304-4344-ac73-7346c594a222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2fca9f57b03035a1290e3686e7b98d15f9151ad5f5b811112ad882b47cb9e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5e6ee0da068d98d88f55efae8cb0cb12fe57c85e11f5638daaa5e0f8a1f8594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6180f737a5d60df3a71764fb2eaca26d3b25306cd8653d66d0b7fab4ec7debe3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T11:59:41Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 11:59:40.721716 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 11:59:40.721954 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 11:59:40.723455 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-319748104/tls.crt::/tmp/serving-cert-319748104/tls.key\\\\\\\"\\\\nI0311 11:59:40.983979 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 11:59:40.987746 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 11:59:40.987769 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 11:59:40.987794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 11:59:40.987801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 11:59:40.995483 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 11:59:40.995533 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 11:59:40.995543 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 11:59:40.995556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0311 11:59:40.995552 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 11:59:40.995565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 11:59:40.995592 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 11:59:40.995600 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 11:59:40.996695 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T11:59:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04cdf2254cd3d070567bec1a9b10d6ffff3f5da5056b637b7d006f4ded72e56\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://789a3fa60b21759f42c2997678010f994718ce5057a3a059491bc930652d3e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://789
a3fa60b21759f42c2997678010f994718ce5057a3a059491bc930652d3e38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.885539 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.885742 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.885968 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:08 crc 
kubenswrapper[4816]: I0311 12:00:08.885983 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.886005 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.886017 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:08Z","lastTransitionTime":"2026-03-11T12:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.899736 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.911963 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.924405 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.932747 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5szhb\" (UniqueName: \"kubernetes.io/projected/6497a90c-3b50-4dba-80d3-085c57f4f567-kube-api-access-5szhb\") pod \"node-resolver-x2vtk\" (UID: \"6497a90c-3b50-4dba-80d3-085c57f4f567\") " pod="openshift-dns/node-resolver-x2vtk" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.932857 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6497a90c-3b50-4dba-80d3-085c57f4f567-hosts-file\") pod \"node-resolver-x2vtk\" (UID: \"6497a90c-3b50-4dba-80d3-085c57f4f567\") " pod="openshift-dns/node-resolver-x2vtk" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.933916 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x2vtk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6497a90c-3b50-4dba-80d3-085c57f4f567\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:08Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x2vtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.955490 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8dcc321-50db-4df1-b303-54dd8f895f54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d63fd6507da13452b1667cf5fe5f86b72ef07a8c9a57ff43d74684276a8d0633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a349dc3366c99110d521b2e0c35464b1744fada92c1c29aa973b2f60e65cc5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6cfddc1aebbcb43615aafa5620bc9fa877464b85e77a3df5367a6e93c3aa066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768d90b24f08a77bfeb4a1c0540c799d91922b7e872b166c8be18799bd274aea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ee54943851f65b50e55ee3f0da95763307da8b51fefd2a8b83985ec43c4e15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f71025c079395f8d4addd4014b4f26f907160284b902b119be08da329eec418f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f71025c079395f8d4addd4014b4f26f907160284b902b119be08da329eec418f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fc48525cefc43b0355c1e3d3c24c807755e603797651810c54c8453cdc88da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fc48525cefc43b0355c1e3d3c24c807755e603797651810c54c8453cdc88da0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://036ecc68973976bcbb1f4df6a3558e9c2606a519d39fb654b86061e0ef78d5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://036ecc68973976bcbb1f4df6a3558e9c2606a519d39fb654b86061e0ef78d5a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.989198 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.989349 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.989371 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.989405 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.989428 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:08Z","lastTransitionTime":"2026-03-11T12:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.033699 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6497a90c-3b50-4dba-80d3-085c57f4f567-hosts-file\") pod \"node-resolver-x2vtk\" (UID: \"6497a90c-3b50-4dba-80d3-085c57f4f567\") " pod="openshift-dns/node-resolver-x2vtk" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.033780 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5szhb\" (UniqueName: \"kubernetes.io/projected/6497a90c-3b50-4dba-80d3-085c57f4f567-kube-api-access-5szhb\") pod \"node-resolver-x2vtk\" (UID: \"6497a90c-3b50-4dba-80d3-085c57f4f567\") " pod="openshift-dns/node-resolver-x2vtk" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.034098 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6497a90c-3b50-4dba-80d3-085c57f4f567-hosts-file\") pod \"node-resolver-x2vtk\" (UID: \"6497a90c-3b50-4dba-80d3-085c57f4f567\") " pod="openshift-dns/node-resolver-x2vtk" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.053874 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5szhb\" (UniqueName: \"kubernetes.io/projected/6497a90c-3b50-4dba-80d3-085c57f4f567-kube-api-access-5szhb\") pod \"node-resolver-x2vtk\" (UID: \"6497a90c-3b50-4dba-80d3-085c57f4f567\") " pod="openshift-dns/node-resolver-x2vtk" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.092737 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.092802 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.092819 4816 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.092842 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.092860 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:09Z","lastTransitionTime":"2026-03-11T12:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.129895 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 12:00:09 crc kubenswrapper[4816]: E0311 12:00:09.130135 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.130191 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.130390 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 12:00:09 crc kubenswrapper[4816]: E0311 12:00:09.131532 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 12:00:09 crc kubenswrapper[4816]: E0311 12:00:09.131713 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.137138 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-x2vtk" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.173160 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-mdbt5"] Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.173513 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-zbg7x"] Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.173739 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.179760 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.180028 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.180207 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.181353 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.181680 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.182270 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-b4v82"] Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.182763 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.182842 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zbg7x" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.186018 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.188762 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.188830 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.189489 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.189673 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.189839 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.191239 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.193700 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8570a1-8304-4344-ac73-7346c594a222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2fca9f57b03035a1290e3686e7b98d15f9151ad5f5b811112ad882b47cb9e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5e6ee0da068d98d88f55efae8cb0cb12fe57c85e11f5638daaa5e0f8a1f8594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6180f737a5d60df3a71764fb2eaca26d3b25306cd8653d66d0b7fab4ec7debe3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T11:59:41Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 11:59:40.721716 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 11:59:40.721954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 11:59:40.723455 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-319748104/tls.crt::/tmp/serving-cert-319748104/tls.key\\\\\\\"\\\\nI0311 11:59:40.983979 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 11:59:40.987746 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 11:59:40.987769 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 11:59:40.987794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 11:59:40.987801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 11:59:40.995483 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 11:59:40.995533 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 11:59:40.995543 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 11:59:40.995556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0311 11:59:40.995552 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 
11:59:40.995565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 11:59:40.995592 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 11:59:40.995600 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 11:59:40.996695 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T11:59:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04cdf2254cd3d070567bec1a9b10d6ffff3f5da5056b637b7d006f4ded72e56\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://789a3fa60b21759f42c2997678010f994718ce5057a3a059491bc930652d3e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://789a3fa60b21759f42c2997678010f994718ce5057a3a059491bc930652d3e38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.195937 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.195983 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.195998 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.196020 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.196035 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:09Z","lastTransitionTime":"2026-03-11T12:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.209029 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.223836 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.242486 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mdbt5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a30d3e88-e081-4303-a202-1b7505629539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q5sxg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mdbt5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.252683 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d80c03e-d5ef-48a2-9464-5e64b20f225e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf21a7ab5fa8cc53f0b72a0a78d73e04bbd62a213f193124a0a00b4512b022c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df758ccff81c0ef52d954903c58a6d8e01ce1498d2f4a45057799fc584c70887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df758ccff81c0ef52d954903c58a6d8e01ce1498d2f4a45057799fc584c70887\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 
12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.262653 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.275143 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.286595 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.296174 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.310581 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.310621 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.310639 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 
12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.310654 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.310663 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:09Z","lastTransitionTime":"2026-03-11T12:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.318883 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x2vtk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6497a90c-3b50-4dba-80d3-085c57f4f567\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:08Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x2vtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.335958 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/020fe9c8-a66d-450b-b7b3-b83bcd2bf552-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zbg7x\" (UID: \"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\") " pod="openshift-multus/multus-additional-cni-plugins-zbg7x" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.336010 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/7fdff21c-644f-4443-a268-f98c91ea120a-proxy-tls\") pod \"machine-config-daemon-b4v82\" (UID: \"7fdff21c-644f-4443-a268-f98c91ea120a\") " pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.336030 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-os-release\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.336046 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-etc-kubernetes\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.336062 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/020fe9c8-a66d-450b-b7b3-b83bcd2bf552-cni-binary-copy\") pod \"multus-additional-cni-plugins-zbg7x\" (UID: \"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\") " pod="openshift-multus/multus-additional-cni-plugins-zbg7x" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.336224 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7fdff21c-644f-4443-a268-f98c91ea120a-mcd-auth-proxy-config\") pod \"machine-config-daemon-b4v82\" (UID: \"7fdff21c-644f-4443-a268-f98c91ea120a\") " pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.336314 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/020fe9c8-a66d-450b-b7b3-b83bcd2bf552-system-cni-dir\") pod \"multus-additional-cni-plugins-zbg7x\" (UID: \"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\") " pod="openshift-multus/multus-additional-cni-plugins-zbg7x" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.336336 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd5cm\" (UniqueName: \"kubernetes.io/projected/020fe9c8-a66d-450b-b7b3-b83bcd2bf552-kube-api-access-gd5cm\") pod \"multus-additional-cni-plugins-zbg7x\" (UID: \"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\") " pod="openshift-multus/multus-additional-cni-plugins-zbg7x" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.336374 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-system-cni-dir\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.336408 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-multus-socket-dir-parent\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.336434 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-host-run-netns\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 
12:00:09.336467 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7fdff21c-644f-4443-a268-f98c91ea120a-rootfs\") pod \"machine-config-daemon-b4v82\" (UID: \"7fdff21c-644f-4443-a268-f98c91ea120a\") " pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.336502 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-host-var-lib-kubelet\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.336559 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-multus-conf-dir\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.336638 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-hostroot\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.336678 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-multus-cni-dir\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.336697 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a30d3e88-e081-4303-a202-1b7505629539-cni-binary-copy\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.336715 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a30d3e88-e081-4303-a202-1b7505629539-multus-daemon-config\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.336732 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/020fe9c8-a66d-450b-b7b3-b83bcd2bf552-os-release\") pod \"multus-additional-cni-plugins-zbg7x\" (UID: \"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\") " pod="openshift-multus/multus-additional-cni-plugins-zbg7x" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.336832 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/020fe9c8-a66d-450b-b7b3-b83bcd2bf552-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zbg7x\" (UID: \"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\") " pod="openshift-multus/multus-additional-cni-plugins-zbg7x" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.336856 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-host-var-lib-cni-bin\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 
12:00:09.336904 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5sxg\" (UniqueName: \"kubernetes.io/projected/a30d3e88-e081-4303-a202-1b7505629539-kube-api-access-q5sxg\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.336936 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-cnibin\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.336962 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-host-var-lib-cni-multus\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.337003 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-host-run-k8s-cni-cncf-io\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.337032 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-host-run-multus-certs\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.337107 4816 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqdgk\" (UniqueName: \"kubernetes.io/projected/7fdff21c-644f-4443-a268-f98c91ea120a-kube-api-access-jqdgk\") pod \"machine-config-daemon-b4v82\" (UID: \"7fdff21c-644f-4443-a268-f98c91ea120a\") " pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.337137 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/020fe9c8-a66d-450b-b7b3-b83bcd2bf552-cnibin\") pod \"multus-additional-cni-plugins-zbg7x\" (UID: \"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\") " pod="openshift-multus/multus-additional-cni-plugins-zbg7x" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.339256 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8dcc321-50db-4df1-b303-54dd8f895f54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d63fd6507da13452b1667cf5fe5f86b72ef07a8c9a57ff43d74684276a8d0633\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a349dc3366c99110d521b2e0c35464b1744fada92c1c29aa973b2f60e65cc5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6cfddc1aebbcb43615aafa5620bc9fa877464b85e77a3df5367a6e93c3aa066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768d90b24f08a77bfeb4a1c0540c799d91922b7e872b166c8be18799bd274aea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ee54943851f65b50e55ee3f0da95763307da8b51fefd2a8b83985ec43c4e15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"reso
urce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f71025c079395f8d4addd4014b4f26f907160284b902b119be08da329eec418f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f71025c079395f8d4addd4014b4f26f907160284b902b119be08da329eec418f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fc48525cefc43b0355c1e3d3c24c807755e603797651810c54c8453cdc88da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fc48525cefc43b0355c1e3d3c24c807755e603797651810c54c8453cdc88da0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}}},{\\\"containerID\\\
":\\\"cri-o://036ecc68973976bcbb1f4df6a3558e9c2606a519d39fb654b86061e0ef78d5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://036ecc68973976bcbb1f4df6a3558e9c2606a519d39fb654b86061e0ef78d5a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.347978 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d80c03e-d5ef-48a2-9464-5e64b20f225e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf21a7ab5fa8cc53f0b72a0a78d73e04bbd62a213f193124a0a00b4512b022c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df758ccff81c0ef52d954903c58a6d8e01ce1498d2f4a45057799fc584c70887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df758ccff81c0ef52d954903c58a6d8e01ce1498d2f4a45057799fc584c70887\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.360685 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8570a1-8304-4344-ac73-7346c594a222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2fca9f57b03035a1290e3686e7b98d15f9151ad5f5b811112ad882b47cb9e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5e6ee0da068d98d88f55efae8cb0cb12fe57c85e11f5638daaa5e0f8a1f8594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6180f737a5d60df3a71764fb2eaca26d3b25306cd8653d66d0b7fab4ec7debe3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T11:59:41Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 11:59:40.721716 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 11:59:40.721954 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 11:59:40.723455 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-319748104/tls.crt::/tmp/serving-cert-319748104/tls.key\\\\\\\"\\\\nI0311 11:59:40.983979 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 11:59:40.987746 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 11:59:40.987769 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 11:59:40.987794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 11:59:40.987801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 11:59:40.995483 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 11:59:40.995533 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 11:59:40.995543 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 11:59:40.995556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0311 11:59:40.995552 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 11:59:40.995565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 11:59:40.995592 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 11:59:40.995600 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 11:59:40.996695 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T11:59:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04cdf2254cd3d070567bec1a9b10d6ffff3f5da5056b637b7d006f4ded72e56\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://789a3fa60b21759f42c2997678010f994718ce5057a3a059491bc930652d3e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://789
a3fa60b21759f42c2997678010f994718ce5057a3a059491bc930652d3e38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.371143 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.381751 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.391516 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.402043 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zbg7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zbg7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.412294 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.416303 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.416349 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:09 crc 
kubenswrapper[4816]: I0311 12:00:09.416360 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.416382 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.416403 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:09Z","lastTransitionTime":"2026-03-11T12:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.425500 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mdbt5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a30d3e88-e081-4303-a202-1b7505629539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q5sxg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mdbt5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.438418 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/020fe9c8-a66d-450b-b7b3-b83bcd2bf552-cnibin\") pod \"multus-additional-cni-plugins-zbg7x\" (UID: \"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\") " pod="openshift-multus/multus-additional-cni-plugins-zbg7x" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.438464 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/020fe9c8-a66d-450b-b7b3-b83bcd2bf552-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zbg7x\" (UID: \"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\") " pod="openshift-multus/multus-additional-cni-plugins-zbg7x" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.438483 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7fdff21c-644f-4443-a268-f98c91ea120a-proxy-tls\") pod \"machine-config-daemon-b4v82\" (UID: 
\"7fdff21c-644f-4443-a268-f98c91ea120a\") " pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.438503 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-os-release\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.438520 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-etc-kubernetes\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.438538 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/020fe9c8-a66d-450b-b7b3-b83bcd2bf552-cni-binary-copy\") pod \"multus-additional-cni-plugins-zbg7x\" (UID: \"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\") " pod="openshift-multus/multus-additional-cni-plugins-zbg7x" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.438539 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/020fe9c8-a66d-450b-b7b3-b83bcd2bf552-cnibin\") pod \"multus-additional-cni-plugins-zbg7x\" (UID: \"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\") " pod="openshift-multus/multus-additional-cni-plugins-zbg7x" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.438652 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7fdff21c-644f-4443-a268-f98c91ea120a-mcd-auth-proxy-config\") pod \"machine-config-daemon-b4v82\" (UID: \"7fdff21c-644f-4443-a268-f98c91ea120a\") " 
pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.438688 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/020fe9c8-a66d-450b-b7b3-b83bcd2bf552-system-cni-dir\") pod \"multus-additional-cni-plugins-zbg7x\" (UID: \"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\") " pod="openshift-multus/multus-additional-cni-plugins-zbg7x" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.438709 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd5cm\" (UniqueName: \"kubernetes.io/projected/020fe9c8-a66d-450b-b7b3-b83bcd2bf552-kube-api-access-gd5cm\") pod \"multus-additional-cni-plugins-zbg7x\" (UID: \"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\") " pod="openshift-multus/multus-additional-cni-plugins-zbg7x" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.438741 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-system-cni-dir\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.438760 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-multus-socket-dir-parent\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.438779 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-host-run-netns\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " 
pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.438805 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7fdff21c-644f-4443-a268-f98c91ea120a-rootfs\") pod \"machine-config-daemon-b4v82\" (UID: \"7fdff21c-644f-4443-a268-f98c91ea120a\") " pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.438824 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-host-var-lib-kubelet\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.438858 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-multus-conf-dir\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.438879 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-hostroot\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.438896 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-multus-cni-dir\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.438915 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a30d3e88-e081-4303-a202-1b7505629539-cni-binary-copy\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.438937 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a30d3e88-e081-4303-a202-1b7505629539-multus-daemon-config\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.438962 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/020fe9c8-a66d-450b-b7b3-b83bcd2bf552-os-release\") pod \"multus-additional-cni-plugins-zbg7x\" (UID: \"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\") " pod="openshift-multus/multus-additional-cni-plugins-zbg7x" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.438988 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/020fe9c8-a66d-450b-b7b3-b83bcd2bf552-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zbg7x\" (UID: \"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\") " pod="openshift-multus/multus-additional-cni-plugins-zbg7x" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.439015 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-host-var-lib-cni-bin\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.439039 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-q5sxg\" (UniqueName: \"kubernetes.io/projected/a30d3e88-e081-4303-a202-1b7505629539-kube-api-access-q5sxg\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.439064 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-cnibin\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.439086 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-host-var-lib-cni-multus\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.439105 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-host-run-k8s-cni-cncf-io\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.439128 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-host-run-multus-certs\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.439157 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqdgk\" (UniqueName: \"kubernetes.io/projected/7fdff21c-644f-4443-a268-f98c91ea120a-kube-api-access-jqdgk\") 
pod \"machine-config-daemon-b4v82\" (UID: \"7fdff21c-644f-4443-a268-f98c91ea120a\") " pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.439226 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/020fe9c8-a66d-450b-b7b3-b83bcd2bf552-cni-binary-copy\") pod \"multus-additional-cni-plugins-zbg7x\" (UID: \"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\") " pod="openshift-multus/multus-additional-cni-plugins-zbg7x" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.439855 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/020fe9c8-a66d-450b-b7b3-b83bcd2bf552-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zbg7x\" (UID: \"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\") " pod="openshift-multus/multus-additional-cni-plugins-zbg7x" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.440047 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-multus-cni-dir\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.440132 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-host-var-lib-cni-bin\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.440699 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-cnibin\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " 
pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.440735 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7fdff21c-644f-4443-a268-f98c91ea120a-mcd-auth-proxy-config\") pod \"machine-config-daemon-b4v82\" (UID: \"7fdff21c-644f-4443-a268-f98c91ea120a\") " pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.440750 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-host-var-lib-cni-multus\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.440784 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-host-run-k8s-cni-cncf-io\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.440791 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/020fe9c8-a66d-450b-b7b3-b83bcd2bf552-system-cni-dir\") pod \"multus-additional-cni-plugins-zbg7x\" (UID: \"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\") " pod="openshift-multus/multus-additional-cni-plugins-zbg7x" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.440810 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-host-run-multus-certs\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc 
kubenswrapper[4816]: I0311 12:00:09.440863 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/020fe9c8-a66d-450b-b7b3-b83bcd2bf552-os-release\") pod \"multus-additional-cni-plugins-zbg7x\" (UID: \"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\") " pod="openshift-multus/multus-additional-cni-plugins-zbg7x" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.440941 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a30d3e88-e081-4303-a202-1b7505629539-cni-binary-copy\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.441084 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-system-cni-dir\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.441143 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-multus-socket-dir-parent\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.441180 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-host-run-netns\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.441217 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: 
\"kubernetes.io/host-path/7fdff21c-644f-4443-a268-f98c91ea120a-rootfs\") pod \"machine-config-daemon-b4v82\" (UID: \"7fdff21c-644f-4443-a268-f98c91ea120a\") " pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.441295 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-host-var-lib-kubelet\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.441335 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-multus-conf-dir\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.441367 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-hostroot\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.441425 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-os-release\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.441461 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-etc-kubernetes\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " 
pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.441574 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/020fe9c8-a66d-450b-b7b3-b83bcd2bf552-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zbg7x\" (UID: \"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\") " pod="openshift-multus/multus-additional-cni-plugins-zbg7x" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.441601 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a30d3e88-e081-4303-a202-1b7505629539-multus-daemon-config\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.447613 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8dcc321-50db-4df1-b303-54dd8f895f54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d63fd6507da13452b1667cf5fe5f86b72ef07a8c9a57ff43d74684276a8d0633\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a349dc3366c99110d521b2e0c35464b1744fada92c1c29aa973b2f60e65cc5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6cfddc1aebbcb43615aafa5620bc9fa877464b85e77a3df5367a6e93c3aa066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768d90b24f08a77bfeb4a1c0540c799d91922b7e872b166c8be18799bd274aea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ee54943851f65b50e55ee3f0da95763307da8b51fefd2a8b83985ec43c4e15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resour
ces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f71025c079395f8d4addd4014b4f26f907160284b902b119be08da329eec418f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f71025c079395f8d4addd4014b4f26f907160284b902b119be08da329eec418f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fc48525cefc43b0355c1e3d3c24c807755e603797651810c54c8453cdc88da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fc48525cefc43b0355c1e3d3c24c807755e603797651810c54c8453cdc88da0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:3
6Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://036ecc68973976bcbb1f4df6a3558e9c2606a519d39fb654b86061e0ef78d5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://036ecc68973976bcbb1f4df6a3558e9c2606a519d39fb654b86061e0ef78d5a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.451271 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7fdff21c-644f-4443-a268-f98c91ea120a-proxy-tls\") pod \"machine-config-daemon-b4v82\" (UID: \"7fdff21c-644f-4443-a268-f98c91ea120a\") " pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.458687 4816 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.460560 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5sxg\" (UniqueName: \"kubernetes.io/projected/a30d3e88-e081-4303-a202-1b7505629539-kube-api-access-q5sxg\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.461566 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqdgk\" (UniqueName: \"kubernetes.io/projected/7fdff21c-644f-4443-a268-f98c91ea120a-kube-api-access-jqdgk\") pod \"machine-config-daemon-b4v82\" (UID: \"7fdff21c-644f-4443-a268-f98c91ea120a\") " pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.465436 4816 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd5cm\" (UniqueName: \"kubernetes.io/projected/020fe9c8-a66d-450b-b7b3-b83bcd2bf552-kube-api-access-gd5cm\") pod \"multus-additional-cni-plugins-zbg7x\" (UID: \"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\") " pod="openshift-multus/multus-additional-cni-plugins-zbg7x" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.469315 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.477645 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x2vtk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6497a90c-3b50-4dba-80d3-085c57f4f567\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:08Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x2vtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.486262 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fdff21c-644f-4443-a268-f98c91ea120a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqdgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqdgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b4v82\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.507571 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.514138 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.519152 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.519308 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.519413 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.519516 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.519600 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:09Z","lastTransitionTime":"2026-03-11T12:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.519930 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-x2vtk" event={"ID":"6497a90c-3b50-4dba-80d3-085c57f4f567","Type":"ContainerStarted","Data":"9c1d5c4f57d4820ed42117939aca9f75eaece467e386f188a336edeb0c931401"} Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.520027 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-x2vtk" event={"ID":"6497a90c-3b50-4dba-80d3-085c57f4f567","Type":"ContainerStarted","Data":"cad11bdd7e68667f7df7a431510b94ff411cd6b51b86a25038a8c406f07c96e3"} Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.524087 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"611e39ed4e5d7fcae87f81c718ae7237dfb72a021a40c1fe2df5131b6045a550"} Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.524578 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zbg7x" Mar 11 12:00:09 crc kubenswrapper[4816]: W0311 12:00:09.527310 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda30d3e88_e081_4303_a202_1b7505629539.slice/crio-c5a149483c2fd203a95d61febf5181396381d452588438b940fdacc6f89c4e91 WatchSource:0}: Error finding container c5a149483c2fd203a95d61febf5181396381d452588438b940fdacc6f89c4e91: Status 404 returned error can't find the container with id c5a149483c2fd203a95d61febf5181396381d452588438b940fdacc6f89c4e91 Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.530365 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d80c03e-d5ef-48a2-9464-5e64b20f225e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf21a7ab5fa8cc53f0b72a0a78d73e04bbd62a213f193124a0a00b4512b022c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df758ccff81c0ef52d954903c58a6d8e01ce1498d2f4a45057799fc584c70887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df758ccff81c0ef52d954903c58a6d8e01ce1498d2f4a45057799fc584c70887\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc 
kubenswrapper[4816]: I0311 12:00:09.536101 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-dkh2h"] Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.537949 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: W0311 12:00:09.538786 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fdff21c_644f_4443_a268_f98c91ea120a.slice/crio-2d58cf54804a0cc94bf9cfdd8ca1ca7961e514eacea1a5fd5d725dd7aeea2d32 WatchSource:0}: Error finding container 2d58cf54804a0cc94bf9cfdd8ca1ca7961e514eacea1a5fd5d725dd7aeea2d32: Status 404 returned error can't find the container with id 2d58cf54804a0cc94bf9cfdd8ca1ca7961e514eacea1a5fd5d725dd7aeea2d32 Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.540328 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.540481 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.540742 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.540779 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.542485 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.542658 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 11 12:00:09 crc 
kubenswrapper[4816]: I0311 12:00:09.544430 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.546764 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8570a1-8304-4344-ac73-7346c594a222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2fca9f57b03035a1290e3686e7b98d15f9151ad5f5b811112ad882b47cb9e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5e6ee0da068d98d88f55efae8cb0cb12fe57c85e11f5638daaa5e0f8a1f8594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6180f737a5d60df3a71764fb2eaca26d3b25306cd8653d66d0b7fab4ec7debe3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T11:59:41Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 11:59:40.721716 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 11:59:40.721954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 11:59:40.723455 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-319748104/tls.crt::/tmp/serving-cert-319748104/tls.key\\\\\\\"\\\\nI0311 11:59:40.983979 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 11:59:40.987746 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 11:59:40.987769 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 11:59:40.987794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 11:59:40.987801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 11:59:40.995483 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 11:59:40.995533 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 11:59:40.995543 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 11:59:40.995556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0311 11:59:40.995552 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 11:59:40.995565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 11:59:40.995592 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 11:59:40.995600 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 11:59:40.996695 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T11:59:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04cdf2254cd3d070567bec1a9b10d6ffff3f5da5056b637b7d006f4ded72e56\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://789a3fa60b21759f42c2997678010f994718ce5057a3a059491bc930652d3e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://789a3fa60b21759f42c2997678010f994718ce5057a3a059491bc930652d3e38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.559985 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.576861 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.591508 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zbg7x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zbg7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.602229 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.614184 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.623072 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.623105 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 
12:00:09.623121 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.623144 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.623160 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:09Z","lastTransitionTime":"2026-03-11T12:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.630602 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mdbt5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a30d3e88-e081-4303-a202-1b7505629539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q5sxg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mdbt5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.640080 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-systemd-units\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.640134 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-node-log\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.640165 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-run-ovn-kubernetes\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.640198 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-ovn-node-metrics-cert\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.640294 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-kubelet\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.640323 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-log-socket\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.640349 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-slash\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.640378 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc 
kubenswrapper[4816]: I0311 12:00:09.640423 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-cni-bin\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.640453 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj5rk\" (UniqueName: \"kubernetes.io/projected/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-kube-api-access-dj5rk\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.640560 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-run-systemd\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.640636 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-cni-netd\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.640670 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-ovnkube-config\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" 
Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.640704 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-env-overrides\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.640740 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-run-netns\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.640774 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-var-lib-openvswitch\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.640825 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-etc-openvswitch\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.640859 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-ovnkube-script-lib\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.640888 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-run-openvswitch\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.640917 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-run-ovn\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.641886 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x2vtk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6497a90c-3b50-4dba-80d3-085c57f4f567\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1d5c4f57d4820ed42117939aca9f75eaece467e386f188a336edeb0c931401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T12:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x2vtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.652933 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fdff21c-644f-4443-a268-f98c91ea120a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqdgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqdgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b4v82\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.671695 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8dcc321-50db-4df1-b303-54dd8f895f54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d63fd6507da13452b1667cf5fe5f86b72ef07a8c9a57ff43d74684276a8d0633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a349dc3366c99110d521b2e0c35464b1744fada92c1c29aa973b2f60e65cc5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6cfddc1aebbcb43615aafa5620bc9fa877464b85e77a3df5367a6e93c3aa066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768d90b24f08a77bfeb4a1c0540c799d91922b7e872b166c8be18799bd274aea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a
93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ee54943851f65b50e55ee3f0da95763307da8b51fefd2a8b83985ec43c4e15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f71025c079395f8d4addd4014b4f26f907160284b902b119be08da329eec418f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f71025c079395f8d4addd4014b4f26f907160284b902b119be08da329eec418f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fc48525cefc43b0355c1e3d3c24c807755e603797651810c54c8453cdc88da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fc48525cefc43b0355c1e3d3c24c807755e603797651810c54c8453cdc88da0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://036ecc68973976bcbb1f4df6a3558e9c2606a519d39fb654b86061e0ef78d5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://036ecc68973976bcbb1f4df6a3558e9c2606a519d39fb654b86061e0ef78d5a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:37Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.685446 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.699793 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.712353 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.721105 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x2vtk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6497a90c-3b50-4dba-80d3-085c57f4f567\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1d5c4f57d4820ed42117939aca9f75eaece467e386f188a336edeb0c931401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T12:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x2vtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.726495 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.726520 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.726530 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.726546 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.726560 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:09Z","lastTransitionTime":"2026-03-11T12:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.732570 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fdff21c-644f-4443-a268-f98c91ea120a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqdgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqdgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b4v82\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.742175 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-run-systemd\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.742207 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-cni-netd\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.742230 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-run-netns\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.742271 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-var-lib-openvswitch\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.742292 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-ovnkube-config\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.742310 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-env-overrides\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.742333 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-etc-openvswitch\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.742354 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-ovnkube-script-lib\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.742382 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-run-openvswitch\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.742425 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-run-ovn\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc 
kubenswrapper[4816]: I0311 12:00:09.742449 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-systemd-units\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.742471 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-node-log\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.742489 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-run-ovn-kubernetes\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.742509 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-ovn-node-metrics-cert\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.742533 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-kubelet\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.742550 4816 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-log-socket\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.742577 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-slash\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.742596 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.742618 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-cni-bin\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.742641 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dj5rk\" (UniqueName: \"kubernetes.io/projected/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-kube-api-access-dj5rk\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.742903 4816 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-run-systemd\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.742941 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-cni-netd\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.742963 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-run-netns\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.742987 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-var-lib-openvswitch\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.743626 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-ovnkube-config\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.743959 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-env-overrides\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.744001 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-etc-openvswitch\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.744456 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-ovnkube-script-lib\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.744503 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-run-openvswitch\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.744531 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-run-ovn\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.744554 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-systemd-units\") pod \"ovnkube-node-dkh2h\" (UID: 
\"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.744577 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-node-log\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.744612 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-run-ovn-kubernetes\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.745344 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-slash\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.745383 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-kubelet\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.745415 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-log-socket\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc 
kubenswrapper[4816]: I0311 12:00:09.745441 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.745465 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-cni-bin\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.748891 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-ovn-node-metrics-cert\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.759179 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8dcc321-50db-4df1-b303-54dd8f895f54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d63fd6507da13452b1667cf5fe5f86b72ef07a8c9a57ff43d74684276a8d0633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a349dc3366c99110d521b2e0c35464b1744fada92c1c29aa973b2f60e65cc5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6cfddc1aebbcb43615aafa5620bc9fa877464b85e77a3df5367a6e93c3aa066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768d90b24f08a77bfeb4a1c0540c799d91922b7e872b166c8be18799bd274aea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ee54943851f65b50e55ee3f0da95763307da8b51fefd2a8b83985ec43c4e15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f71025c079395f8d4addd4014b4f26f907160284b902b119be08da329eec418f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f71025c079395f8d4addd4014b4f26f907160284b902b119be08da329eec418f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fc48525cefc43b0355c1e3d3c24c807755e603797651810c54c8453cdc88da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fc48525cefc43b0355c1e3d3c24c807755e603797651810c54c8453cdc88da0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://036ecc68973976bcbb1f4df6a3558e9c2606a519d39fb654b86061e0ef78d5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://036ecc68973976bcbb1f4df6a3558e9c2606a519d39fb654b86061e0ef78d5a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.764904 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj5rk\" (UniqueName: \"kubernetes.io/projected/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-kube-api-access-dj5rk\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.775392 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://611e39ed4e5d7fcae87f81c718ae7237dfb72a021a40c1fe2df5131b6045a550\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T12:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.786756 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.796746 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d80c03e-d5ef-48a2-9464-5e64b20f225e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf21a7ab5fa8cc53f0b72a0a78d73e04bbd62a213f193124a0a00b4512b022c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df758ccff81c0ef52d954903c58a6d8e01ce1498d2f4a45057799fc584c70887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df758ccff81c0ef52d954903c58a6d8e01ce1498d2f4a45057799fc584c70887\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.809859 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8570a1-8304-4344-ac73-7346c594a222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2fca9f57b03035a1290e3686e7b98d15f9151ad5f5b811112ad882b47cb9e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5e6ee0da068d98d88f55efae8cb0cb12fe57c85e11f5638daaa5e0f8a1f8594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6180f737a5d60df3a71764fb2eaca26d3b25306cd8653d66d0b7fab4ec7debe3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T11:59:41Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 11:59:40.721716 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 11:59:40.721954 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 11:59:40.723455 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-319748104/tls.crt::/tmp/serving-cert-319748104/tls.key\\\\\\\"\\\\nI0311 11:59:40.983979 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 11:59:40.987746 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 11:59:40.987769 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 11:59:40.987794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 11:59:40.987801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 11:59:40.995483 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 11:59:40.995533 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 11:59:40.995543 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 11:59:40.995556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0311 11:59:40.995552 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 11:59:40.995565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 11:59:40.995592 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 11:59:40.995600 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 11:59:40.996695 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T11:59:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04cdf2254cd3d070567bec1a9b10d6ffff3f5da5056b637b7d006f4ded72e56\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://789a3fa60b21759f42c2997678010f994718ce5057a3a059491bc930652d3e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://789
a3fa60b21759f42c2997678010f994718ce5057a3a059491bc930652d3e38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.825196 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.829230 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.829315 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.829330 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.829359 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.829376 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:09Z","lastTransitionTime":"2026-03-11T12:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.838629 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.853405 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zbg7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zbg7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.859494 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.873161 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dkh2h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: W0311 12:00:09.877532 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fbe3bb6_8bf9_40b5_8f4f_0d136e285528.slice/crio-062d70cdf9dcd40a3c2ebd1f383f192eaa42464d705740bb35123cc3c8899d9b WatchSource:0}: Error finding container 062d70cdf9dcd40a3c2ebd1f383f192eaa42464d705740bb35123cc3c8899d9b: Status 404 returned error can't find the container with id 062d70cdf9dcd40a3c2ebd1f383f192eaa42464d705740bb35123cc3c8899d9b Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.883947 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.899344 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mdbt5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a30d3e88-e081-4303-a202-1b7505629539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q5sxg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mdbt5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.932549 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.932614 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.932629 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.932651 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.933119 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:09Z","lastTransitionTime":"2026-03-11T12:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.035869 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.035904 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.035912 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.035925 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.035935 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:10Z","lastTransitionTime":"2026-03-11T12:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.142387 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.142447 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.142464 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.142485 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.142506 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:10Z","lastTransitionTime":"2026-03-11T12:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.245593 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.245633 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.245642 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.245656 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.245665 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:10Z","lastTransitionTime":"2026-03-11T12:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.348694 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.348769 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.348789 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.348824 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.348847 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:10Z","lastTransitionTime":"2026-03-11T12:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.452342 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.452621 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.452843 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.452991 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.453130 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:10Z","lastTransitionTime":"2026-03-11T12:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.530039 4816 generic.go:334] "Generic (PLEG): container finished" podID="020fe9c8-a66d-450b-b7b3-b83bcd2bf552" containerID="d4bcc8ea702ae3bb0e9bb811442af19ea9db3e9a71ab3b143ecbd1c4e0ddc27c" exitCode=0 Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.530431 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zbg7x" event={"ID":"020fe9c8-a66d-450b-b7b3-b83bcd2bf552","Type":"ContainerDied","Data":"d4bcc8ea702ae3bb0e9bb811442af19ea9db3e9a71ab3b143ecbd1c4e0ddc27c"} Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.530688 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zbg7x" event={"ID":"020fe9c8-a66d-450b-b7b3-b83bcd2bf552","Type":"ContainerStarted","Data":"354b5b811c9401fe1d22e136dc2cd35d028058f62d6d104e3fb9da21027e38f6"} Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.532472 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mdbt5" event={"ID":"a30d3e88-e081-4303-a202-1b7505629539","Type":"ContainerStarted","Data":"cd735f189f4c35270aee2182d3a2eecb596607b294cc97d17559e7b5727e8dba"} Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.532521 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mdbt5" event={"ID":"a30d3e88-e081-4303-a202-1b7505629539","Type":"ContainerStarted","Data":"c5a149483c2fd203a95d61febf5181396381d452588438b940fdacc6f89c4e91"} Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.535623 4816 generic.go:334] "Generic (PLEG): container finished" podID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerID="8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5" exitCode=0 Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.535697 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" 
event={"ID":"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528","Type":"ContainerDied","Data":"8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5"} Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.535728 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" event={"ID":"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528","Type":"ContainerStarted","Data":"062d70cdf9dcd40a3c2ebd1f383f192eaa42464d705740bb35123cc3c8899d9b"} Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.538060 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerStarted","Data":"fdd18d04896447f4bc152e9c4aaaaefe467b16481b593ffa86a7ed44a9120a06"} Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.538103 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerStarted","Data":"fcc062c271cd12993a2f94302ad7910d23ab33f9e9c36dd18bc3d6cf66582bc2"} Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.538116 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerStarted","Data":"2d58cf54804a0cc94bf9cfdd8ca1ca7961e514eacea1a5fd5d725dd7aeea2d32"} Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.547013 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.555522 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.555562 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.555573 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.555591 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.555603 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:10Z","lastTransitionTime":"2026-03-11T12:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.562966 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.576074 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zbg7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4bcc8ea702ae3bb0e9bb811442af19ea9db3e9a71ab3b143ecbd1c4e0ddc27c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://d4bcc8ea702ae3bb0e9bb811442af19ea9db3e9a71ab3b143ecbd1c4e0ddc27c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T12:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zbg7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.628952 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dkh2h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.643629 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.661487 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.661530 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.661542 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.661560 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.661574 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:10Z","lastTransitionTime":"2026-03-11T12:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.662785 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mdbt5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a30d3e88-e081-4303-a202-1b7505629539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q5sxg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mdbt5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.680677 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.698695 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x2vtk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6497a90c-3b50-4dba-80d3-085c57f4f567\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1d5c4f57d4820ed42117939aca9f75eaece467e386f188a336edeb0c931401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T12:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x2vtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.712324 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fdff21c-644f-4443-a268-f98c91ea120a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqdgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqdgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b4v82\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.733570 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8dcc321-50db-4df1-b303-54dd8f895f54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d63fd6507da13452b1667cf5fe5f86b72ef07a8c9a57ff43d74684276a8d0633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a349dc3366c99110d521b2e0c35464b1744fada92c1c29aa973b2f60e65cc5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6cfddc1aebbcb43615aafa5620bc9fa877464b85e77a3df5367a6e93c3aa066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768d90b24f08a77bfeb4a1c0540c799d91922b7e872b166c8be18799bd274aea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a
93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ee54943851f65b50e55ee3f0da95763307da8b51fefd2a8b83985ec43c4e15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f71025c079395f8d4addd4014b4f26f907160284b902b119be08da329eec418f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f71025c079395f8d4addd4014b4f26f907160284b902b119be08da329eec418f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fc48525cefc43b0355c1e3d3c24c807755e603797651810c54c8453cdc88da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fc48525cefc43b0355c1e3d3c24c807755e603797651810c54c8453cdc88da0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://036ecc68973976bcbb1f4df6a3558e9c2606a519d39fb654b86061e0ef78d5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://036ecc68973976bcbb1f4df6a3558e9c2606a519d39fb654b86061e0ef78d5a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:37Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.747687 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://611e39ed4e5d7fcae87f81c718ae7237dfb72a021a40c1fe2df5131b6045a550\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T12:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.759819 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.764143 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.764204 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.764220 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.764268 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.764284 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:10Z","lastTransitionTime":"2026-03-11T12:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.767970 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d80c03e-d5ef-48a2-9464-5e64b20f225e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf21a7ab5fa8cc53f0b72a0a78d73e04bbd62a213f193124a0a00b4512b022c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df758ccff81c0ef52d954903c58a6d8e01ce1498d2f4a45057799fc584c70887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df758ccff81c0ef52d954903c58a6d8e01ce1498d2f4a45057799fc584c70887\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.780686 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8570a1-8304-4344-ac73-7346c594a222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2fca9f57b03035a1290e3686e7b98d15f9151ad5f5b811112ad882b47cb9e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5e6ee0da068d98d88f55efae8cb0cb12fe57c85e11f5638daaa5e0f8a1f8594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6180f737a5d60df3a71764fb2eaca26d3b25306cd8653d66d0b7fab4ec7debe3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T11:59:41Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 11:59:40.721716 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 11:59:40.721954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 11:59:40.723455 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-319748104/tls.crt::/tmp/serving-cert-319748104/tls.key\\\\\\\"\\\\nI0311 11:59:40.983979 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 11:59:40.987746 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 11:59:40.987769 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 11:59:40.987794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 11:59:40.987801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 11:59:40.995483 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 11:59:40.995533 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 11:59:40.995543 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 11:59:40.995556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0311 11:59:40.995552 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 
11:59:40.995565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 11:59:40.995592 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 11:59:40.995600 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 11:59:40.996695 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T11:59:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04cdf2254cd3d070567bec1a9b10d6ffff3f5da5056b637b7d006f4ded72e56\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://789a3fa60b21759f42c2997678010f994718ce5057a3a059491bc930652d3e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://789a3fa60b21759f42c2997678010f994718ce5057a3a059491bc930652d3e38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.790704 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d80c03e-d5ef-48a2-9464-5e64b20f225e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf21a7ab5fa8cc53f0b72a0a78d73e04bbd62a213f193124a0a00b4512b022c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df758ccff81c0ef52d954903c58a6d8e01ce1498d2f4a45057799fc584c70887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df758ccff81c0ef52d954903c58a6d8e01ce1498d2f4a45057799fc584c70887\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.808434 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8570a1-8304-4344-ac73-7346c594a222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2fca9f57b03035a1290e3686e7b98d15f9151ad5f5b811112ad882b47cb9e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5e6ee0da068d98d88f55efae8cb0cb12fe57c85e11f5638daaa5e0f8a1f8594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6180f737a5d60df3a71764fb2eaca26d3b25306cd8653d66d0b7fab4ec7debe3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T11:59:41Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 11:59:40.721716 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 11:59:40.721954 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 11:59:40.723455 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-319748104/tls.crt::/tmp/serving-cert-319748104/tls.key\\\\\\\"\\\\nI0311 11:59:40.983979 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 11:59:40.987746 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 11:59:40.987769 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 11:59:40.987794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 11:59:40.987801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 11:59:40.995483 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 11:59:40.995533 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 11:59:40.995543 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 11:59:40.995556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0311 11:59:40.995552 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 11:59:40.995565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 11:59:40.995592 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 11:59:40.995600 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 11:59:40.996695 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T11:59:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04cdf2254cd3d070567bec1a9b10d6ffff3f5da5056b637b7d006f4ded72e56\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://789a3fa60b21759f42c2997678010f994718ce5057a3a059491bc930652d3e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://789
a3fa60b21759f42c2997678010f994718ce5057a3a059491bc930652d3e38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.820373 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.830225 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.840452 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.858552 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zbg7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4bcc8ea702ae3bb0e9bb811442af19ea9db3e9a71ab3b143ecbd1c4e0ddc27c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://d4bcc8ea702ae3bb0e9bb811442af19ea9db3e9a71ab3b143ecbd1c4e0ddc27c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T12:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zbg7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.866814 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.866853 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.866887 4816 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.866904 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.866915 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:10Z","lastTransitionTime":"2026-03-11T12:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.868831 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.881792 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mdbt5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a30d3e88-e081-4303-a202-1b7505629539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd735f189f4c35270aee2182d3a2eecb596607b294cc97d17559e7b5727e8dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T12:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q5sxg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mdbt5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.902494 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T12:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dkh2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.923459 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8dcc321-50db-4df1-b303-54dd8f895f54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d63fd6507da13452b1667cf5fe5f86b72ef07a8c9a57ff43d74684276a8d0633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6
a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a349dc3366c99110d521b2e0c35464b1744fada92c1c29aa973b2f60e65cc5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6cfddc1aebbcb43615aafa5620bc9fa877464b85e77a3df5367a6e93c3aa066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768d90b24f08a77bfeb4a1c0540c799d91922b7e872b166c8be18799bd274aea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ee54943851f65b50e55ee3f0da95763307da8b51fefd2a8b83985ec43c4e15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP
\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f71025c079395f8d4addd4014b4f26f907160284b902b119be08da329eec418f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f71025c079395f8d4addd4014b4f26f907160284b902b119be08da329eec418f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fc48525cefc43b0355c1e3d3c24c807755e603797651810c54c8453cdc88da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fc48525cefc43b0355c1e3d3c24c807755e603797651810c54c8453cdc88da0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://036ecc68973976bcbb1f4df6a3558e9c2606a519d39fb654b86061e0ef78d5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://036ecc68973976bcbb1f4df6a3558e9c2606a519d39fb654b86061e0ef78d5a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.956167 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://611e39ed4e5d7fcae87f81c718ae7237dfb72a021a40c1fe2df5131b6045a550\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T12:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.968946 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.968981 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.968995 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.969010 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.969021 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:10Z","lastTransitionTime":"2026-03-11T12:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.980599 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.994028 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x2vtk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6497a90c-3b50-4dba-80d3-085c57f4f567\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1d5c4f57d4820ed42117939aca9f75eaece467e386f188a336edeb0c931401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T12:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x2vtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.009558 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fdff21c-644f-4443-a268-f98c91ea120a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd18d04896447f4bc152e9c4aaaaefe467b16481b593ffa86a7ed44a9120a06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T12:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqdgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcc062c271cd12993a2f94302ad7910d23ab33f9e9c36dd18bc3d6cf66582bc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T12:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqdgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b4v82\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 
12:00:11.072850 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.072900 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.072913 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.072932 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.072942 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:11Z","lastTransitionTime":"2026-03-11T12:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.129881 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.129946 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.129973 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 12:00:11 crc kubenswrapper[4816]: E0311 12:00:11.131018 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.131323 4816 scope.go:117] "RemoveContainer" containerID="eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd" Mar 11 12:00:11 crc kubenswrapper[4816]: E0311 12:00:11.131444 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 12:00:11 crc kubenswrapper[4816]: E0311 12:00:11.131504 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 12:00:11 crc kubenswrapper[4816]: E0311 12:00:11.131603 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.176094 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.176162 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.176179 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.176205 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.176222 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:11Z","lastTransitionTime":"2026-03-11T12:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.279709 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.279778 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.279787 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.279802 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.279830 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:11Z","lastTransitionTime":"2026-03-11T12:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.382472 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.382524 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.382535 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.382555 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.382567 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:11Z","lastTransitionTime":"2026-03-11T12:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.485692 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.485738 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.485747 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.485768 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.485782 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:11Z","lastTransitionTime":"2026-03-11T12:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.558717 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zbg7x" event={"ID":"020fe9c8-a66d-450b-b7b3-b83bcd2bf552","Type":"ContainerStarted","Data":"9783b2ea2ca98c9bef532d998d57e59ecf703e439d61a21baefc82ccd5937a56"} Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.561948 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" event={"ID":"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528","Type":"ContainerStarted","Data":"bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7"} Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.561995 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" event={"ID":"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528","Type":"ContainerStarted","Data":"9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2"} Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.562007 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" event={"ID":"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528","Type":"ContainerStarted","Data":"bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47"} Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.562017 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" event={"ID":"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528","Type":"ContainerStarted","Data":"ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b"} Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.567080 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"03089bdc2168c6be2aaace4e060f2b5242a0fd983ee808f6e61f5d7722767c13"} Mar 11 12:00:11 
crc kubenswrapper[4816]: I0311 12:00:11.567133 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"4241b51ce8ec4c60ab9d9911594f165da55eead854b434a39dd6fd18002ba112"} Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.578695 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.588428 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.588465 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.588475 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.588495 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.588508 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:11Z","lastTransitionTime":"2026-03-11T12:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.593810 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mdbt5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a30d3e88-e081-4303-a202-1b7505629539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd735f189f4c35270aee2182d3a2eecb596607b294cc97d17559e7b5727e8dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T12:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q5sxg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:
00:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mdbt5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.609954 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T12:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dkh2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.619605 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x2vtk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6497a90c-3b50-4dba-80d3-085c57f4f567\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1d5c4f57d4820ed42117939aca9f75eaece467e386f188a336edeb0c931401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\
\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T12:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x2vtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.627894 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fdff21c-644f-4443-a268-f98c91ea120a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd18d04896447f4bc152e9c4aaaaefe467b16481b593ffa86a7ed44a9120a06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T12:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqdgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcc062c271cd12993a2f94302ad7910d23ab33f9
e9c36dd18bc3d6cf66582bc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T12:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqdgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b4v82\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.645743 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8dcc321-50db-4df1-b303-54dd8f895f54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d63fd6507da13452b1667cf5fe5f86b72ef07a8c9a57ff43d74684276a8d0633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a349dc3366c99110d521b2e0c35464b1744fada92c1c29aa973b2f60e65cc5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6cfddc1aebbcb43615aafa5620bc9fa877464b85e77a3df5367a6e93c3aa066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768d90b24f08a77bfeb4a1c0540c799d91922b7e872b166c8be18799bd274aea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ee54943851f65b50e55ee3f0da95763307da8b51fefd2a8b83985ec43c4e15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f71025c079395f8d4addd4014b4f26f907160284b902b119be08da329eec418f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f71025c079395f8d4addd4014b4f26f907160284b902b119be08da329eec418f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fc48525cefc43b0355c1e3d3c24c807755e603797651810c54c8453cdc88da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fc48525cefc43b0355c1e3d3c24c807755e603797651810c54c8453cdc88da0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://036ecc68973976bcbb1f4df6a3558e9c2606a519d39fb654b86061e0ef78d5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://036ecc68973976bcbb1f4df6a3558e9c2606a519d39fb654b86061e0ef78d5a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.657000 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://611e39ed4e5d7fcae87f81c718ae7237dfb72a021a40c1fe2df5131b6045a550\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-0
3-11T12:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.667548 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.676395 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d80c03e-d5ef-48a2-9464-5e64b20f225e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf21a7ab5fa8cc53f0b72a0a78d73e04bbd62a213f193124a0a00b4512b022c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df758ccff81c0ef52d954903c58a6d8e01ce1498d2f4a45057799fc584c70887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df758ccff81c0ef52d954903c58a6d8e01ce1498d2f4a45057799fc584c70887\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.687329 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8570a1-8304-4344-ac73-7346c594a222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2fca9f57b03035a1290e3686e7b98d15f9151ad5f5b811112ad882b47cb9e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5e6ee0da068d98d88f55efae8cb0cb12fe57c85e11f5638daaa5e0f8a1f8594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6180f737a5d60df3a71764fb2eaca26d3b25306cd8653d66d0b7fab4ec7debe3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T11:59:41Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 11:59:40.721716 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 11:59:40.721954 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 11:59:40.723455 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-319748104/tls.crt::/tmp/serving-cert-319748104/tls.key\\\\\\\"\\\\nI0311 11:59:40.983979 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 11:59:40.987746 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 11:59:40.987769 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 11:59:40.987794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 11:59:40.987801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 11:59:40.995483 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 11:59:40.995533 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 11:59:40.995543 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 11:59:40.995556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0311 11:59:40.995552 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 11:59:40.995565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 11:59:40.995592 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 11:59:40.995600 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 11:59:40.996695 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T11:59:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04cdf2254cd3d070567bec1a9b10d6ffff3f5da5056b637b7d006f4ded72e56\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://789a3fa60b21759f42c2997678010f994718ce5057a3a059491bc930652d3e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://789
a3fa60b21759f42c2997678010f994718ce5057a3a059491bc930652d3e38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.690882 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.690940 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.690952 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.690978 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.690993 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:11Z","lastTransitionTime":"2026-03-11T12:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.701259 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.712785 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.735270 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zbg7x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4bcc8ea702ae3bb0e9bb811442af19ea9db3e9a71ab3b143ecbd1c4e0ddc27c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4bcc8ea702ae3bb0e9bb811442af19ea9db3e9a71ab3b143ecbd1c4e0ddc27c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T12:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9783b2ea2ca98c9bef532d998d57e59ecf703e439d61a21baefc82ccd5937a56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T12:00:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zbg7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.748917 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.756500 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d80c03e-d5ef-48a2-9464-5e64b20f225e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf21a7ab5fa8cc53f0b72a0a78d73e04bbd62a213f193124a0a00b4512b022c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df758ccff81c0ef52d954903c58a6d8e01ce1498d2f4a45057799fc584c70887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df758ccff81c0ef52d954903c58a6d8e01ce1498d2f4a45057799fc584c70887\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.769142 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8570a1-8304-4344-ac73-7346c594a222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2fca9f57b03035a1290e3686e7b98d15f9151ad5f5b811112ad882b47cb9e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5e6ee0da068d98d88f55efae8cb0cb12fe57c85e11f5638daaa5e0f8a1f8594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6180f737a5d60df3a71764fb2eaca26d3b25306cd8653d66d0b7fab4ec7debe3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T11:59:41Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 11:59:40.721716 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 11:59:40.721954 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 11:59:40.723455 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-319748104/tls.crt::/tmp/serving-cert-319748104/tls.key\\\\\\\"\\\\nI0311 11:59:40.983979 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 11:59:40.987746 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 11:59:40.987769 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 11:59:40.987794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 11:59:40.987801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 11:59:40.995483 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 11:59:40.995533 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 11:59:40.995543 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 11:59:40.995556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0311 11:59:40.995552 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 11:59:40.995565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 11:59:40.995592 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 11:59:40.995600 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 11:59:40.996695 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T11:59:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04cdf2254cd3d070567bec1a9b10d6ffff3f5da5056b637b7d006f4ded72e56\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://789a3fa60b21759f42c2997678010f994718ce5057a3a059491bc930652d3e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://789
a3fa60b21759f42c2997678010f994718ce5057a3a059491bc930652d3e38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.779449 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03089bdc2168c6be2aaace4e060f2b5242a0fd983ee808f6e61f5d7722767c13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T12:00:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4241b51ce8ec4c60ab9d9911594f165da55eead854b434a39dd6fd18002ba112\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T12:00:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:11 crc 
kubenswrapper[4816]: I0311 12:00:11.787642 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.795807 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.795927 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.796009 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.796077 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.796159 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:11Z","lastTransitionTime":"2026-03-11T12:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.797138 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.807480 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zbg7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4bcc8ea702ae3bb0e9bb811442af19ea9db3e9a71ab3b143ecbd1c4e0ddc27c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://d4bcc8ea702ae3bb0e9bb811442af19ea9db3e9a71ab3b143ecbd1c4e0ddc27c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T12:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9783b2ea2ca98c9bef532d998d57e59ecf703e439d61a21baefc82ccd5937a56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T12:00:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\
":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zbg7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.816848 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.828098 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mdbt5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a30d3e88-e081-4303-a202-1b7505629539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd735f189f4c35270aee2182d3a2eecb596607b294cc97d17559e7b5727e8dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T12:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q5sxg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mdbt5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.845398 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T12:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dkh2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.899386 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.899428 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.899439 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.899461 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.899472 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:11Z","lastTransitionTime":"2026-03-11T12:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.923223 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=12.923206298 podStartE2EDuration="12.923206298s" podCreationTimestamp="2026-03-11 11:59:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:11.920229993 +0000 UTC m=+98.511493960" watchObservedRunningTime="2026-03-11 12:00:11.923206298 +0000 UTC m=+98.514470265" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.002809 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.002862 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.002877 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.002897 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.002911 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:12Z","lastTransitionTime":"2026-03-11T12:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.044553 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podStartSLOduration=52.044520909 podStartE2EDuration="52.044520909s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:12.044364815 +0000 UTC m=+98.635628792" watchObservedRunningTime="2026-03-11 12:00:12.044520909 +0000 UTC m=+98.635784876" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.044751 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-x2vtk" podStartSLOduration=52.044747316 podStartE2EDuration="52.044747316s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:12.005130292 +0000 UTC m=+98.596394299" watchObservedRunningTime="2026-03-11 12:00:12.044747316 +0000 UTC m=+98.636011283" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.105814 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.105882 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.105894 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.105915 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.105927 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:12Z","lastTransitionTime":"2026-03-11T12:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.209477 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.209541 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.209551 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.209569 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.209580 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:12Z","lastTransitionTime":"2026-03-11T12:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.222275 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-bwrxd"] Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.223282 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-bwrxd" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.226089 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.226146 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.226602 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.226616 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.281116 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-mdbt5" podStartSLOduration=52.281095368 podStartE2EDuration="52.281095368s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:12.280514022 +0000 UTC m=+98.871778009" watchObservedRunningTime="2026-03-11 12:00:12.281095368 +0000 UTC m=+98.872359345" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.312324 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.312352 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.312363 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.312378 4816 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.312389 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:12Z","lastTransitionTime":"2026-03-11T12:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.368896 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=9.36887507 podStartE2EDuration="9.36887507s" podCreationTimestamp="2026-03-11 12:00:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:12.32135751 +0000 UTC m=+98.912621487" watchObservedRunningTime="2026-03-11 12:00:12.36887507 +0000 UTC m=+98.960139057" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.373807 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrt84\" (UniqueName: \"kubernetes.io/projected/6d97dc61-2b0e-413c-942a-b86cb01f20a1-kube-api-access-xrt84\") pod \"node-ca-bwrxd\" (UID: \"6d97dc61-2b0e-413c-942a-b86cb01f20a1\") " pod="openshift-image-registry/node-ca-bwrxd" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.373875 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6d97dc61-2b0e-413c-942a-b86cb01f20a1-host\") pod \"node-ca-bwrxd\" (UID: \"6d97dc61-2b0e-413c-942a-b86cb01f20a1\") " pod="openshift-image-registry/node-ca-bwrxd" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.373905 4816 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6d97dc61-2b0e-413c-942a-b86cb01f20a1-serviceca\") pod \"node-ca-bwrxd\" (UID: \"6d97dc61-2b0e-413c-942a-b86cb01f20a1\") " pod="openshift-image-registry/node-ca-bwrxd" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.415361 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.415420 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.415430 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.415451 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.415466 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:12Z","lastTransitionTime":"2026-03-11T12:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.425679 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xh655"] Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.426147 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xh655" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.448592 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-tt4rv"] Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.449280 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tt4rv" Mar 11 12:00:12 crc kubenswrapper[4816]: E0311 12:00:12.449379 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tt4rv" podUID="91b59d67-b771-4a57-b2a8-84303ec4d9bd" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.454764 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.474782 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.475208 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrt84\" (UniqueName: \"kubernetes.io/projected/6d97dc61-2b0e-413c-942a-b86cb01f20a1-kube-api-access-xrt84\") pod \"node-ca-bwrxd\" (UID: \"6d97dc61-2b0e-413c-942a-b86cb01f20a1\") " pod="openshift-image-registry/node-ca-bwrxd" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.475360 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6d97dc61-2b0e-413c-942a-b86cb01f20a1-host\") pod \"node-ca-bwrxd\" (UID: 
\"6d97dc61-2b0e-413c-942a-b86cb01f20a1\") " pod="openshift-image-registry/node-ca-bwrxd" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.475444 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6d97dc61-2b0e-413c-942a-b86cb01f20a1-serviceca\") pod \"node-ca-bwrxd\" (UID: \"6d97dc61-2b0e-413c-942a-b86cb01f20a1\") " pod="openshift-image-registry/node-ca-bwrxd" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.475673 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6d97dc61-2b0e-413c-942a-b86cb01f20a1-host\") pod \"node-ca-bwrxd\" (UID: \"6d97dc61-2b0e-413c-942a-b86cb01f20a1\") " pod="openshift-image-registry/node-ca-bwrxd" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.476742 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6d97dc61-2b0e-413c-942a-b86cb01f20a1-serviceca\") pod \"node-ca-bwrxd\" (UID: \"6d97dc61-2b0e-413c-942a-b86cb01f20a1\") " pod="openshift-image-registry/node-ca-bwrxd" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.517997 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.518305 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.518393 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.518541 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.518628 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:12Z","lastTransitionTime":"2026-03-11T12:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.546784 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrt84\" (UniqueName: \"kubernetes.io/projected/6d97dc61-2b0e-413c-942a-b86cb01f20a1-kube-api-access-xrt84\") pod \"node-ca-bwrxd\" (UID: \"6d97dc61-2b0e-413c-942a-b86cb01f20a1\") " pod="openshift-image-registry/node-ca-bwrxd" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.575148 4816 generic.go:334] "Generic (PLEG): container finished" podID="020fe9c8-a66d-450b-b7b3-b83bcd2bf552" containerID="9783b2ea2ca98c9bef532d998d57e59ecf703e439d61a21baefc82ccd5937a56" exitCode=0 Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.575298 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zbg7x" event={"ID":"020fe9c8-a66d-450b-b7b3-b83bcd2bf552","Type":"ContainerDied","Data":"9783b2ea2ca98c9bef532d998d57e59ecf703e439d61a21baefc82ccd5937a56"} Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.576272 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/09dd02d0-be8a-4c51-9dfd-d601d05cd866-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-xh655\" (UID: \"09dd02d0-be8a-4c51-9dfd-d601d05cd866\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xh655" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.576391 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/91b59d67-b771-4a57-b2a8-84303ec4d9bd-metrics-certs\") pod \"network-metrics-daemon-tt4rv\" (UID: \"91b59d67-b771-4a57-b2a8-84303ec4d9bd\") " pod="openshift-multus/network-metrics-daemon-tt4rv" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.576515 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pmjp\" (UniqueName: \"kubernetes.io/projected/91b59d67-b771-4a57-b2a8-84303ec4d9bd-kube-api-access-8pmjp\") pod \"network-metrics-daemon-tt4rv\" (UID: \"91b59d67-b771-4a57-b2a8-84303ec4d9bd\") " pod="openshift-multus/network-metrics-daemon-tt4rv" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.576573 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/09dd02d0-be8a-4c51-9dfd-d601d05cd866-env-overrides\") pod \"ovnkube-control-plane-749d76644c-xh655\" (UID: \"09dd02d0-be8a-4c51-9dfd-d601d05cd866\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xh655" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.576597 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8qh5\" (UniqueName: \"kubernetes.io/projected/09dd02d0-be8a-4c51-9dfd-d601d05cd866-kube-api-access-l8qh5\") pod \"ovnkube-control-plane-749d76644c-xh655\" (UID: \"09dd02d0-be8a-4c51-9dfd-d601d05cd866\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xh655" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.576632 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/09dd02d0-be8a-4c51-9dfd-d601d05cd866-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-xh655\" (UID: \"09dd02d0-be8a-4c51-9dfd-d601d05cd866\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xh655" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.583801 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" event={"ID":"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528","Type":"ContainerStarted","Data":"45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63"} Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.583963 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" event={"ID":"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528","Type":"ContainerStarted","Data":"62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46"} Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.621786 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.621821 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.621829 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.621845 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.621857 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:12Z","lastTransitionTime":"2026-03-11T12:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.677415 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pmjp\" (UniqueName: \"kubernetes.io/projected/91b59d67-b771-4a57-b2a8-84303ec4d9bd-kube-api-access-8pmjp\") pod \"network-metrics-daemon-tt4rv\" (UID: \"91b59d67-b771-4a57-b2a8-84303ec4d9bd\") " pod="openshift-multus/network-metrics-daemon-tt4rv" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.677466 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/09dd02d0-be8a-4c51-9dfd-d601d05cd866-env-overrides\") pod \"ovnkube-control-plane-749d76644c-xh655\" (UID: \"09dd02d0-be8a-4c51-9dfd-d601d05cd866\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xh655" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.677483 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8qh5\" (UniqueName: \"kubernetes.io/projected/09dd02d0-be8a-4c51-9dfd-d601d05cd866-kube-api-access-l8qh5\") pod \"ovnkube-control-plane-749d76644c-xh655\" (UID: \"09dd02d0-be8a-4c51-9dfd-d601d05cd866\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xh655" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.677624 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/09dd02d0-be8a-4c51-9dfd-d601d05cd866-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-xh655\" (UID: \"09dd02d0-be8a-4c51-9dfd-d601d05cd866\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xh655" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.677660 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/09dd02d0-be8a-4c51-9dfd-d601d05cd866-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-xh655\" (UID: \"09dd02d0-be8a-4c51-9dfd-d601d05cd866\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xh655" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.677713 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/91b59d67-b771-4a57-b2a8-84303ec4d9bd-metrics-certs\") pod \"network-metrics-daemon-tt4rv\" (UID: \"91b59d67-b771-4a57-b2a8-84303ec4d9bd\") " pod="openshift-multus/network-metrics-daemon-tt4rv" Mar 11 12:00:12 crc kubenswrapper[4816]: E0311 12:00:12.677845 4816 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 12:00:12 crc kubenswrapper[4816]: E0311 12:00:12.677928 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91b59d67-b771-4a57-b2a8-84303ec4d9bd-metrics-certs podName:91b59d67-b771-4a57-b2a8-84303ec4d9bd nodeName:}" failed. No retries permitted until 2026-03-11 12:00:13.177911383 +0000 UTC m=+99.769175350 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/91b59d67-b771-4a57-b2a8-84303ec4d9bd-metrics-certs") pod "network-metrics-daemon-tt4rv" (UID: "91b59d67-b771-4a57-b2a8-84303ec4d9bd") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.678357 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/09dd02d0-be8a-4c51-9dfd-d601d05cd866-env-overrides\") pod \"ovnkube-control-plane-749d76644c-xh655\" (UID: \"09dd02d0-be8a-4c51-9dfd-d601d05cd866\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xh655" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.679432 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/09dd02d0-be8a-4c51-9dfd-d601d05cd866-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-xh655\" (UID: \"09dd02d0-be8a-4c51-9dfd-d601d05cd866\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xh655" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.724782 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.724836 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.724848 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.724865 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.724875 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:12Z","lastTransitionTime":"2026-03-11T12:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.730775 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8qh5\" (UniqueName: \"kubernetes.io/projected/09dd02d0-be8a-4c51-9dfd-d601d05cd866-kube-api-access-l8qh5\") pod \"ovnkube-control-plane-749d76644c-xh655\" (UID: \"09dd02d0-be8a-4c51-9dfd-d601d05cd866\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xh655" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.730832 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/09dd02d0-be8a-4c51-9dfd-d601d05cd866-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-xh655\" (UID: \"09dd02d0-be8a-4c51-9dfd-d601d05cd866\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xh655" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.732578 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pmjp\" (UniqueName: \"kubernetes.io/projected/91b59d67-b771-4a57-b2a8-84303ec4d9bd-kube-api-access-8pmjp\") pod \"network-metrics-daemon-tt4rv\" (UID: \"91b59d67-b771-4a57-b2a8-84303ec4d9bd\") " pod="openshift-multus/network-metrics-daemon-tt4rv" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.738820 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xh655" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.828359 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.828391 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.828399 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.828418 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.828431 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:12Z","lastTransitionTime":"2026-03-11T12:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:12 crc kubenswrapper[4816]: W0311 12:00:12.831639 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09dd02d0_be8a_4c51_9dfd_d601d05cd866.slice/crio-b02f005abc394542d68788747b373edeeaca50785e5d5b93d39bbf1ce9bf3293 WatchSource:0}: Error finding container b02f005abc394542d68788747b373edeeaca50785e5d5b93d39bbf1ce9bf3293: Status 404 returned error can't find the container with id b02f005abc394542d68788747b373edeeaca50785e5d5b93d39bbf1ce9bf3293 Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.839182 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-bwrxd" Mar 11 12:00:12 crc kubenswrapper[4816]: W0311 12:00:12.873614 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d97dc61_2b0e_413c_942a_b86cb01f20a1.slice/crio-646d564ba5b1eb8a60d055c2aa0b71a965805d0f9bd8f69c9e444c3b7a9ecfdd WatchSource:0}: Error finding container 646d564ba5b1eb8a60d055c2aa0b71a965805d0f9bd8f69c9e444c3b7a9ecfdd: Status 404 returned error can't find the container with id 646d564ba5b1eb8a60d055c2aa0b71a965805d0f9bd8f69c9e444c3b7a9ecfdd Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.879089 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:12 crc kubenswrapper[4816]: E0311 12:00:12.879549 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:28.879513732 +0000 UTC m=+115.470777729 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.881456 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.881549 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.881637 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.881729 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 12:00:12 crc kubenswrapper[4816]: E0311 12:00:12.882017 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 12:00:12 crc kubenswrapper[4816]: E0311 12:00:12.882070 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 12:00:12 crc kubenswrapper[4816]: E0311 12:00:12.882098 4816 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 12:00:12 crc kubenswrapper[4816]: E0311 12:00:12.882208 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-11 12:00:28.882183808 +0000 UTC m=+115.473447815 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 12:00:12 crc kubenswrapper[4816]: E0311 12:00:12.882345 4816 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 12:00:12 crc kubenswrapper[4816]: E0311 12:00:12.882374 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 12:00:12 crc kubenswrapper[4816]: E0311 12:00:12.882397 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 12:00:28.882385454 +0000 UTC m=+115.473649421 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 12:00:12 crc kubenswrapper[4816]: E0311 12:00:12.882406 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 12:00:12 crc kubenswrapper[4816]: E0311 12:00:12.882428 4816 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 12:00:12 crc kubenswrapper[4816]: E0311 12:00:12.882441 4816 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 12:00:12 crc kubenswrapper[4816]: E0311 12:00:12.882466 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 12:00:28.882458136 +0000 UTC m=+115.473722103 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 12:00:12 crc kubenswrapper[4816]: E0311 12:00:12.882495 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-11 12:00:28.882471006 +0000 UTC m=+115.473735013 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.933372 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.933425 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.933440 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.933463 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.933480 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:12Z","lastTransitionTime":"2026-03-11T12:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.038432 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.038510 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.038525 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.038596 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.038611 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:13Z","lastTransitionTime":"2026-03-11T12:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.129609 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 12:00:13 crc kubenswrapper[4816]: E0311 12:00:13.130050 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.129707 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 12:00:13 crc kubenswrapper[4816]: E0311 12:00:13.130162 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.129624 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 12:00:13 crc kubenswrapper[4816]: E0311 12:00:13.130209 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.140985 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.141212 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.141333 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.141411 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.141498 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:13Z","lastTransitionTime":"2026-03-11T12:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.184618 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/91b59d67-b771-4a57-b2a8-84303ec4d9bd-metrics-certs\") pod \"network-metrics-daemon-tt4rv\" (UID: \"91b59d67-b771-4a57-b2a8-84303ec4d9bd\") " pod="openshift-multus/network-metrics-daemon-tt4rv" Mar 11 12:00:13 crc kubenswrapper[4816]: E0311 12:00:13.184803 4816 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 12:00:13 crc kubenswrapper[4816]: E0311 12:00:13.184877 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91b59d67-b771-4a57-b2a8-84303ec4d9bd-metrics-certs podName:91b59d67-b771-4a57-b2a8-84303ec4d9bd nodeName:}" failed. No retries permitted until 2026-03-11 12:00:14.184860679 +0000 UTC m=+100.776124646 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/91b59d67-b771-4a57-b2a8-84303ec4d9bd-metrics-certs") pod "network-metrics-daemon-tt4rv" (UID: "91b59d67-b771-4a57-b2a8-84303ec4d9bd") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.245054 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.245100 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.245113 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.245130 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.245140 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:13Z","lastTransitionTime":"2026-03-11T12:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.351535 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.351640 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.351672 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.351713 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.351740 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:13Z","lastTransitionTime":"2026-03-11T12:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.454446 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.454510 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.454533 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.454567 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.454586 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:13Z","lastTransitionTime":"2026-03-11T12:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.557391 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.557423 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.557435 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.557451 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.557462 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:13Z","lastTransitionTime":"2026-03-11T12:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.589308 4816 generic.go:334] "Generic (PLEG): container finished" podID="020fe9c8-a66d-450b-b7b3-b83bcd2bf552" containerID="e4827fb1c91db3692da5430b5d9b64c1e1fb86fb9225c92506d9d7149ce77fd8" exitCode=0 Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.589372 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zbg7x" event={"ID":"020fe9c8-a66d-450b-b7b3-b83bcd2bf552","Type":"ContainerDied","Data":"e4827fb1c91db3692da5430b5d9b64c1e1fb86fb9225c92506d9d7149ce77fd8"} Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.591667 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xh655" event={"ID":"09dd02d0-be8a-4c51-9dfd-d601d05cd866","Type":"ContainerStarted","Data":"76cb1fef8f63512b1532dedbed8b375110bb0e659911c9a42366bb82778856cc"} Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.591719 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xh655" event={"ID":"09dd02d0-be8a-4c51-9dfd-d601d05cd866","Type":"ContainerStarted","Data":"e59a45203ea1cde69f67b68fd8d17d80c31eb16d0008cfce0f5a04c13e1dc1b1"} Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.591731 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xh655" event={"ID":"09dd02d0-be8a-4c51-9dfd-d601d05cd866","Type":"ContainerStarted","Data":"b02f005abc394542d68788747b373edeeaca50785e5d5b93d39bbf1ce9bf3293"} Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.595822 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"ea705061b843fcacd713f916734a7471b25708babd9c3064fdb81c43ca0e292e"} Mar 11 12:00:13 crc 
kubenswrapper[4816]: I0311 12:00:13.598486 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bwrxd" event={"ID":"6d97dc61-2b0e-413c-942a-b86cb01f20a1","Type":"ContainerStarted","Data":"e98ffb21cd666e1b8334dae28dfc28dfcea09c353ad005c552c43a056fa9bb35"} Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.598558 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bwrxd" event={"ID":"6d97dc61-2b0e-413c-942a-b86cb01f20a1","Type":"ContainerStarted","Data":"646d564ba5b1eb8a60d055c2aa0b71a965805d0f9bd8f69c9e444c3b7a9ecfdd"} Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.634505 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-bwrxd" podStartSLOduration=53.634463514 podStartE2EDuration="53.634463514s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:13.633004872 +0000 UTC m=+100.224268839" watchObservedRunningTime="2026-03-11 12:00:13.634463514 +0000 UTC m=+100.225727511" Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.661143 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.661202 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.661219 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.661286 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.661308 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:13Z","lastTransitionTime":"2026-03-11T12:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.768271 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.768309 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.768321 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.768337 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.768350 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:13Z","lastTransitionTime":"2026-03-11T12:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.871027 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.871070 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.871080 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.871096 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.871107 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:13Z","lastTransitionTime":"2026-03-11T12:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.974046 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.974092 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.974105 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.974120 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.974132 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:13Z","lastTransitionTime":"2026-03-11T12:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.077403 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.077434 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.077445 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.077457 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.077467 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:14Z","lastTransitionTime":"2026-03-11T12:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.130472 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tt4rv" Mar 11 12:00:14 crc kubenswrapper[4816]: E0311 12:00:14.130604 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tt4rv" podUID="91b59d67-b771-4a57-b2a8-84303ec4d9bd" Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.180056 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.180091 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.180101 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.180114 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.180124 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:14Z","lastTransitionTime":"2026-03-11T12:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.217557 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/91b59d67-b771-4a57-b2a8-84303ec4d9bd-metrics-certs\") pod \"network-metrics-daemon-tt4rv\" (UID: \"91b59d67-b771-4a57-b2a8-84303ec4d9bd\") " pod="openshift-multus/network-metrics-daemon-tt4rv" Mar 11 12:00:14 crc kubenswrapper[4816]: E0311 12:00:14.217698 4816 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 12:00:14 crc kubenswrapper[4816]: E0311 12:00:14.217776 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91b59d67-b771-4a57-b2a8-84303ec4d9bd-metrics-certs podName:91b59d67-b771-4a57-b2a8-84303ec4d9bd nodeName:}" failed. No retries permitted until 2026-03-11 12:00:16.217757083 +0000 UTC m=+102.809021060 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/91b59d67-b771-4a57-b2a8-84303ec4d9bd-metrics-certs") pod "network-metrics-daemon-tt4rv" (UID: "91b59d67-b771-4a57-b2a8-84303ec4d9bd") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.283025 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.283117 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.283143 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.283180 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.283207 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:14Z","lastTransitionTime":"2026-03-11T12:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.355401 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.355439 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.355450 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.355465 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.355478 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:14Z","lastTransitionTime":"2026-03-11T12:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.419750 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xh655" podStartSLOduration=54.419732132 podStartE2EDuration="54.419732132s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:13.659458208 +0000 UTC m=+100.250722175" watchObservedRunningTime="2026-03-11 12:00:14.419732132 +0000 UTC m=+101.010996099"
Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.420591 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-gt7f6"]
Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.421123 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gt7f6"
Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.424464 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.424596 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.426157 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.426344 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.520775 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/f2e52741-8cc7-4b62-8b75-5cae7f35a099-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-gt7f6\" (UID: \"f2e52741-8cc7-4b62-8b75-5cae7f35a099\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gt7f6"
Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.520836 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2e52741-8cc7-4b62-8b75-5cae7f35a099-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-gt7f6\" (UID: \"f2e52741-8cc7-4b62-8b75-5cae7f35a099\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gt7f6"
Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.520889 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f2e52741-8cc7-4b62-8b75-5cae7f35a099-service-ca\") pod \"cluster-version-operator-5c965bbfc6-gt7f6\" (UID: \"f2e52741-8cc7-4b62-8b75-5cae7f35a099\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gt7f6"
Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.520963 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/f2e52741-8cc7-4b62-8b75-5cae7f35a099-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-gt7f6\" (UID: \"f2e52741-8cc7-4b62-8b75-5cae7f35a099\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gt7f6"
Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.520996 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f2e52741-8cc7-4b62-8b75-5cae7f35a099-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-gt7f6\" (UID: \"f2e52741-8cc7-4b62-8b75-5cae7f35a099\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gt7f6"
Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.606372 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" event={"ID":"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528","Type":"ContainerStarted","Data":"a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586"}
Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.608932 4816 generic.go:334] "Generic (PLEG): container finished" podID="020fe9c8-a66d-450b-b7b3-b83bcd2bf552" containerID="ec01c00f45917fb96ae9b2d32ebde4f2cc28a9e248f785ef48ad05897d866083" exitCode=0
Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.609062 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zbg7x" event={"ID":"020fe9c8-a66d-450b-b7b3-b83bcd2bf552","Type":"ContainerDied","Data":"ec01c00f45917fb96ae9b2d32ebde4f2cc28a9e248f785ef48ad05897d866083"}
Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.621463 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f2e52741-8cc7-4b62-8b75-5cae7f35a099-service-ca\") pod \"cluster-version-operator-5c965bbfc6-gt7f6\" (UID: \"f2e52741-8cc7-4b62-8b75-5cae7f35a099\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gt7f6"
Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.621542 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/f2e52741-8cc7-4b62-8b75-5cae7f35a099-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-gt7f6\" (UID: \"f2e52741-8cc7-4b62-8b75-5cae7f35a099\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gt7f6"
Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.621568 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f2e52741-8cc7-4b62-8b75-5cae7f35a099-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-gt7f6\" (UID: \"f2e52741-8cc7-4b62-8b75-5cae7f35a099\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gt7f6"
Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.621609 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/f2e52741-8cc7-4b62-8b75-5cae7f35a099-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-gt7f6\" (UID: \"f2e52741-8cc7-4b62-8b75-5cae7f35a099\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gt7f6"
Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.621629 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2e52741-8cc7-4b62-8b75-5cae7f35a099-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-gt7f6\" (UID: \"f2e52741-8cc7-4b62-8b75-5cae7f35a099\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gt7f6"
Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.621683 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/f2e52741-8cc7-4b62-8b75-5cae7f35a099-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-gt7f6\" (UID: \"f2e52741-8cc7-4b62-8b75-5cae7f35a099\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gt7f6"
Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.621865 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/f2e52741-8cc7-4b62-8b75-5cae7f35a099-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-gt7f6\" (UID: \"f2e52741-8cc7-4b62-8b75-5cae7f35a099\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gt7f6"
Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.622602 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f2e52741-8cc7-4b62-8b75-5cae7f35a099-service-ca\") pod \"cluster-version-operator-5c965bbfc6-gt7f6\" (UID: \"f2e52741-8cc7-4b62-8b75-5cae7f35a099\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gt7f6"
Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.636096 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2e52741-8cc7-4b62-8b75-5cae7f35a099-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-gt7f6\" (UID: \"f2e52741-8cc7-4b62-8b75-5cae7f35a099\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gt7f6"
Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.643867 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f2e52741-8cc7-4b62-8b75-5cae7f35a099-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-gt7f6\" (UID: \"f2e52741-8cc7-4b62-8b75-5cae7f35a099\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gt7f6"
Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.738760 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gt7f6"
Mar 11 12:00:14 crc kubenswrapper[4816]: W0311 12:00:14.759666 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2e52741_8cc7_4b62_8b75_5cae7f35a099.slice/crio-14d86cbebce3b0c032e715d559da9c520c7f26d0f3fafd903553618ffa2334ab WatchSource:0}: Error finding container 14d86cbebce3b0c032e715d559da9c520c7f26d0f3fafd903553618ffa2334ab: Status 404 returned error can't find the container with id 14d86cbebce3b0c032e715d559da9c520c7f26d0f3fafd903553618ffa2334ab
Mar 11 12:00:15 crc kubenswrapper[4816]: I0311 12:00:15.124979 4816 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates
Mar 11 12:00:15 crc kubenswrapper[4816]: I0311 12:00:15.129500 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 11 12:00:15 crc kubenswrapper[4816]: I0311 12:00:15.129522 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 11 12:00:15 crc kubenswrapper[4816]: I0311 12:00:15.129507 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 11 12:00:15 crc kubenswrapper[4816]: E0311 12:00:15.129623 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 11 12:00:15 crc kubenswrapper[4816]: E0311 12:00:15.129734 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 11 12:00:15 crc kubenswrapper[4816]: E0311 12:00:15.129794 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 11 12:00:15 crc kubenswrapper[4816]: I0311 12:00:15.133409 4816 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Mar 11 12:00:15 crc kubenswrapper[4816]: I0311 12:00:15.612861 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gt7f6" event={"ID":"f2e52741-8cc7-4b62-8b75-5cae7f35a099","Type":"ContainerStarted","Data":"96fbe9f8a8110e2f89f83793f4dee33eb351439613fc2f280ac1bef28f6ec76e"}
Mar 11 12:00:15 crc kubenswrapper[4816]: I0311 12:00:15.612905 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gt7f6" event={"ID":"f2e52741-8cc7-4b62-8b75-5cae7f35a099","Type":"ContainerStarted","Data":"14d86cbebce3b0c032e715d559da9c520c7f26d0f3fafd903553618ffa2334ab"}
Mar 11 12:00:15 crc kubenswrapper[4816]: I0311 12:00:15.616037 4816 generic.go:334] "Generic (PLEG): container finished" podID="020fe9c8-a66d-450b-b7b3-b83bcd2bf552" containerID="69a3bb3f459337a3e6d40ddd15adb16c9d0859a640da085202a485ffb360e548" exitCode=0
Mar 11 12:00:15 crc kubenswrapper[4816]: I0311 12:00:15.616063 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zbg7x" event={"ID":"020fe9c8-a66d-450b-b7b3-b83bcd2bf552","Type":"ContainerDied","Data":"69a3bb3f459337a3e6d40ddd15adb16c9d0859a640da085202a485ffb360e548"}
Mar 11 12:00:15 crc kubenswrapper[4816]: I0311 12:00:15.631757 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gt7f6" podStartSLOduration=55.631742633 podStartE2EDuration="55.631742633s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:15.63060197 +0000 UTC m=+102.221865937" watchObservedRunningTime="2026-03-11 12:00:15.631742633 +0000 UTC m=+102.223006600"
Mar 11 12:00:16 crc kubenswrapper[4816]: I0311 12:00:16.129898 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tt4rv"
Mar 11 12:00:16 crc kubenswrapper[4816]: E0311 12:00:16.130678 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tt4rv" podUID="91b59d67-b771-4a57-b2a8-84303ec4d9bd"
Mar 11 12:00:16 crc kubenswrapper[4816]: I0311 12:00:16.235540 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/91b59d67-b771-4a57-b2a8-84303ec4d9bd-metrics-certs\") pod \"network-metrics-daemon-tt4rv\" (UID: \"91b59d67-b771-4a57-b2a8-84303ec4d9bd\") " pod="openshift-multus/network-metrics-daemon-tt4rv"
Mar 11 12:00:16 crc kubenswrapper[4816]: E0311 12:00:16.235692 4816 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 11 12:00:16 crc kubenswrapper[4816]: E0311 12:00:16.235746 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91b59d67-b771-4a57-b2a8-84303ec4d9bd-metrics-certs podName:91b59d67-b771-4a57-b2a8-84303ec4d9bd nodeName:}" failed. No retries permitted until 2026-03-11 12:00:20.235731465 +0000 UTC m=+106.826995432 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/91b59d67-b771-4a57-b2a8-84303ec4d9bd-metrics-certs") pod "network-metrics-daemon-tt4rv" (UID: "91b59d67-b771-4a57-b2a8-84303ec4d9bd") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 11 12:00:16 crc kubenswrapper[4816]: I0311 12:00:16.623438 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" event={"ID":"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528","Type":"ContainerStarted","Data":"6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e"}
Mar 11 12:00:16 crc kubenswrapper[4816]: I0311 12:00:16.623744 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h"
Mar 11 12:00:16 crc kubenswrapper[4816]: I0311 12:00:16.623776 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h"
Mar 11 12:00:16 crc kubenswrapper[4816]: I0311 12:00:16.625889 4816 generic.go:334] "Generic (PLEG): container finished" podID="020fe9c8-a66d-450b-b7b3-b83bcd2bf552" containerID="8e0691bd18a723d9479fcad26099fc1053d7625ff2a03ddd90d7658027238c06" exitCode=0
Mar 11 12:00:16 crc kubenswrapper[4816]: I0311 12:00:16.625928 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zbg7x" event={"ID":"020fe9c8-a66d-450b-b7b3-b83bcd2bf552","Type":"ContainerDied","Data":"8e0691bd18a723d9479fcad26099fc1053d7625ff2a03ddd90d7658027238c06"}
Mar 11 12:00:16 crc kubenswrapper[4816]: I0311 12:00:16.658182 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" podStartSLOduration=56.658141562 podStartE2EDuration="56.658141562s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:16.653032586 +0000 UTC m=+103.244296553" watchObservedRunningTime="2026-03-11 12:00:16.658141562 +0000 UTC m=+103.249405529"
Mar 11 12:00:16 crc kubenswrapper[4816]: I0311 12:00:16.671748 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h"
Mar 11 12:00:17 crc kubenswrapper[4816]: I0311 12:00:17.129909 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 11 12:00:17 crc kubenswrapper[4816]: E0311 12:00:17.130150 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 11 12:00:17 crc kubenswrapper[4816]: I0311 12:00:17.130318 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 11 12:00:17 crc kubenswrapper[4816]: E0311 12:00:17.130417 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 11 12:00:17 crc kubenswrapper[4816]: I0311 12:00:17.130503 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 11 12:00:17 crc kubenswrapper[4816]: E0311 12:00:17.130718 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 11 12:00:17 crc kubenswrapper[4816]: I0311 12:00:17.634823 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zbg7x" event={"ID":"020fe9c8-a66d-450b-b7b3-b83bcd2bf552","Type":"ContainerStarted","Data":"abde7c4c71693e6850ace298ecc3b9148ac00443be5c61daf3e4b93bc817d793"}
Mar 11 12:00:17 crc kubenswrapper[4816]: I0311 12:00:17.635328 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h"
Mar 11 12:00:17 crc kubenswrapper[4816]: I0311 12:00:17.671065 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-zbg7x" podStartSLOduration=57.671044474 podStartE2EDuration="57.671044474s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:17.670316553 +0000 UTC m=+104.261580550" watchObservedRunningTime="2026-03-11 12:00:17.671044474 +0000 UTC m=+104.262308441"
Mar 11 12:00:17 crc kubenswrapper[4816]: I0311 12:00:17.705699 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h"
Mar 11 12:00:18 crc kubenswrapper[4816]: I0311 12:00:18.129638 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tt4rv"
Mar 11 12:00:18 crc kubenswrapper[4816]: E0311 12:00:18.129762 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tt4rv" podUID="91b59d67-b771-4a57-b2a8-84303ec4d9bd"
Mar 11 12:00:18 crc kubenswrapper[4816]: I0311 12:00:18.519381 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-tt4rv"]
Mar 11 12:00:18 crc kubenswrapper[4816]: I0311 12:00:18.637659 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tt4rv"
Mar 11 12:00:18 crc kubenswrapper[4816]: E0311 12:00:18.638435 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tt4rv" podUID="91b59d67-b771-4a57-b2a8-84303ec4d9bd"
Mar 11 12:00:19 crc kubenswrapper[4816]: I0311 12:00:19.129961 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 11 12:00:19 crc kubenswrapper[4816]: I0311 12:00:19.130003 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 11 12:00:19 crc kubenswrapper[4816]: I0311 12:00:19.130097 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 11 12:00:19 crc kubenswrapper[4816]: E0311 12:00:19.130208 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 11 12:00:19 crc kubenswrapper[4816]: E0311 12:00:19.130353 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 11 12:00:19 crc kubenswrapper[4816]: E0311 12:00:19.130607 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 11 12:00:20 crc kubenswrapper[4816]: I0311 12:00:20.130565 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tt4rv"
Mar 11 12:00:20 crc kubenswrapper[4816]: E0311 12:00:20.131120 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tt4rv" podUID="91b59d67-b771-4a57-b2a8-84303ec4d9bd"
Mar 11 12:00:20 crc kubenswrapper[4816]: I0311 12:00:20.284011 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/91b59d67-b771-4a57-b2a8-84303ec4d9bd-metrics-certs\") pod \"network-metrics-daemon-tt4rv\" (UID: \"91b59d67-b771-4a57-b2a8-84303ec4d9bd\") " pod="openshift-multus/network-metrics-daemon-tt4rv"
Mar 11 12:00:20 crc kubenswrapper[4816]: E0311 12:00:20.284176 4816 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 11 12:00:20 crc kubenswrapper[4816]: E0311 12:00:20.284231 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91b59d67-b771-4a57-b2a8-84303ec4d9bd-metrics-certs podName:91b59d67-b771-4a57-b2a8-84303ec4d9bd nodeName:}" failed. No retries permitted until 2026-03-11 12:00:28.284216247 +0000 UTC m=+114.875480214 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/91b59d67-b771-4a57-b2a8-84303ec4d9bd-metrics-certs") pod "network-metrics-daemon-tt4rv" (UID: "91b59d67-b771-4a57-b2a8-84303ec4d9bd") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 11 12:00:20 crc kubenswrapper[4816]: I0311 12:00:20.988723 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Mar 11 12:00:20 crc kubenswrapper[4816]: I0311 12:00:20.988851 4816 kubelet_node_status.go:538] "Fast updating node status as it just became ready"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.028450 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-pjsgk"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.029375 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-t5t6b"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.029523 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-pjsgk"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.029819 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.030368 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.030796 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-t5t6b"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.031231 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gc7hf"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.031823 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gc7hf"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.032307 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bz2pp"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.032592 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.035482 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-ct9ss"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.035786 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-ct9ss"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.037063 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.037153 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.037857 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.039084 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwvvh"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.039610 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.039670 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.039889 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.040016 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwvvh"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.042818 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.043068 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.045178 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nv429"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.046262 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-nv429"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.046903 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.050195 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.051946 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-x5fc4"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.052208 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.056566 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.057065 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.057621 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.058508 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.059104 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.059623 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.059651 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.063572 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.059786 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.079225 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.079325 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.079332 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x5fc4"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.079459 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.079682 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.081050 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.081129 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.081192 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.081338 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.081343 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.081144 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.081600 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.083359 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-dh658"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.084360 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-dh658"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.088643 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.088842 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.089615 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.090201 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.091957 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.092585 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2d35dfe-af6d-4c32-9e06-4650a6b1d52d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-gc7hf\" (UID: \"d2d35dfe-af6d-4c32-9e06-4650a6b1d52d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gc7hf"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.092625 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\"
(UniqueName: \"kubernetes.io/configmap/dbdb4690-7503-43ee-9e26-34af04f30235-config\") pod \"machine-approver-56656f9798-x5fc4\" (UID: \"dbdb4690-7503-43ee-9e26-34af04f30235\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x5fc4" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.092648 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/564c2921-e9eb-4a24-a5b7-1a8471d1586b-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-nv429\" (UID: \"564c2921-e9eb-4a24-a5b7-1a8471d1586b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nv429" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.092677 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3af1f0c3-1a92-49f9-beec-dff95561c5dd-encryption-config\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.092700 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t2q6\" (UniqueName: \"kubernetes.io/projected/24bf5f7b-1059-487a-95e7-ab72af29801e-kube-api-access-7t2q6\") pod \"cluster-samples-operator-665b6dd947-gwvvh\" (UID: \"24bf5f7b-1059-487a-95e7-ab72af29801e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwvvh" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.092726 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.092751 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef1d29fc-f278-4f20-8362-3c406634d8ff-serving-cert\") pod \"route-controller-manager-6576b87f9c-cdscr\" (UID: \"ef1d29fc-f278-4f20-8362-3c406634d8ff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.092780 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3af1f0c3-1a92-49f9-beec-dff95561c5dd-config\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.092804 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/17c97aa5-8179-41d7-adcb-c4da341f4cec-etcd-client\") pod \"apiserver-7bbb656c7d-r2nzn\" (UID: \"17c97aa5-8179-41d7-adcb-c4da341f4cec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.092826 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/cf7eaa86-2d32-4321-9016-e785320de3e2-images\") pod \"machine-api-operator-5694c8668f-t5t6b\" (UID: \"cf7eaa86-2d32-4321-9016-e785320de3e2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t5t6b" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.092847 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.092871 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ec67c73-6257-41dc-b848-ba547368c957-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-ct9ss\" (UID: \"7ec67c73-6257-41dc-b848-ba547368c957\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ct9ss" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.092893 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/24bf5f7b-1059-487a-95e7-ab72af29801e-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-gwvvh\" (UID: \"24bf5f7b-1059-487a-95e7-ab72af29801e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwvvh" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.092916 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3af1f0c3-1a92-49f9-beec-dff95561c5dd-etcd-client\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.092941 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3af1f0c3-1a92-49f9-beec-dff95561c5dd-audit-dir\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 
11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.092965 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.092987 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh5wr\" (UniqueName: \"kubernetes.io/projected/dbdb4690-7503-43ee-9e26-34af04f30235-kube-api-access-fh5wr\") pod \"machine-approver-56656f9798-x5fc4\" (UID: \"dbdb4690-7503-43ee-9e26-34af04f30235\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x5fc4" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093063 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3af1f0c3-1a92-49f9-beec-dff95561c5dd-image-import-ca\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093101 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2d35dfe-af6d-4c32-9e06-4650a6b1d52d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-gc7hf\" (UID: \"d2d35dfe-af6d-4c32-9e06-4650a6b1d52d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gc7hf" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093129 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" 
(UniqueName: \"kubernetes.io/configmap/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093171 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093194 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csr9c\" (UniqueName: \"kubernetes.io/projected/f0f288b8-4b39-42ac-9835-4fb118a86218-kube-api-access-csr9c\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093214 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj6fz\" (UniqueName: \"kubernetes.io/projected/7ec67c73-6257-41dc-b848-ba547368c957-kube-api-access-mj6fz\") pod \"authentication-operator-69f744f599-ct9ss\" (UID: \"7ec67c73-6257-41dc-b848-ba547368c957\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ct9ss" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093237 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ec67c73-6257-41dc-b848-ba547368c957-service-ca-bundle\") pod \"authentication-operator-69f744f599-ct9ss\" (UID: 
\"7ec67c73-6257-41dc-b848-ba547368c957\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ct9ss" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093286 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/17c97aa5-8179-41d7-adcb-c4da341f4cec-audit-dir\") pod \"apiserver-7bbb656c7d-r2nzn\" (UID: \"17c97aa5-8179-41d7-adcb-c4da341f4cec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093311 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sq894\" (UniqueName: \"kubernetes.io/projected/d2d35dfe-af6d-4c32-9e06-4650a6b1d52d-kube-api-access-sq894\") pod \"openshift-apiserver-operator-796bbdcf4f-gc7hf\" (UID: \"d2d35dfe-af6d-4c32-9e06-4650a6b1d52d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gc7hf" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093333 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzblg\" (UniqueName: \"kubernetes.io/projected/ef1d29fc-f278-4f20-8362-3c406634d8ff-kube-api-access-fzblg\") pod \"route-controller-manager-6576b87f9c-cdscr\" (UID: \"ef1d29fc-f278-4f20-8362-3c406634d8ff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093352 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3af1f0c3-1a92-49f9-beec-dff95561c5dd-node-pullsecrets\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093370 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093390 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj7ng\" (UniqueName: \"kubernetes.io/projected/17c97aa5-8179-41d7-adcb-c4da341f4cec-kube-api-access-hj7ng\") pod \"apiserver-7bbb656c7d-r2nzn\" (UID: \"17c97aa5-8179-41d7-adcb-c4da341f4cec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093413 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef1d29fc-f278-4f20-8362-3c406634d8ff-config\") pod \"route-controller-manager-6576b87f9c-cdscr\" (UID: \"ef1d29fc-f278-4f20-8362-3c406634d8ff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093433 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ec67c73-6257-41dc-b848-ba547368c957-config\") pod \"authentication-operator-69f744f599-ct9ss\" (UID: \"7ec67c73-6257-41dc-b848-ba547368c957\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ct9ss" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093453 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47r6l\" (UniqueName: \"kubernetes.io/projected/8c843417-3e01-48f9-b0b6-845fbbbf7eab-kube-api-access-47r6l\") pod 
\"downloads-7954f5f757-dh658\" (UID: \"8c843417-3e01-48f9-b0b6-845fbbbf7eab\") " pod="openshift-console/downloads-7954f5f757-dh658" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093474 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dz99z\" (UniqueName: \"kubernetes.io/projected/3af1f0c3-1a92-49f9-beec-dff95561c5dd-kube-api-access-dz99z\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093493 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f0f288b8-4b39-42ac-9835-4fb118a86218-audit-dir\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093513 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093535 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/17c97aa5-8179-41d7-adcb-c4da341f4cec-audit-policies\") pod \"apiserver-7bbb656c7d-r2nzn\" (UID: \"17c97aa5-8179-41d7-adcb-c4da341f4cec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093565 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/17c97aa5-8179-41d7-adcb-c4da341f4cec-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-r2nzn\" (UID: \"17c97aa5-8179-41d7-adcb-c4da341f4cec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093585 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cqxl\" (UniqueName: \"kubernetes.io/projected/cf7eaa86-2d32-4321-9016-e785320de3e2-kube-api-access-8cqxl\") pod \"machine-api-operator-5694c8668f-t5t6b\" (UID: \"cf7eaa86-2d32-4321-9016-e785320de3e2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t5t6b" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093604 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17c97aa5-8179-41d7-adcb-c4da341f4cec-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-r2nzn\" (UID: \"17c97aa5-8179-41d7-adcb-c4da341f4cec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093625 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3af1f0c3-1a92-49f9-beec-dff95561c5dd-serving-cert\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093645 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ec67c73-6257-41dc-b848-ba547368c957-serving-cert\") pod \"authentication-operator-69f744f599-ct9ss\" (UID: \"7ec67c73-6257-41dc-b848-ba547368c957\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-ct9ss" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093666 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/564c2921-e9eb-4a24-a5b7-1a8471d1586b-serving-cert\") pod \"controller-manager-879f6c89f-nv429\" (UID: \"564c2921-e9eb-4a24-a5b7-1a8471d1586b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nv429" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093684 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/17c97aa5-8179-41d7-adcb-c4da341f4cec-encryption-config\") pod \"apiserver-7bbb656c7d-r2nzn\" (UID: \"17c97aa5-8179-41d7-adcb-c4da341f4cec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093704 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf7eaa86-2d32-4321-9016-e785320de3e2-config\") pod \"machine-api-operator-5694c8668f-t5t6b\" (UID: \"cf7eaa86-2d32-4321-9016-e785320de3e2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t5t6b" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093735 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/dbdb4690-7503-43ee-9e26-34af04f30235-machine-approver-tls\") pod \"machine-approver-56656f9798-x5fc4\" (UID: \"dbdb4690-7503-43ee-9e26-34af04f30235\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x5fc4" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093756 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" 
(UniqueName: \"kubernetes.io/configmap/f0f288b8-4b39-42ac-9835-4fb118a86218-audit-policies\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093775 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/cf7eaa86-2d32-4321-9016-e785320de3e2-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-t5t6b\" (UID: \"cf7eaa86-2d32-4321-9016-e785320de3e2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t5t6b" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093795 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dbdb4690-7503-43ee-9e26-34af04f30235-auth-proxy-config\") pod \"machine-approver-56656f9798-x5fc4\" (UID: \"dbdb4690-7503-43ee-9e26-34af04f30235\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x5fc4" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093818 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3af1f0c3-1a92-49f9-beec-dff95561c5dd-audit\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093838 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3af1f0c3-1a92-49f9-beec-dff95561c5dd-trusted-ca-bundle\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:21 crc 
kubenswrapper[4816]: I0311 12:00:21.093862 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw5bq\" (UniqueName: \"kubernetes.io/projected/564c2921-e9eb-4a24-a5b7-1a8471d1586b-kube-api-access-xw5bq\") pod \"controller-manager-879f6c89f-nv429\" (UID: \"564c2921-e9eb-4a24-a5b7-1a8471d1586b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nv429" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093895 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17c97aa5-8179-41d7-adcb-c4da341f4cec-serving-cert\") pod \"apiserver-7bbb656c7d-r2nzn\" (UID: \"17c97aa5-8179-41d7-adcb-c4da341f4cec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093916 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093945 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3af1f0c3-1a92-49f9-beec-dff95561c5dd-etcd-serving-ca\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093967 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093989 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.094010 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.094032 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/564c2921-e9eb-4a24-a5b7-1a8471d1586b-config\") pod \"controller-manager-879f6c89f-nv429\" (UID: \"564c2921-e9eb-4a24-a5b7-1a8471d1586b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nv429" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.094054 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef1d29fc-f278-4f20-8362-3c406634d8ff-client-ca\") pod \"route-controller-manager-6576b87f9c-cdscr\" (UID: \"ef1d29fc-f278-4f20-8362-3c406634d8ff\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.094073 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/564c2921-e9eb-4a24-a5b7-1a8471d1586b-client-ca\") pod \"controller-manager-879f6c89f-nv429\" (UID: \"564c2921-e9eb-4a24-a5b7-1a8471d1586b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nv429" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.094486 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.094602 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.095014 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.097988 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.098186 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.098380 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.098557 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.098665 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 11 12:00:21 
crc kubenswrapper[4816]: I0311 12:00:21.098711 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.098758 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.098925 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.098940 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.099267 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.099357 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.099315 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.099547 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.110372 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.110605 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.110747 4816 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.110873 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.111036 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.112677 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.112859 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.112996 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.113129 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.111321 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.112678 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.114124 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.114278 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 11 12:00:21 crc kubenswrapper[4816]: 
I0311 12:00:21.115071 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-6n4qc"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.115774 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-fxsjj"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.116154 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-fxsjj" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.116505 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6n4qc" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.117711 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.117954 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.118110 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.118568 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.118953 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.119076 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.120524 4816 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.149040 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.149896 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.151553 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.151584 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.151742 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-blgl4"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.152686 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.153096 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-blgl4" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.155088 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.155627 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-mzkr9"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.156150 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.156644 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-rft5w"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.156743 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.157440 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.162131 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.162498 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.162564 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.162670 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.170644 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tgbrn"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.170970 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tgbrn" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.170990 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qwqzv"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.171138 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-mzkr9" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.171326 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-rft5w" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.172011 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.172184 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qwqzv" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.174752 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.174977 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.177010 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.177235 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.177664 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.177916 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.178080 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.178696 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.178946 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.182374 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 
12:00:21.185209 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.185473 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.185649 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.185807 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.186039 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.186266 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.186435 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.186603 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.186794 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.186946 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.187025 4816 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-image-registry/image-registry-697d97f7c8-p426k"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.187324 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.187368 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.187603 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.187691 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-rxljr"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.188079 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.188227 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rxljr" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.189428 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gc7hf"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.193334 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.193970 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.194100 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-t6j7t"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.194791 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-t6j7t" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.194992 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.195064 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l762\" (UniqueName: \"kubernetes.io/projected/f66d48af-027e-448b-9897-9f0c62fbd6c0-kube-api-access-6l762\") pod \"ingress-operator-5b745b69d9-rxljr\" (UID: \"f66d48af-027e-448b-9897-9f0c62fbd6c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rxljr" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.195102 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/564c2921-e9eb-4a24-a5b7-1a8471d1586b-config\") pod \"controller-manager-879f6c89f-nv429\" (UID: \"564c2921-e9eb-4a24-a5b7-1a8471d1586b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nv429" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.195130 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.195759 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef1d29fc-f278-4f20-8362-3c406634d8ff-client-ca\") pod \"route-controller-manager-6576b87f9c-cdscr\" (UID: \"ef1d29fc-f278-4f20-8362-3c406634d8ff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.195795 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/564c2921-e9eb-4a24-a5b7-1a8471d1586b-client-ca\") pod \"controller-manager-879f6c89f-nv429\" (UID: \"564c2921-e9eb-4a24-a5b7-1a8471d1586b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nv429" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.195820 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f66d48af-027e-448b-9897-9f0c62fbd6c0-trusted-ca\") pod \"ingress-operator-5b745b69d9-rxljr\" (UID: \"f66d48af-027e-448b-9897-9f0c62fbd6c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rxljr" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.195841 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f66d48af-027e-448b-9897-9f0c62fbd6c0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-rxljr\" (UID: \"f66d48af-027e-448b-9897-9f0c62fbd6c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rxljr" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.195861 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2d35dfe-af6d-4c32-9e06-4650a6b1d52d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-gc7hf\" (UID: \"d2d35dfe-af6d-4c32-9e06-4650a6b1d52d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gc7hf" 
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.195882 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbdb4690-7503-43ee-9e26-34af04f30235-config\") pod \"machine-approver-56656f9798-x5fc4\" (UID: \"dbdb4690-7503-43ee-9e26-34af04f30235\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x5fc4" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.195900 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/564c2921-e9eb-4a24-a5b7-1a8471d1586b-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-nv429\" (UID: \"564c2921-e9eb-4a24-a5b7-1a8471d1586b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nv429" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.195931 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3af1f0c3-1a92-49f9-beec-dff95561c5dd-encryption-config\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.195957 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t2q6\" (UniqueName: \"kubernetes.io/projected/24bf5f7b-1059-487a-95e7-ab72af29801e-kube-api-access-7t2q6\") pod \"cluster-samples-operator-665b6dd947-gwvvh\" (UID: \"24bf5f7b-1059-487a-95e7-ab72af29801e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwvvh" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.195982 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-router-certs\") pod 
\"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196004 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef1d29fc-f278-4f20-8362-3c406634d8ff-serving-cert\") pod \"route-controller-manager-6576b87f9c-cdscr\" (UID: \"ef1d29fc-f278-4f20-8362-3c406634d8ff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196026 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3af1f0c3-1a92-49f9-beec-dff95561c5dd-config\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196046 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/17c97aa5-8179-41d7-adcb-c4da341f4cec-etcd-client\") pod \"apiserver-7bbb656c7d-r2nzn\" (UID: \"17c97aa5-8179-41d7-adcb-c4da341f4cec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196071 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/cf7eaa86-2d32-4321-9016-e785320de3e2-images\") pod \"machine-api-operator-5694c8668f-t5t6b\" (UID: \"cf7eaa86-2d32-4321-9016-e785320de3e2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t5t6b" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196090 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/7ec67c73-6257-41dc-b848-ba547368c957-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-ct9ss\" (UID: \"7ec67c73-6257-41dc-b848-ba547368c957\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ct9ss" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196109 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/24bf5f7b-1059-487a-95e7-ab72af29801e-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-gwvvh\" (UID: \"24bf5f7b-1059-487a-95e7-ab72af29801e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwvvh" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196135 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196165 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3af1f0c3-1a92-49f9-beec-dff95561c5dd-etcd-client\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196186 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196209 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh5wr\" (UniqueName: \"kubernetes.io/projected/dbdb4690-7503-43ee-9e26-34af04f30235-kube-api-access-fh5wr\") pod \"machine-approver-56656f9798-x5fc4\" (UID: \"dbdb4690-7503-43ee-9e26-34af04f30235\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x5fc4" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196239 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3af1f0c3-1a92-49f9-beec-dff95561c5dd-image-import-ca\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196273 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3af1f0c3-1a92-49f9-beec-dff95561c5dd-audit-dir\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196293 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2d35dfe-af6d-4c32-9e06-4650a6b1d52d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-gc7hf\" (UID: \"d2d35dfe-af6d-4c32-9e06-4650a6b1d52d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gc7hf" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196312 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-cliconfig\") pod 
\"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196334 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196358 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csr9c\" (UniqueName: \"kubernetes.io/projected/f0f288b8-4b39-42ac-9835-4fb118a86218-kube-api-access-csr9c\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196378 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj6fz\" (UniqueName: \"kubernetes.io/projected/7ec67c73-6257-41dc-b848-ba547368c957-kube-api-access-mj6fz\") pod \"authentication-operator-69f744f599-ct9ss\" (UID: \"7ec67c73-6257-41dc-b848-ba547368c957\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ct9ss" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196400 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ec67c73-6257-41dc-b848-ba547368c957-service-ca-bundle\") pod \"authentication-operator-69f744f599-ct9ss\" (UID: \"7ec67c73-6257-41dc-b848-ba547368c957\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ct9ss" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196425 4816 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/17c97aa5-8179-41d7-adcb-c4da341f4cec-audit-dir\") pod \"apiserver-7bbb656c7d-r2nzn\" (UID: \"17c97aa5-8179-41d7-adcb-c4da341f4cec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196451 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzblg\" (UniqueName: \"kubernetes.io/projected/ef1d29fc-f278-4f20-8362-3c406634d8ff-kube-api-access-fzblg\") pod \"route-controller-manager-6576b87f9c-cdscr\" (UID: \"ef1d29fc-f278-4f20-8362-3c406634d8ff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196471 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sq894\" (UniqueName: \"kubernetes.io/projected/d2d35dfe-af6d-4c32-9e06-4650a6b1d52d-kube-api-access-sq894\") pod \"openshift-apiserver-operator-796bbdcf4f-gc7hf\" (UID: \"d2d35dfe-af6d-4c32-9e06-4650a6b1d52d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gc7hf" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196492 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3af1f0c3-1a92-49f9-beec-dff95561c5dd-node-pullsecrets\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196901 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: 
\"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196926 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hj7ng\" (UniqueName: \"kubernetes.io/projected/17c97aa5-8179-41d7-adcb-c4da341f4cec-kube-api-access-hj7ng\") pod \"apiserver-7bbb656c7d-r2nzn\" (UID: \"17c97aa5-8179-41d7-adcb-c4da341f4cec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196942 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/564c2921-e9eb-4a24-a5b7-1a8471d1586b-config\") pod \"controller-manager-879f6c89f-nv429\" (UID: \"564c2921-e9eb-4a24-a5b7-1a8471d1586b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nv429" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196964 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ec67c73-6257-41dc-b848-ba547368c957-config\") pod \"authentication-operator-69f744f599-ct9ss\" (UID: \"7ec67c73-6257-41dc-b848-ba547368c957\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ct9ss" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196988 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47r6l\" (UniqueName: \"kubernetes.io/projected/8c843417-3e01-48f9-b0b6-845fbbbf7eab-kube-api-access-47r6l\") pod \"downloads-7954f5f757-dh658\" (UID: \"8c843417-3e01-48f9-b0b6-845fbbbf7eab\") " pod="openshift-console/downloads-7954f5f757-dh658" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197012 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef1d29fc-f278-4f20-8362-3c406634d8ff-config\") pod 
\"route-controller-manager-6576b87f9c-cdscr\" (UID: \"ef1d29fc-f278-4f20-8362-3c406634d8ff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197032 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f0f288b8-4b39-42ac-9835-4fb118a86218-audit-dir\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197051 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197076 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/17c97aa5-8179-41d7-adcb-c4da341f4cec-audit-policies\") pod \"apiserver-7bbb656c7d-r2nzn\" (UID: \"17c97aa5-8179-41d7-adcb-c4da341f4cec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197093 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/17c97aa5-8179-41d7-adcb-c4da341f4cec-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-r2nzn\" (UID: \"17c97aa5-8179-41d7-adcb-c4da341f4cec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197112 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-dz99z\" (UniqueName: \"kubernetes.io/projected/3af1f0c3-1a92-49f9-beec-dff95561c5dd-kube-api-access-dz99z\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197134 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cqxl\" (UniqueName: \"kubernetes.io/projected/cf7eaa86-2d32-4321-9016-e785320de3e2-kube-api-access-8cqxl\") pod \"machine-api-operator-5694c8668f-t5t6b\" (UID: \"cf7eaa86-2d32-4321-9016-e785320de3e2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t5t6b" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197153 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3af1f0c3-1a92-49f9-beec-dff95561c5dd-serving-cert\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197170 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ec67c73-6257-41dc-b848-ba547368c957-serving-cert\") pod \"authentication-operator-69f744f599-ct9ss\" (UID: \"7ec67c73-6257-41dc-b848-ba547368c957\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ct9ss" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197189 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17c97aa5-8179-41d7-adcb-c4da341f4cec-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-r2nzn\" (UID: \"17c97aa5-8179-41d7-adcb-c4da341f4cec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197207 4816 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/17c97aa5-8179-41d7-adcb-c4da341f4cec-encryption-config\") pod \"apiserver-7bbb656c7d-r2nzn\" (UID: \"17c97aa5-8179-41d7-adcb-c4da341f4cec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197225 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf7eaa86-2d32-4321-9016-e785320de3e2-config\") pod \"machine-api-operator-5694c8668f-t5t6b\" (UID: \"cf7eaa86-2d32-4321-9016-e785320de3e2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t5t6b" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197267 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/564c2921-e9eb-4a24-a5b7-1a8471d1586b-serving-cert\") pod \"controller-manager-879f6c89f-nv429\" (UID: \"564c2921-e9eb-4a24-a5b7-1a8471d1586b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nv429" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197286 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/dbdb4690-7503-43ee-9e26-34af04f30235-machine-approver-tls\") pod \"machine-approver-56656f9798-x5fc4\" (UID: \"dbdb4690-7503-43ee-9e26-34af04f30235\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x5fc4" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197309 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f0f288b8-4b39-42ac-9835-4fb118a86218-audit-policies\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 
11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197328 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/cf7eaa86-2d32-4321-9016-e785320de3e2-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-t5t6b\" (UID: \"cf7eaa86-2d32-4321-9016-e785320de3e2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t5t6b" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197359 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dbdb4690-7503-43ee-9e26-34af04f30235-auth-proxy-config\") pod \"machine-approver-56656f9798-x5fc4\" (UID: \"dbdb4690-7503-43ee-9e26-34af04f30235\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x5fc4" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197379 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3af1f0c3-1a92-49f9-beec-dff95561c5dd-trusted-ca-bundle\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197401 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xw5bq\" (UniqueName: \"kubernetes.io/projected/564c2921-e9eb-4a24-a5b7-1a8471d1586b-kube-api-access-xw5bq\") pod \"controller-manager-879f6c89f-nv429\" (UID: \"564c2921-e9eb-4a24-a5b7-1a8471d1586b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nv429" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197491 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef1d29fc-f278-4f20-8362-3c406634d8ff-client-ca\") pod \"route-controller-manager-6576b87f9c-cdscr\" (UID: 
\"ef1d29fc-f278-4f20-8362-3c406634d8ff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197570 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3af1f0c3-1a92-49f9-beec-dff95561c5dd-audit\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197592 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197621 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17c97aa5-8179-41d7-adcb-c4da341f4cec-serving-cert\") pod \"apiserver-7bbb656c7d-r2nzn\" (UID: \"17c97aa5-8179-41d7-adcb-c4da341f4cec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197639 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3af1f0c3-1a92-49f9-beec-dff95561c5dd-etcd-serving-ca\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197659 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197696 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f66d48af-027e-448b-9897-9f0c62fbd6c0-metrics-tls\") pod \"ingress-operator-5b745b69d9-rxljr\" (UID: \"f66d48af-027e-448b-9897-9f0c62fbd6c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rxljr" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197745 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3af1f0c3-1a92-49f9-beec-dff95561c5dd-image-import-ca\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197813 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3af1f0c3-1a92-49f9-beec-dff95561c5dd-audit-dir\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.198539 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2d35dfe-af6d-4c32-9e06-4650a6b1d52d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-gc7hf\" (UID: \"d2d35dfe-af6d-4c32-9e06-4650a6b1d52d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gc7hf" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.199418 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.199758 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x96fz"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.200774 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-4kd2n"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.201330 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k74wh"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.201742 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k74wh" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.202237 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x96fz" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.202449 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4kd2n" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.205692 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ec67c73-6257-41dc-b848-ba547368c957-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-ct9ss\" (UID: \"7ec67c73-6257-41dc-b848-ba547368c957\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ct9ss" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.205763 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.205895 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/564c2921-e9eb-4a24-a5b7-1a8471d1586b-client-ca\") pod \"controller-manager-879f6c89f-nv429\" (UID: \"564c2921-e9eb-4a24-a5b7-1a8471d1586b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nv429" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.206839 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/cf7eaa86-2d32-4321-9016-e785320de3e2-images\") pod \"machine-api-operator-5694c8668f-t5t6b\" (UID: \"cf7eaa86-2d32-4321-9016-e785320de3e2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t5t6b" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.207345 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ksn6f"] Mar 11 12:00:21 crc 
kubenswrapper[4816]: I0311 12:00:21.207702 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.208513 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ksjm4"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.208898 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-zdrwx"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.209068 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/564c2921-e9eb-4a24-a5b7-1a8471d1586b-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-nv429\" (UID: \"564c2921-e9eb-4a24-a5b7-1a8471d1586b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nv429" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.209518 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2d35dfe-af6d-4c32-9e06-4650a6b1d52d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-gc7hf\" (UID: \"d2d35dfe-af6d-4c32-9e06-4650a6b1d52d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gc7hf" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.209546 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdrwx" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.209696 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ksn6f" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.211026 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ksjm4" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.211077 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbdb4690-7503-43ee-9e26-34af04f30235-config\") pod \"machine-approver-56656f9798-x5fc4\" (UID: \"dbdb4690-7503-43ee-9e26-34af04f30235\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x5fc4" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.210419 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ec67c73-6257-41dc-b848-ba547368c957-config\") pod \"authentication-operator-69f744f599-ct9ss\" (UID: \"7ec67c73-6257-41dc-b848-ba547368c957\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ct9ss" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.211336 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.211347 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3af1f0c3-1a92-49f9-beec-dff95561c5dd-encryption-config\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:21 crc 
kubenswrapper[4816]: I0311 12:00:21.211801 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ec67c73-6257-41dc-b848-ba547368c957-service-ca-bundle\") pod \"authentication-operator-69f744f599-ct9ss\" (UID: \"7ec67c73-6257-41dc-b848-ba547368c957\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ct9ss" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.211860 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/17c97aa5-8179-41d7-adcb-c4da341f4cec-audit-dir\") pod \"apiserver-7bbb656c7d-r2nzn\" (UID: \"17c97aa5-8179-41d7-adcb-c4da341f4cec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.212526 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.212588 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.212887 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3af1f0c3-1a92-49f9-beec-dff95561c5dd-node-pullsecrets\") pod \"apiserver-76f77b778f-pjsgk\" (UID: 
\"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.212964 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3af1f0c3-1a92-49f9-beec-dff95561c5dd-serving-cert\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.213422 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/17c97aa5-8179-41d7-adcb-c4da341f4cec-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-r2nzn\" (UID: \"17c97aa5-8179-41d7-adcb-c4da341f4cec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.215091 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3af1f0c3-1a92-49f9-beec-dff95561c5dd-etcd-serving-ca\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.215207 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef1d29fc-f278-4f20-8362-3c406634d8ff-config\") pod \"route-controller-manager-6576b87f9c-cdscr\" (UID: \"ef1d29fc-f278-4f20-8362-3c406634d8ff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.215282 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dbdb4690-7503-43ee-9e26-34af04f30235-auth-proxy-config\") pod \"machine-approver-56656f9798-x5fc4\" (UID: 
\"dbdb4690-7503-43ee-9e26-34af04f30235\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x5fc4" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.216042 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3af1f0c3-1a92-49f9-beec-dff95561c5dd-etcd-client\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.216518 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f0f288b8-4b39-42ac-9835-4fb118a86218-audit-policies\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.217868 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17c97aa5-8179-41d7-adcb-c4da341f4cec-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-r2nzn\" (UID: \"17c97aa5-8179-41d7-adcb-c4da341f4cec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.217880 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3af1f0c3-1a92-49f9-beec-dff95561c5dd-config\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.218671 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-t5t6b"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.218724 4816 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-ingress/router-default-5444994796-6m5gg"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.218974 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/17c97aa5-8179-41d7-adcb-c4da341f4cec-etcd-client\") pod \"apiserver-7bbb656c7d-r2nzn\" (UID: \"17c97aa5-8179-41d7-adcb-c4da341f4cec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.219382 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-gm7t5"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.219971 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gm7t5" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.220432 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-6m5gg" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.221832 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf7eaa86-2d32-4321-9016-e785320de3e2-config\") pod \"machine-api-operator-5694c8668f-t5t6b\" (UID: \"cf7eaa86-2d32-4321-9016-e785320de3e2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t5t6b" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.221893 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f0f288b8-4b39-42ac-9835-4fb118a86218-audit-dir\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.222776 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3af1f0c3-1a92-49f9-beec-dff95561c5dd-trusted-ca-bundle\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.224457 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/24bf5f7b-1059-487a-95e7-ab72af29801e-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-gwvvh\" (UID: \"24bf5f7b-1059-487a-95e7-ab72af29801e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwvvh" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.224624 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.225851 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3af1f0c3-1a92-49f9-beec-dff95561c5dd-audit\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.225937 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/cf7eaa86-2d32-4321-9016-e785320de3e2-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-t5t6b\" (UID: \"cf7eaa86-2d32-4321-9016-e785320de3e2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t5t6b" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.226466 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/17c97aa5-8179-41d7-adcb-c4da341f4cec-audit-policies\") pod \"apiserver-7bbb656c7d-r2nzn\" (UID: 
\"17c97aa5-8179-41d7-adcb-c4da341f4cec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.226620 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.227090 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ec67c73-6257-41dc-b848-ba547368c957-serving-cert\") pod \"authentication-operator-69f744f599-ct9ss\" (UID: \"7ec67c73-6257-41dc-b848-ba547368c957\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ct9ss" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.227471 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-pjsgk"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.227677 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.227744 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.242154 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwvvh"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.242230 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/17c97aa5-8179-41d7-adcb-c4da341f4cec-encryption-config\") pod \"apiserver-7bbb656c7d-r2nzn\" (UID: \"17c97aa5-8179-41d7-adcb-c4da341f4cec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.242640 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/dbdb4690-7503-43ee-9e26-34af04f30235-machine-approver-tls\") pod \"machine-approver-56656f9798-x5fc4\" (UID: \"dbdb4690-7503-43ee-9e26-34af04f30235\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x5fc4" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.242747 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.243158 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: 
I0311 12:00:21.243169 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17c97aa5-8179-41d7-adcb-c4da341f4cec-serving-cert\") pod \"apiserver-7bbb656c7d-r2nzn\" (UID: \"17c97aa5-8179-41d7-adcb-c4da341f4cec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.243606 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.244222 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.246790 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef1d29fc-f278-4f20-8362-3c406634d8ff-serving-cert\") pod \"route-controller-manager-6576b87f9c-cdscr\" (UID: \"ef1d29fc-f278-4f20-8362-3c406634d8ff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.249305 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9znd7"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.251283 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-28g7h"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.252880 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-28g7h" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.253670 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9znd7" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.254716 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.254956 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-zln7t"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.259066 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-zln7t" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.261870 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-tqt25"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.263424 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/564c2921-e9eb-4a24-a5b7-1a8471d1586b-serving-cert\") pod \"controller-manager-879f6c89f-nv429\" (UID: \"564c2921-e9eb-4a24-a5b7-1a8471d1586b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nv429" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.266903 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-wgxgk"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.270108 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-tqt25" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.272495 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.278938 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vll2h"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.279577 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wgxgk" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.280303 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6t4jp"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.280402 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vll2h" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.281199 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8gcm4"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.281317 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6t4jp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.282009 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553840-xpb52"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.282134 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.282503 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-256s6"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.282763 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553840-xpb52" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.282893 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-dh658"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.282920 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bz2pp"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.282999 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-256s6" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.283558 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.284823 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nv429"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.285859 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-ct9ss"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.286773 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-fxsjj"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.287718 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qwqzv"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.288645 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-4kd2n"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.289770 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-2ltv9"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.290353 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2ltv9" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.290862 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-bb6wh"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.291528 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.292202 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-rft5w"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.292367 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-bb6wh" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.292639 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-6n4qc"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.293625 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-gm7t5"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.294621 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-zln7t"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.295621 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-mzkr9"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.296584 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8gcm4"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.297533 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-rxljr"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.298637 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tgbrn"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.298975 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f66d48af-027e-448b-9897-9f0c62fbd6c0-metrics-tls\") pod \"ingress-operator-5b745b69d9-rxljr\" (UID: \"f66d48af-027e-448b-9897-9f0c62fbd6c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rxljr" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.299010 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-6l762\" (UniqueName: \"kubernetes.io/projected/f66d48af-027e-448b-9897-9f0c62fbd6c0-kube-api-access-6l762\") pod \"ingress-operator-5b745b69d9-rxljr\" (UID: \"f66d48af-027e-448b-9897-9f0c62fbd6c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rxljr" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.299033 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f66d48af-027e-448b-9897-9f0c62fbd6c0-trusted-ca\") pod \"ingress-operator-5b745b69d9-rxljr\" (UID: \"f66d48af-027e-448b-9897-9f0c62fbd6c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rxljr" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.299050 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f66d48af-027e-448b-9897-9f0c62fbd6c0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-rxljr\" (UID: \"f66d48af-027e-448b-9897-9f0c62fbd6c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rxljr" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.299621 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-p426k"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.300566 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-blgl4"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.301494 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9znd7"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.302388 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k74wh"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.303259 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress-canary/ingress-canary-2ltv9"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.304309 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vll2h"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.306283 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x96fz"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.307182 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-mws5d"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.307920 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-mws5d" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.308173 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wgxgk"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.309218 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-28g7h"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.310142 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ksjm4"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.311062 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-tqt25"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.312168 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-t6j7t"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.312269 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 11 
12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.313045 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ksn6f"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.313996 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-bb6wh"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.315178 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-zdrwx"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.316174 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6t4jp"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.317068 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-6bx5p"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.317725 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.318043 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553840-xpb52"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.318952 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-256s6"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.332224 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.352728 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.372459 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.391821 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.412228 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.432534 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.452998 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.481368 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.512732 4816 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.532163 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.552423 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.572302 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.585488 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f66d48af-027e-448b-9897-9f0c62fbd6c0-metrics-tls\") pod \"ingress-operator-5b745b69d9-rxljr\" (UID: \"f66d48af-027e-448b-9897-9f0c62fbd6c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rxljr" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.592180 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.620016 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.632187 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f66d48af-027e-448b-9897-9f0c62fbd6c0-trusted-ca\") pod \"ingress-operator-5b745b69d9-rxljr\" (UID: \"f66d48af-027e-448b-9897-9f0c62fbd6c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rxljr" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.634638 4816 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.653510 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.692165 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.711396 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.731119 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.751854 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.790872 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t2q6\" (UniqueName: \"kubernetes.io/projected/24bf5f7b-1059-487a-95e7-ab72af29801e-kube-api-access-7t2q6\") pod \"cluster-samples-operator-665b6dd947-gwvvh\" (UID: \"24bf5f7b-1059-487a-95e7-ab72af29801e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwvvh" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.792481 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.812622 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 
12:00:21.833544 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.852092 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.873608 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.891495 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.926387 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csr9c\" (UniqueName: \"kubernetes.io/projected/f0f288b8-4b39-42ac-9835-4fb118a86218-kube-api-access-csr9c\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.933024 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.951408 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.036181 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwvvh" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.041733 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.041987 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.042980 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.044219 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.072648 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.073241 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.076370 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh5wr\" (UniqueName: \"kubernetes.io/projected/dbdb4690-7503-43ee-9e26-34af04f30235-kube-api-access-fh5wr\") pod \"machine-approver-56656f9798-x5fc4\" (UID: \"dbdb4690-7503-43ee-9e26-34af04f30235\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x5fc4" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.093436 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.113284 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.133497 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.133826 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tt4rv" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.134218 4816 scope.go:117] "RemoveContainer" containerID="eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.153429 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.174978 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.202566 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x5fc4" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.214472 4816 request.go:700] Waited for 1.002467244s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/serviceaccounts/route-controller-manager-sa/token Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.220207 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj6fz\" (UniqueName: \"kubernetes.io/projected/7ec67c73-6257-41dc-b848-ba547368c957-kube-api-access-mj6fz\") pod \"authentication-operator-69f744f599-ct9ss\" (UID: \"7ec67c73-6257-41dc-b848-ba547368c957\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ct9ss" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.240296 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzblg\" (UniqueName: \"kubernetes.io/projected/ef1d29fc-f278-4f20-8362-3c406634d8ff-kube-api-access-fzblg\") pod \"route-controller-manager-6576b87f9c-cdscr\" (UID: 
\"ef1d29fc-f278-4f20-8362-3c406634d8ff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.250007 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sq894\" (UniqueName: \"kubernetes.io/projected/d2d35dfe-af6d-4c32-9e06-4650a6b1d52d-kube-api-access-sq894\") pod \"openshift-apiserver-operator-796bbdcf4f-gc7hf\" (UID: \"d2d35dfe-af6d-4c32-9e06-4650a6b1d52d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gc7hf" Mar 11 12:00:22 crc kubenswrapper[4816]: W0311 12:00:22.256186 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbdb4690_7503_43ee_9e26_34af04f30235.slice/crio-5a73d8356cf7871c6377033faae9b412d33fae165064c165f502eb83d942cb9c WatchSource:0}: Error finding container 5a73d8356cf7871c6377033faae9b412d33fae165064c165f502eb83d942cb9c: Status 404 returned error can't find the container with id 5a73d8356cf7871c6377033faae9b412d33fae165064c165f502eb83d942cb9c Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.270432 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj7ng\" (UniqueName: \"kubernetes.io/projected/17c97aa5-8179-41d7-adcb-c4da341f4cec-kube-api-access-hj7ng\") pod \"apiserver-7bbb656c7d-r2nzn\" (UID: \"17c97aa5-8179-41d7-adcb-c4da341f4cec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.272394 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.300590 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.301608 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwvvh"] Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.318165 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bz2pp"] Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.321402 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dz99z\" (UniqueName: \"kubernetes.io/projected/3af1f0c3-1a92-49f9-beec-dff95561c5dd-kube-api-access-dz99z\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.321601 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gc7hf" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.330183 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cqxl\" (UniqueName: \"kubernetes.io/projected/cf7eaa86-2d32-4321-9016-e785320de3e2-kube-api-access-8cqxl\") pod \"machine-api-operator-5694c8668f-t5t6b\" (UID: \"cf7eaa86-2d32-4321-9016-e785320de3e2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t5t6b" Mar 11 12:00:22 crc kubenswrapper[4816]: W0311 12:00:22.338208 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0f288b8_4b39_42ac_9835_4fb118a86218.slice/crio-b0c7d7e2fb960d418e680c2e934ecd5f41d42c08329bcf33576579240438a243 WatchSource:0}: Error finding container b0c7d7e2fb960d418e680c2e934ecd5f41d42c08329bcf33576579240438a243: Status 404 returned error can't find the container with id 
b0c7d7e2fb960d418e680c2e934ecd5f41d42c08329bcf33576579240438a243 Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.349594 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47r6l\" (UniqueName: \"kubernetes.io/projected/8c843417-3e01-48f9-b0b6-845fbbbf7eab-kube-api-access-47r6l\") pod \"downloads-7954f5f757-dh658\" (UID: \"8c843417-3e01-48f9-b0b6-845fbbbf7eab\") " pod="openshift-console/downloads-7954f5f757-dh658" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.371047 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw5bq\" (UniqueName: \"kubernetes.io/projected/564c2921-e9eb-4a24-a5b7-1a8471d1586b-kube-api-access-xw5bq\") pod \"controller-manager-879f6c89f-nv429\" (UID: \"564c2921-e9eb-4a24-a5b7-1a8471d1586b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nv429" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.372354 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.383423 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-ct9ss" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.392605 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.414310 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.433429 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.452191 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.471803 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.481931 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-nv429" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.485616 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn"] Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.490521 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.491989 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.509463 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-dh658" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.511516 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.531903 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.544035 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gc7hf"] Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.551773 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.572963 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.593810 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.596077 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.611144 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-t5t6b" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.612646 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.632714 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.649610 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-ct9ss"] Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.651961 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 11 12:00:22 crc kubenswrapper[4816]: W0311 12:00:22.661186 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ec67c73_6257_41dc_b848_ba547368c957.slice/crio-26462429ff0fe4700dd69a2524946294749803dcde6d4034003a12c17da3f2c0 WatchSource:0}: Error finding container 26462429ff0fe4700dd69a2524946294749803dcde6d4034003a12c17da3f2c0: Status 404 returned error can't find the container with id 26462429ff0fe4700dd69a2524946294749803dcde6d4034003a12c17da3f2c0 Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.661661 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwvvh" event={"ID":"24bf5f7b-1059-487a-95e7-ab72af29801e","Type":"ContainerStarted","Data":"b3518b60616560a1c089330cd245b2abd91a8eed65804675e1dad99f6d5712ce"} Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.661700 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwvvh" 
event={"ID":"24bf5f7b-1059-487a-95e7-ab72af29801e","Type":"ContainerStarted","Data":"6baeb34cd2b51a35dd0cbca03c47cac50adcd61d70cad0bfc9e72eccc859313c"} Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.663941 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gc7hf" event={"ID":"d2d35dfe-af6d-4c32-9e06-4650a6b1d52d","Type":"ContainerStarted","Data":"e69334ec2dd44c8efdfb80df538d49618111790fbe3b01ecae1f260aaba837f2"} Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.671977 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.683085 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn" event={"ID":"17c97aa5-8179-41d7-adcb-c4da341f4cec","Type":"ContainerStarted","Data":"398450a631a14a605dba9c08f23071654e7fd20ee860c55a4bbb4b0c32cdcd51"} Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.690091 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x5fc4" event={"ID":"dbdb4690-7503-43ee-9e26-34af04f30235","Type":"ContainerStarted","Data":"8324b07daaf09810f2847aa811a6467d4dd16ffa03ac6260c18053c46bcfa025"} Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.690140 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x5fc4" event={"ID":"dbdb4690-7503-43ee-9e26-34af04f30235","Type":"ContainerStarted","Data":"5a73d8356cf7871c6377033faae9b412d33fae165064c165f502eb83d942cb9c"} Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.693755 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.698299 4816 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" event={"ID":"f0f288b8-4b39-42ac-9835-4fb118a86218","Type":"ContainerStarted","Data":"e6147c8cc58ce3bae6f999bb1c2d0007faaa3cf350373703380a98dc3aa752bc"} Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.698471 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.698490 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" event={"ID":"f0f288b8-4b39-42ac-9835-4fb118a86218","Type":"ContainerStarted","Data":"b0c7d7e2fb960d418e680c2e934ecd5f41d42c08329bcf33576579240438a243"} Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.701204 4816 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-bz2pp container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" start-of-body= Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.701275 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" podUID="f0f288b8-4b39-42ac-9835-4fb118a86218" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.711654 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.713582 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 11 12:00:22 crc 
kubenswrapper[4816]: I0311 12:00:22.714980 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nv429"] Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.733969 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.750906 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"964c09610d05fa085a4adc7f7d902f67376a9168848e403cd849cfc2290dc26d"} Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.751614 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.761725 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.771825 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.785335 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr"] Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.791143 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.812396 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 11 12:00:22 crc kubenswrapper[4816]: W0311 12:00:22.827165 4816 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef1d29fc_f278_4f20_8362_3c406634d8ff.slice/crio-095fa56b3beb4f734f86a3746d97623146bfffe930c63a78d60c59c578ed0242 WatchSource:0}: Error finding container 095fa56b3beb4f734f86a3746d97623146bfffe930c63a78d60c59c578ed0242: Status 404 returned error can't find the container with id 095fa56b3beb4f734f86a3746d97623146bfffe930c63a78d60c59c578ed0242 Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.833192 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.851464 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.877338 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.887674 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-dh658"] Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.894546 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.915532 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.936696 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.945450 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-pjsgk"] Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.955780 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 11 12:00:22 crc 
kubenswrapper[4816]: I0311 12:00:22.971422 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.983708 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-t5t6b"] Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.997188 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.016780 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.032043 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.052168 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.078901 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.093037 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.114461 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.133395 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.152491 4816 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.172064 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.192293 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.212645 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.230247 4816 request.go:700] Waited for 1.939600459s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.231741 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.252334 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.271635 4816 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.292070 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.311341 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.350080 4816 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l762\" (UniqueName: \"kubernetes.io/projected/f66d48af-027e-448b-9897-9f0c62fbd6c0-kube-api-access-6l762\") pod \"ingress-operator-5b745b69d9-rxljr\" (UID: \"f66d48af-027e-448b-9897-9f0c62fbd6c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rxljr" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.366992 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f66d48af-027e-448b-9897-9f0c62fbd6c0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-rxljr\" (UID: \"f66d48af-027e-448b-9897-9f0c62fbd6c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rxljr" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.374163 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.392697 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.411850 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.431950 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.492202 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.493273 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rxljr" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.511956 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.571524 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0503425c-595f-4ff5-a7eb-c73168d939d5-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-tgbrn\" (UID: \"0503425c-595f-4ff5-a7eb-c73168d939d5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tgbrn" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.571626 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9a7e3709-d407-4679-add6-375a835421be-bound-sa-token\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.571663 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0503425c-595f-4ff5-a7eb-c73168d939d5-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-tgbrn\" (UID: \"0503425c-595f-4ff5-a7eb-c73168d939d5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tgbrn" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.571692 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bb999b74-ac20-4e84-b2c7-b16906afbf06-metrics-tls\") pod \"dns-operator-744455d44c-mzkr9\" (UID: \"bb999b74-ac20-4e84-b2c7-b16906afbf06\") 
" pod="openshift-dns-operator/dns-operator-744455d44c-mzkr9" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.571724 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/da3678d7-b440-44bd-b73b-2b04f1225094-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-qwqzv\" (UID: \"da3678d7-b440-44bd-b73b-2b04f1225094\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qwqzv" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.571753 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527-config\") pod \"etcd-operator-b45778765-rft5w\" (UID: \"c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rft5w" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.571795 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/efc988f7-8a1a-4d22-b6bb-b2617c721017-console-serving-cert\") pod \"console-f9d7485db-blgl4\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " pod="openshift-console/console-f9d7485db-blgl4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.571859 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9a7e3709-d407-4679-add6-375a835421be-installation-pull-secrets\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.571894 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" 
(UniqueName: \"kubernetes.io/secret/da3678d7-b440-44bd-b73b-2b04f1225094-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-qwqzv\" (UID: \"da3678d7-b440-44bd-b73b-2b04f1225094\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qwqzv" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.571935 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527-etcd-ca\") pod \"etcd-operator-b45778765-rft5w\" (UID: \"c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rft5w" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.571985 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b7e0b0c2-39e9-4aa5-934b-01abfe80d224-trusted-ca\") pod \"console-operator-58897d9998-fxsjj\" (UID: \"b7e0b0c2-39e9-4aa5-934b-01abfe80d224\") " pod="openshift-console-operator/console-operator-58897d9998-fxsjj" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.572017 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9a7e3709-d407-4679-add6-375a835421be-ca-trust-extracted\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.572067 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/efc988f7-8a1a-4d22-b6bb-b2617c721017-oauth-serving-cert\") pod \"console-f9d7485db-blgl4\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " pod="openshift-console/console-f9d7485db-blgl4" Mar 11 12:00:23 crc 
kubenswrapper[4816]: I0311 12:00:23.572097 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rng4c\" (UniqueName: \"kubernetes.io/projected/b7e0b0c2-39e9-4aa5-934b-01abfe80d224-kube-api-access-rng4c\") pod \"console-operator-58897d9998-fxsjj\" (UID: \"b7e0b0c2-39e9-4aa5-934b-01abfe80d224\") " pod="openshift-console-operator/console-operator-58897d9998-fxsjj" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.572158 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/efc988f7-8a1a-4d22-b6bb-b2617c721017-console-oauth-config\") pod \"console-f9d7485db-blgl4\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " pod="openshift-console/console-f9d7485db-blgl4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.572190 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/efc988f7-8a1a-4d22-b6bb-b2617c721017-trusted-ca-bundle\") pod \"console-f9d7485db-blgl4\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " pod="openshift-console/console-f9d7485db-blgl4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.572240 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/efc988f7-8a1a-4d22-b6bb-b2617c721017-console-config\") pod \"console-f9d7485db-blgl4\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " pod="openshift-console/console-f9d7485db-blgl4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.572295 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7mdm\" (UniqueName: \"kubernetes.io/projected/da3678d7-b440-44bd-b73b-2b04f1225094-kube-api-access-p7mdm\") pod 
\"cluster-image-registry-operator-dc59b4c8b-qwqzv\" (UID: \"da3678d7-b440-44bd-b73b-2b04f1225094\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qwqzv" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.572344 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527-serving-cert\") pod \"etcd-operator-b45778765-rft5w\" (UID: \"c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rft5w" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.572402 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.572440 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/efc988f7-8a1a-4d22-b6bb-b2617c721017-service-ca\") pod \"console-f9d7485db-blgl4\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " pod="openshift-console/console-f9d7485db-blgl4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.572491 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9a7e3709-d407-4679-add6-375a835421be-registry-tls\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.572523 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527-etcd-client\") pod \"etcd-operator-b45778765-rft5w\" (UID: \"c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rft5w" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.572550 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psgwk\" (UniqueName: \"kubernetes.io/projected/c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527-kube-api-access-psgwk\") pod \"etcd-operator-b45778765-rft5w\" (UID: \"c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rft5w" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.572606 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9a7e3709-d407-4679-add6-375a835421be-registry-certificates\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.572637 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527-etcd-service-ca\") pod \"etcd-operator-b45778765-rft5w\" (UID: \"c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rft5w" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.572720 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2517fd5f-6a0b-4ab4-991d-41ac3c9bd0be-serving-cert\") pod \"openshift-config-operator-7777fb866f-6n4qc\" (UID: \"2517fd5f-6a0b-4ab4-991d-41ac3c9bd0be\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-6n4qc" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.572776 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7e0b0c2-39e9-4aa5-934b-01abfe80d224-serving-cert\") pod \"console-operator-58897d9998-fxsjj\" (UID: \"b7e0b0c2-39e9-4aa5-934b-01abfe80d224\") " pod="openshift-console-operator/console-operator-58897d9998-fxsjj" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.572845 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9a7e3709-d407-4679-add6-375a835421be-trusted-ca\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.572934 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/da3678d7-b440-44bd-b73b-2b04f1225094-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-qwqzv\" (UID: \"da3678d7-b440-44bd-b73b-2b04f1225094\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qwqzv" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.573033 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qwwd\" (UniqueName: \"kubernetes.io/projected/9a7e3709-d407-4679-add6-375a835421be-kube-api-access-6qwwd\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.573071 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2517fd5f-6a0b-4ab4-991d-41ac3c9bd0be-available-featuregates\") pod \"openshift-config-operator-7777fb866f-6n4qc\" (UID: \"2517fd5f-6a0b-4ab4-991d-41ac3c9bd0be\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6n4qc" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.573104 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfq9m\" (UniqueName: \"kubernetes.io/projected/0503425c-595f-4ff5-a7eb-c73168d939d5-kube-api-access-lfq9m\") pod \"openshift-controller-manager-operator-756b6f6bc6-tgbrn\" (UID: \"0503425c-595f-4ff5-a7eb-c73168d939d5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tgbrn" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.573928 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdnwz\" (UniqueName: \"kubernetes.io/projected/2517fd5f-6a0b-4ab4-991d-41ac3c9bd0be-kube-api-access-qdnwz\") pod \"openshift-config-operator-7777fb866f-6n4qc\" (UID: \"2517fd5f-6a0b-4ab4-991d-41ac3c9bd0be\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6n4qc" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.573963 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dlk7\" (UniqueName: \"kubernetes.io/projected/bb999b74-ac20-4e84-b2c7-b16906afbf06-kube-api-access-9dlk7\") pod \"dns-operator-744455d44c-mzkr9\" (UID: \"bb999b74-ac20-4e84-b2c7-b16906afbf06\") " pod="openshift-dns-operator/dns-operator-744455d44c-mzkr9" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.573985 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7e0b0c2-39e9-4aa5-934b-01abfe80d224-config\") pod 
\"console-operator-58897d9998-fxsjj\" (UID: \"b7e0b0c2-39e9-4aa5-934b-01abfe80d224\") " pod="openshift-console-operator/console-operator-58897d9998-fxsjj" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.574005 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nngrf\" (UniqueName: \"kubernetes.io/projected/efc988f7-8a1a-4d22-b6bb-b2617c721017-kube-api-access-nngrf\") pod \"console-f9d7485db-blgl4\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " pod="openshift-console/console-f9d7485db-blgl4" Mar 11 12:00:23 crc kubenswrapper[4816]: E0311 12:00:23.575632 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:24.075611886 +0000 UTC m=+110.666875963 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.667358 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-rxljr"] Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.675118 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:23 crc 
kubenswrapper[4816]: I0311 12:00:23.675334 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/efc988f7-8a1a-4d22-b6bb-b2617c721017-console-config\") pod \"console-f9d7485db-blgl4\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " pod="openshift-console/console-f9d7485db-blgl4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.675372 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/fd35e4e1-eb63-44a5-a8e3-376a87c20de2-certs\") pod \"machine-config-server-mws5d\" (UID: \"fd35e4e1-eb63-44a5-a8e3-376a87c20de2\") " pod="openshift-machine-config-operator/machine-config-server-mws5d" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.675397 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36999e5d-2e84-4f16-8c9f-4a2c40a34cd4-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-k74wh\" (UID: \"36999e5d-2e84-4f16-8c9f-4a2c40a34cd4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k74wh" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.675419 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4b4119d5-f1a1-4d09-83c6-da7decba9ab4-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-gm7t5\" (UID: \"4b4119d5-f1a1-4d09-83c6-da7decba9ab4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gm7t5" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.675440 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a17171b-c738-4862-a2a0-cbb09219322a-config-volume\") pod \"dns-default-wgxgk\" 
(UID: \"1a17171b-c738-4862-a2a0-cbb09219322a\") " pod="openshift-dns/dns-default-wgxgk" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.675473 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4b9c2804-ee65-4a09-9985-d2345aa7f82a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-t6j7t\" (UID: \"4b9c2804-ee65-4a09-9985-d2345aa7f82a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-t6j7t" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.675493 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/00d6d506-7c84-4fef-9dc9-85f855533c06-apiservice-cert\") pod \"packageserver-d55dfcdfc-vll2h\" (UID: \"00d6d506-7c84-4fef-9dc9-85f855533c06\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vll2h" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.675513 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b74d12c-0a8c-48b1-9931-950ea6e20d4a-cert\") pod \"ingress-canary-2ltv9\" (UID: \"1b74d12c-0a8c-48b1-9931-950ea6e20d4a\") " pod="openshift-ingress-canary/ingress-canary-2ltv9" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.675547 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgbd4\" (UniqueName: \"kubernetes.io/projected/9a782b5b-9eac-4b5b-8ca8-751111b2459b-kube-api-access-zgbd4\") pod \"kube-storage-version-migrator-operator-b67b599dd-x96fz\" (UID: \"9a782b5b-9eac-4b5b-8ca8-751111b2459b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x96fz" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.675570 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57df17b9-73f2-468a-8359-5a07f19a5493-serving-cert\") pod \"service-ca-operator-777779d784-28g7h\" (UID: \"57df17b9-73f2-468a-8359-5a07f19a5493\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-28g7h" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.675591 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtlst\" (UniqueName: \"kubernetes.io/projected/57df17b9-73f2-468a-8359-5a07f19a5493-kube-api-access-rtlst\") pod \"service-ca-operator-777779d784-28g7h\" (UID: \"57df17b9-73f2-468a-8359-5a07f19a5493\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-28g7h" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.675611 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slmk9\" (UniqueName: \"kubernetes.io/projected/00d6d506-7c84-4fef-9dc9-85f855533c06-kube-api-access-slmk9\") pod \"packageserver-d55dfcdfc-vll2h\" (UID: \"00d6d506-7c84-4fef-9dc9-85f855533c06\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vll2h" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.675659 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57df17b9-73f2-468a-8359-5a07f19a5493-config\") pod \"service-ca-operator-777779d784-28g7h\" (UID: \"57df17b9-73f2-468a-8359-5a07f19a5493\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-28g7h" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.675694 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ba5682ea-6a62-4983-b525-5dc9612ad46d-socket-dir\") pod \"csi-hostpathplugin-bb6wh\" (UID: 
\"ba5682ea-6a62-4983-b525-5dc9612ad46d\") " pod="hostpath-provisioner/csi-hostpathplugin-bb6wh" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.675716 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zj2p\" (UniqueName: \"kubernetes.io/projected/ba5682ea-6a62-4983-b525-5dc9612ad46d-kube-api-access-5zj2p\") pod \"csi-hostpathplugin-bb6wh\" (UID: \"ba5682ea-6a62-4983-b525-5dc9612ad46d\") " pod="hostpath-provisioner/csi-hostpathplugin-bb6wh" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.675741 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1f8d6149-c5b0-4088-9db5-eeed2eef6ce6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8gcm4\" (UID: \"1f8d6149-c5b0-4088-9db5-eeed2eef6ce6\") " pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.675765 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/546d4851-e1c7-418b-8ba6-5847e5f9efde-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-6bx5p\" (UID: \"546d4851-e1c7-418b-8ba6-5847e5f9efde\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.675789 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psgwk\" (UniqueName: \"kubernetes.io/projected/c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527-kube-api-access-psgwk\") pod \"etcd-operator-b45778765-rft5w\" (UID: \"c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rft5w" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.675812 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" 
(UniqueName: \"kubernetes.io/secret/027b1711-77a0-4359-bd98-246217fdb5f8-stats-auth\") pod \"router-default-5444994796-6m5gg\" (UID: \"027b1711-77a0-4359-bd98-246217fdb5f8\") " pod="openshift-ingress/router-default-5444994796-6m5gg" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.675837 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/027b1711-77a0-4359-bd98-246217fdb5f8-default-certificate\") pod \"router-default-5444994796-6m5gg\" (UID: \"027b1711-77a0-4359-bd98-246217fdb5f8\") " pod="openshift-ingress/router-default-5444994796-6m5gg" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.675869 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527-etcd-service-ca\") pod \"etcd-operator-b45778765-rft5w\" (UID: \"c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rft5w" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.675891 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2517fd5f-6a0b-4ab4-991d-41ac3c9bd0be-serving-cert\") pod \"openshift-config-operator-7777fb866f-6n4qc\" (UID: \"2517fd5f-6a0b-4ab4-991d-41ac3c9bd0be\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6n4qc" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.675942 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea-proxy-tls\") pod \"machine-config-operator-74547568cd-zdrwx\" (UID: \"74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdrwx" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 
12:00:23.675967 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/750d6f55-7cf7-4376-8ead-6d481db69c2d-srv-cert\") pod \"catalog-operator-68c6474976-256s6\" (UID: \"750d6f55-7cf7-4376-8ead-6d481db69c2d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-256s6" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676006 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxw9g\" (UniqueName: \"kubernetes.io/projected/546d4851-e1c7-418b-8ba6-5847e5f9efde-kube-api-access-sxw9g\") pod \"cni-sysctl-allowlist-ds-6bx5p\" (UID: \"546d4851-e1c7-418b-8ba6-5847e5f9efde\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676029 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/027b1711-77a0-4359-bd98-246217fdb5f8-service-ca-bundle\") pod \"router-default-5444994796-6m5gg\" (UID: \"027b1711-77a0-4359-bd98-246217fdb5f8\") " pod="openshift-ingress/router-default-5444994796-6m5gg" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676049 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86nkz\" (UniqueName: \"kubernetes.io/projected/1f8d6149-c5b0-4088-9db5-eeed2eef6ce6-kube-api-access-86nkz\") pod \"marketplace-operator-79b997595-8gcm4\" (UID: \"1f8d6149-c5b0-4088-9db5-eeed2eef6ce6\") " pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676095 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/db49f265-44d3-468b-8e2f-2246b02b57be-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-ksjm4\" (UID: \"db49f265-44d3-468b-8e2f-2246b02b57be\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ksjm4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676123 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr7ks\" (UniqueName: \"kubernetes.io/projected/2eaac3e7-6f80-47da-a6c7-e415a0b8edbd-kube-api-access-gr7ks\") pod \"migrator-59844c95c7-4kd2n\" (UID: \"2eaac3e7-6f80-47da-a6c7-e415a0b8edbd\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4kd2n" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676145 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89cbz\" (UniqueName: \"kubernetes.io/projected/0acb833f-163a-47e1-8fb7-b9bc97b81fe1-kube-api-access-89cbz\") pod \"service-ca-9c57cc56f-tqt25\" (UID: \"0acb833f-163a-47e1-8fb7-b9bc97b81fe1\") " pod="openshift-service-ca/service-ca-9c57cc56f-tqt25" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676168 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22d5j\" (UniqueName: \"kubernetes.io/projected/4ea4ac0c-25f8-4dab-ade6-372cd0ec83d0-kube-api-access-22d5j\") pod \"olm-operator-6b444d44fb-9znd7\" (UID: \"4ea4ac0c-25f8-4dab-ade6-372cd0ec83d0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9znd7" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676190 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1a17171b-c738-4862-a2a0-cbb09219322a-metrics-tls\") pod \"dns-default-wgxgk\" (UID: \"1a17171b-c738-4862-a2a0-cbb09219322a\") " pod="openshift-dns/dns-default-wgxgk" 
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676278 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9a7e3709-d407-4679-add6-375a835421be-bound-sa-token\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676303 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b9c2804-ee65-4a09-9985-d2345aa7f82a-config\") pod \"kube-controller-manager-operator-78b949d7b-t6j7t\" (UID: \"4b9c2804-ee65-4a09-9985-d2345aa7f82a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-t6j7t" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676324 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxdsn\" (UniqueName: \"kubernetes.io/projected/4b4119d5-f1a1-4d09-83c6-da7decba9ab4-kube-api-access-nxdsn\") pod \"machine-config-controller-84d6567774-gm7t5\" (UID: \"4b4119d5-f1a1-4d09-83c6-da7decba9ab4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gm7t5" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676349 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bb999b74-ac20-4e84-b2c7-b16906afbf06-metrics-tls\") pod \"dns-operator-744455d44c-mzkr9\" (UID: \"bb999b74-ac20-4e84-b2c7-b16906afbf06\") " pod="openshift-dns-operator/dns-operator-744455d44c-mzkr9" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676373 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0503425c-595f-4ff5-a7eb-c73168d939d5-serving-cert\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-tgbrn\" (UID: \"0503425c-595f-4ff5-a7eb-c73168d939d5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tgbrn" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676410 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/da3678d7-b440-44bd-b73b-2b04f1225094-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-qwqzv\" (UID: \"da3678d7-b440-44bd-b73b-2b04f1225094\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qwqzv" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676431 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/efc988f7-8a1a-4d22-b6bb-b2617c721017-console-serving-cert\") pod \"console-f9d7485db-blgl4\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " pod="openshift-console/console-f9d7485db-blgl4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676455 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9a7e3709-d407-4679-add6-375a835421be-installation-pull-secrets\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676478 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/da3678d7-b440-44bd-b73b-2b04f1225094-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-qwqzv\" (UID: \"da3678d7-b440-44bd-b73b-2b04f1225094\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qwqzv" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 
12:00:23.676499 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a782b5b-9eac-4b5b-8ca8-751111b2459b-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-x96fz\" (UID: \"9a782b5b-9eac-4b5b-8ca8-751111b2459b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x96fz" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676526 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b7e0b0c2-39e9-4aa5-934b-01abfe80d224-trusted-ca\") pod \"console-operator-58897d9998-fxsjj\" (UID: \"b7e0b0c2-39e9-4aa5-934b-01abfe80d224\") " pod="openshift-console-operator/console-operator-58897d9998-fxsjj" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676555 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9a7e3709-d407-4679-add6-375a835421be-ca-trust-extracted\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676579 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/680978cb-e609-4292-827f-cc8a5b9c1438-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-zln7t\" (UID: \"680978cb-e609-4292-827f-cc8a5b9c1438\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zln7t" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676602 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/efc988f7-8a1a-4d22-b6bb-b2617c721017-oauth-serving-cert\") pod 
\"console-f9d7485db-blgl4\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " pod="openshift-console/console-f9d7485db-blgl4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676625 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rng4c\" (UniqueName: \"kubernetes.io/projected/b7e0b0c2-39e9-4aa5-934b-01abfe80d224-kube-api-access-rng4c\") pod \"console-operator-58897d9998-fxsjj\" (UID: \"b7e0b0c2-39e9-4aa5-934b-01abfe80d224\") " pod="openshift-console-operator/console-operator-58897d9998-fxsjj" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676661 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7mdm\" (UniqueName: \"kubernetes.io/projected/da3678d7-b440-44bd-b73b-2b04f1225094-kube-api-access-p7mdm\") pod \"cluster-image-registry-operator-dc59b4c8b-qwqzv\" (UID: \"da3678d7-b440-44bd-b73b-2b04f1225094\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qwqzv" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676686 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/fd35e4e1-eb63-44a5-a8e3-376a87c20de2-node-bootstrap-token\") pod \"machine-config-server-mws5d\" (UID: \"fd35e4e1-eb63-44a5-a8e3-376a87c20de2\") " pod="openshift-machine-config-operator/machine-config-server-mws5d" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676708 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n5p6\" (UniqueName: \"kubernetes.io/projected/7aff6a5d-2a66-4ab5-ad53-878f5fea4115-kube-api-access-6n5p6\") pod \"package-server-manager-789f6589d5-6t4jp\" (UID: \"7aff6a5d-2a66-4ab5-ad53-878f5fea4115\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6t4jp" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676732 4816 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527-serving-cert\") pod \"etcd-operator-b45778765-rft5w\" (UID: \"c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rft5w" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676754 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7wqx\" (UniqueName: \"kubernetes.io/projected/750d6f55-7cf7-4376-8ead-6d481db69c2d-kube-api-access-g7wqx\") pod \"catalog-operator-68c6474976-256s6\" (UID: \"750d6f55-7cf7-4376-8ead-6d481db69c2d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-256s6" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676795 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/efc988f7-8a1a-4d22-b6bb-b2617c721017-service-ca\") pod \"console-f9d7485db-blgl4\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " pod="openshift-console/console-f9d7485db-blgl4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676841 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6qxn\" (UniqueName: \"kubernetes.io/projected/fd35e4e1-eb63-44a5-a8e3-376a87c20de2-kube-api-access-g6qxn\") pod \"machine-config-server-mws5d\" (UID: \"fd35e4e1-eb63-44a5-a8e3-376a87c20de2\") " pod="openshift-machine-config-operator/machine-config-server-mws5d" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676862 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4ea4ac0c-25f8-4dab-ade6-372cd0ec83d0-srv-cert\") pod \"olm-operator-6b444d44fb-9znd7\" (UID: \"4ea4ac0c-25f8-4dab-ade6-372cd0ec83d0\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9znd7" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676883 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67dd48ce-6361-442d-9552-f06346e4d8d4-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ksn6f\" (UID: \"67dd48ce-6361-442d-9552-f06346e4d8d4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ksn6f" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676911 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9a7e3709-d407-4679-add6-375a835421be-registry-tls\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676934 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3c040a86-9614-48cb-9df7-14c83b046dce-secret-volume\") pod \"collect-profiles-29553840-xpb52\" (UID: \"3c040a86-9614-48cb-9df7-14c83b046dce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553840-xpb52" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676959 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea-auth-proxy-config\") pod \"machine-config-operator-74547568cd-zdrwx\" (UID: \"74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdrwx" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676981 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/ba5682ea-6a62-4983-b525-5dc9612ad46d-plugins-dir\") pod \"csi-hostpathplugin-bb6wh\" (UID: \"ba5682ea-6a62-4983-b525-5dc9612ad46d\") " pod="hostpath-provisioner/csi-hostpathplugin-bb6wh" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677002 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/36999e5d-2e84-4f16-8c9f-4a2c40a34cd4-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-k74wh\" (UID: \"36999e5d-2e84-4f16-8c9f-4a2c40a34cd4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k74wh" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677043 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnlqr\" (UniqueName: \"kubernetes.io/projected/db49f265-44d3-468b-8e2f-2246b02b57be-kube-api-access-nnlqr\") pod \"control-plane-machine-set-operator-78cbb6b69f-ksjm4\" (UID: \"db49f265-44d3-468b-8e2f-2246b02b57be\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ksjm4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677069 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527-etcd-client\") pod \"etcd-operator-b45778765-rft5w\" (UID: \"c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rft5w" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677090 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36999e5d-2e84-4f16-8c9f-4a2c40a34cd4-config\") pod \"kube-apiserver-operator-766d6c64bb-k74wh\" (UID: \"36999e5d-2e84-4f16-8c9f-4a2c40a34cd4\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k74wh" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677123 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ba5682ea-6a62-4983-b525-5dc9612ad46d-registration-dir\") pod \"csi-hostpathplugin-bb6wh\" (UID: \"ba5682ea-6a62-4983-b525-5dc9612ad46d\") " pod="hostpath-provisioner/csi-hostpathplugin-bb6wh" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677147 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjg5g\" (UniqueName: \"kubernetes.io/projected/027b1711-77a0-4359-bd98-246217fdb5f8-kube-api-access-zjg5g\") pod \"router-default-5444994796-6m5gg\" (UID: \"027b1711-77a0-4359-bd98-246217fdb5f8\") " pod="openshift-ingress/router-default-5444994796-6m5gg" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677168 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/67dd48ce-6361-442d-9552-f06346e4d8d4-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ksn6f\" (UID: \"67dd48ce-6361-442d-9552-f06346e4d8d4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ksn6f" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677191 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9a7e3709-d407-4679-add6-375a835421be-registry-certificates\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677214 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"images\" (UniqueName: \"kubernetes.io/configmap/74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea-images\") pod \"machine-config-operator-74547568cd-zdrwx\" (UID: \"74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdrwx" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677237 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4b4119d5-f1a1-4d09-83c6-da7decba9ab4-proxy-tls\") pod \"machine-config-controller-84d6567774-gm7t5\" (UID: \"4b4119d5-f1a1-4d09-83c6-da7decba9ab4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gm7t5" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677330 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7e0b0c2-39e9-4aa5-934b-01abfe80d224-serving-cert\") pod \"console-operator-58897d9998-fxsjj\" (UID: \"b7e0b0c2-39e9-4aa5-934b-01abfe80d224\") " pod="openshift-console-operator/console-operator-58897d9998-fxsjj" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677353 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7qfb\" (UniqueName: \"kubernetes.io/projected/74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea-kube-api-access-z7qfb\") pod \"machine-config-operator-74547568cd-zdrwx\" (UID: \"74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdrwx" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677375 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0acb833f-163a-47e1-8fb7-b9bc97b81fe1-signing-cabundle\") pod \"service-ca-9c57cc56f-tqt25\" (UID: \"0acb833f-163a-47e1-8fb7-b9bc97b81fe1\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-tqt25" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677395 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/00d6d506-7c84-4fef-9dc9-85f855533c06-webhook-cert\") pod \"packageserver-d55dfcdfc-vll2h\" (UID: \"00d6d506-7c84-4fef-9dc9-85f855533c06\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vll2h" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677423 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9a7e3709-d407-4679-add6-375a835421be-trusted-ca\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677448 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a782b5b-9eac-4b5b-8ca8-751111b2459b-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-x96fz\" (UID: \"9a782b5b-9eac-4b5b-8ca8-751111b2459b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x96fz" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677471 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghclh\" (UniqueName: \"kubernetes.io/projected/1b74d12c-0a8c-48b1-9931-950ea6e20d4a-kube-api-access-ghclh\") pod \"ingress-canary-2ltv9\" (UID: \"1b74d12c-0a8c-48b1-9931-950ea6e20d4a\") " pod="openshift-ingress-canary/ingress-canary-2ltv9" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677512 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/da3678d7-b440-44bd-b73b-2b04f1225094-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-qwqzv\" (UID: \"da3678d7-b440-44bd-b73b-2b04f1225094\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qwqzv" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677541 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qwwd\" (UniqueName: \"kubernetes.io/projected/9a7e3709-d407-4679-add6-375a835421be-kube-api-access-6qwwd\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677567 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2517fd5f-6a0b-4ab4-991d-41ac3c9bd0be-available-featuregates\") pod \"openshift-config-operator-7777fb866f-6n4qc\" (UID: \"2517fd5f-6a0b-4ab4-991d-41ac3c9bd0be\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6n4qc" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677593 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfq9m\" (UniqueName: \"kubernetes.io/projected/0503425c-595f-4ff5-a7eb-c73168d939d5-kube-api-access-lfq9m\") pod \"openshift-controller-manager-operator-756b6f6bc6-tgbrn\" (UID: \"0503425c-595f-4ff5-a7eb-c73168d939d5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tgbrn" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677619 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3c040a86-9614-48cb-9df7-14c83b046dce-config-volume\") pod \"collect-profiles-29553840-xpb52\" (UID: \"3c040a86-9614-48cb-9df7-14c83b046dce\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29553840-xpb52" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677657 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdnwz\" (UniqueName: \"kubernetes.io/projected/2517fd5f-6a0b-4ab4-991d-41ac3c9bd0be-kube-api-access-qdnwz\") pod \"openshift-config-operator-7777fb866f-6n4qc\" (UID: \"2517fd5f-6a0b-4ab4-991d-41ac3c9bd0be\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6n4qc" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677695 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dlk7\" (UniqueName: \"kubernetes.io/projected/bb999b74-ac20-4e84-b2c7-b16906afbf06-kube-api-access-9dlk7\") pod \"dns-operator-744455d44c-mzkr9\" (UID: \"bb999b74-ac20-4e84-b2c7-b16906afbf06\") " pod="openshift-dns-operator/dns-operator-744455d44c-mzkr9" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677722 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7e0b0c2-39e9-4aa5-934b-01abfe80d224-config\") pod \"console-operator-58897d9998-fxsjj\" (UID: \"b7e0b0c2-39e9-4aa5-934b-01abfe80d224\") " pod="openshift-console-operator/console-operator-58897d9998-fxsjj" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677747 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nngrf\" (UniqueName: \"kubernetes.io/projected/efc988f7-8a1a-4d22-b6bb-b2617c721017-kube-api-access-nngrf\") pod \"console-f9d7485db-blgl4\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " pod="openshift-console/console-f9d7485db-blgl4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677770 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/4b9c2804-ee65-4a09-9985-d2345aa7f82a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-t6j7t\" (UID: \"4b9c2804-ee65-4a09-9985-d2345aa7f82a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-t6j7t" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677793 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0503425c-595f-4ff5-a7eb-c73168d939d5-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-tgbrn\" (UID: \"0503425c-595f-4ff5-a7eb-c73168d939d5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tgbrn" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677827 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/750d6f55-7cf7-4376-8ead-6d481db69c2d-profile-collector-cert\") pod \"catalog-operator-68c6474976-256s6\" (UID: \"750d6f55-7cf7-4376-8ead-6d481db69c2d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-256s6" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677850 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1f8d6149-c5b0-4088-9db5-eeed2eef6ce6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8gcm4\" (UID: \"1f8d6149-c5b0-4088-9db5-eeed2eef6ce6\") " pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677890 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527-config\") pod \"etcd-operator-b45778765-rft5w\" (UID: \"c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-rft5w" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677914 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/546d4851-e1c7-418b-8ba6-5847e5f9efde-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-6bx5p\" (UID: \"546d4851-e1c7-418b-8ba6-5847e5f9efde\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677952 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/00d6d506-7c84-4fef-9dc9-85f855533c06-tmpfs\") pod \"packageserver-d55dfcdfc-vll2h\" (UID: \"00d6d506-7c84-4fef-9dc9-85f855533c06\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vll2h" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677973 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/027b1711-77a0-4359-bd98-246217fdb5f8-metrics-certs\") pod \"router-default-5444994796-6m5gg\" (UID: \"027b1711-77a0-4359-bd98-246217fdb5f8\") " pod="openshift-ingress/router-default-5444994796-6m5gg" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677995 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgkcr\" (UniqueName: \"kubernetes.io/projected/680978cb-e609-4292-827f-cc8a5b9c1438-kube-api-access-bgkcr\") pod \"multus-admission-controller-857f4d67dd-zln7t\" (UID: \"680978cb-e609-4292-827f-cc8a5b9c1438\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zln7t" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.678018 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/0acb833f-163a-47e1-8fb7-b9bc97b81fe1-signing-key\") pod \"service-ca-9c57cc56f-tqt25\" (UID: \"0acb833f-163a-47e1-8fb7-b9bc97b81fe1\") " pod="openshift-service-ca/service-ca-9c57cc56f-tqt25" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.678040 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/546d4851-e1c7-418b-8ba6-5847e5f9efde-ready\") pod \"cni-sysctl-allowlist-ds-6bx5p\" (UID: \"546d4851-e1c7-418b-8ba6-5847e5f9efde\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.678061 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4ea4ac0c-25f8-4dab-ade6-372cd0ec83d0-profile-collector-cert\") pod \"olm-operator-6b444d44fb-9znd7\" (UID: \"4ea4ac0c-25f8-4dab-ade6-372cd0ec83d0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9znd7" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.678113 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527-etcd-ca\") pod \"etcd-operator-b45778765-rft5w\" (UID: \"c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rft5w" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.678138 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hbdh\" (UniqueName: \"kubernetes.io/projected/3c040a86-9614-48cb-9df7-14c83b046dce-kube-api-access-9hbdh\") pod \"collect-profiles-29553840-xpb52\" (UID: \"3c040a86-9614-48cb-9df7-14c83b046dce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553840-xpb52" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.678162 4816 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67dd48ce-6361-442d-9552-f06346e4d8d4-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ksn6f\" (UID: \"67dd48ce-6361-442d-9552-f06346e4d8d4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ksn6f" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.678214 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/ba5682ea-6a62-4983-b525-5dc9612ad46d-csi-data-dir\") pod \"csi-hostpathplugin-bb6wh\" (UID: \"ba5682ea-6a62-4983-b525-5dc9612ad46d\") " pod="hostpath-provisioner/csi-hostpathplugin-bb6wh" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.678239 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9nvp\" (UniqueName: \"kubernetes.io/projected/1a17171b-c738-4862-a2a0-cbb09219322a-kube-api-access-q9nvp\") pod \"dns-default-wgxgk\" (UID: \"1a17171b-c738-4862-a2a0-cbb09219322a\") " pod="openshift-dns/dns-default-wgxgk" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.678285 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/ba5682ea-6a62-4983-b525-5dc9612ad46d-mountpoint-dir\") pod \"csi-hostpathplugin-bb6wh\" (UID: \"ba5682ea-6a62-4983-b525-5dc9612ad46d\") " pod="hostpath-provisioner/csi-hostpathplugin-bb6wh" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.678321 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7aff6a5d-2a66-4ab5-ad53-878f5fea4115-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-6t4jp\" (UID: 
\"7aff6a5d-2a66-4ab5-ad53-878f5fea4115\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6t4jp" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.678366 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/efc988f7-8a1a-4d22-b6bb-b2617c721017-console-oauth-config\") pod \"console-f9d7485db-blgl4\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " pod="openshift-console/console-f9d7485db-blgl4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.678392 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/efc988f7-8a1a-4d22-b6bb-b2617c721017-trusted-ca-bundle\") pod \"console-f9d7485db-blgl4\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " pod="openshift-console/console-f9d7485db-blgl4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.678946 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/efc988f7-8a1a-4d22-b6bb-b2617c721017-console-config\") pod \"console-f9d7485db-blgl4\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " pod="openshift-console/console-f9d7485db-blgl4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.680341 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9a7e3709-d407-4679-add6-375a835421be-trusted-ca\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.681442 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527-etcd-service-ca\") pod \"etcd-operator-b45778765-rft5w\" (UID: 
\"c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rft5w" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.681851 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9a7e3709-d407-4679-add6-375a835421be-registry-certificates\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.682333 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bb999b74-ac20-4e84-b2c7-b16906afbf06-metrics-tls\") pod \"dns-operator-744455d44c-mzkr9\" (UID: \"bb999b74-ac20-4e84-b2c7-b16906afbf06\") " pod="openshift-dns-operator/dns-operator-744455d44c-mzkr9" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.682505 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527-etcd-ca\") pod \"etcd-operator-b45778765-rft5w\" (UID: \"c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rft5w" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.683140 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/da3678d7-b440-44bd-b73b-2b04f1225094-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-qwqzv\" (UID: \"da3678d7-b440-44bd-b73b-2b04f1225094\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qwqzv" Mar 11 12:00:23 crc kubenswrapper[4816]: E0311 12:00:23.683205 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-11 12:00:24.183174053 +0000 UTC m=+110.774438110 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.683537 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2517fd5f-6a0b-4ab4-991d-41ac3c9bd0be-available-featuregates\") pod \"openshift-config-operator-7777fb866f-6n4qc\" (UID: \"2517fd5f-6a0b-4ab4-991d-41ac3c9bd0be\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6n4qc" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.684427 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7e0b0c2-39e9-4aa5-934b-01abfe80d224-config\") pod \"console-operator-58897d9998-fxsjj\" (UID: \"b7e0b0c2-39e9-4aa5-934b-01abfe80d224\") " pod="openshift-console-operator/console-operator-58897d9998-fxsjj" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.685067 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0503425c-595f-4ff5-a7eb-c73168d939d5-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-tgbrn\" (UID: \"0503425c-595f-4ff5-a7eb-c73168d939d5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tgbrn" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.685792 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527-config\") pod \"etcd-operator-b45778765-rft5w\" (UID: \"c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rft5w" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.687049 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b7e0b0c2-39e9-4aa5-934b-01abfe80d224-trusted-ca\") pod \"console-operator-58897d9998-fxsjj\" (UID: \"b7e0b0c2-39e9-4aa5-934b-01abfe80d224\") " pod="openshift-console-operator/console-operator-58897d9998-fxsjj" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.687118 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/efc988f7-8a1a-4d22-b6bb-b2617c721017-trusted-ca-bundle\") pod \"console-f9d7485db-blgl4\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " pod="openshift-console/console-f9d7485db-blgl4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.687977 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527-etcd-client\") pod \"etcd-operator-b45778765-rft5w\" (UID: \"c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rft5w" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.688289 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/efc988f7-8a1a-4d22-b6bb-b2617c721017-console-oauth-config\") pod \"console-f9d7485db-blgl4\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " pod="openshift-console/console-f9d7485db-blgl4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.688731 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/efc988f7-8a1a-4d22-b6bb-b2617c721017-oauth-serving-cert\") pod \"console-f9d7485db-blgl4\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " pod="openshift-console/console-f9d7485db-blgl4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.688935 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9a7e3709-d407-4679-add6-375a835421be-ca-trust-extracted\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.689293 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0503425c-595f-4ff5-a7eb-c73168d939d5-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-tgbrn\" (UID: \"0503425c-595f-4ff5-a7eb-c73168d939d5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tgbrn" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.689556 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7e0b0c2-39e9-4aa5-934b-01abfe80d224-serving-cert\") pod \"console-operator-58897d9998-fxsjj\" (UID: \"b7e0b0c2-39e9-4aa5-934b-01abfe80d224\") " pod="openshift-console-operator/console-operator-58897d9998-fxsjj" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.689677 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/efc988f7-8a1a-4d22-b6bb-b2617c721017-service-ca\") pod \"console-f9d7485db-blgl4\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " pod="openshift-console/console-f9d7485db-blgl4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.689867 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527-serving-cert\") pod \"etcd-operator-b45778765-rft5w\" (UID: \"c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rft5w" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.691740 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9a7e3709-d407-4679-add6-375a835421be-installation-pull-secrets\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.692834 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/efc988f7-8a1a-4d22-b6bb-b2617c721017-console-serving-cert\") pod \"console-f9d7485db-blgl4\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " pod="openshift-console/console-f9d7485db-blgl4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.693536 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/da3678d7-b440-44bd-b73b-2b04f1225094-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-qwqzv\" (UID: \"da3678d7-b440-44bd-b73b-2b04f1225094\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qwqzv" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.695868 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9a7e3709-d407-4679-add6-375a835421be-registry-tls\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.697728 4816 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2517fd5f-6a0b-4ab4-991d-41ac3c9bd0be-serving-cert\") pod \"openshift-config-operator-7777fb866f-6n4qc\" (UID: \"2517fd5f-6a0b-4ab4-991d-41ac3c9bd0be\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6n4qc" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.708448 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psgwk\" (UniqueName: \"kubernetes.io/projected/c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527-kube-api-access-psgwk\") pod \"etcd-operator-b45778765-rft5w\" (UID: \"c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rft5w" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.727818 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9a7e3709-d407-4679-add6-375a835421be-bound-sa-token\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.759591 4816 generic.go:334] "Generic (PLEG): container finished" podID="3af1f0c3-1a92-49f9-beec-dff95561c5dd" containerID="8cfbeb80eec0131a6c2a8dc0fdd78c6711bab4e499b7b7166380c7f4003724de" exitCode=0 Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.759843 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" event={"ID":"3af1f0c3-1a92-49f9-beec-dff95561c5dd","Type":"ContainerDied","Data":"8cfbeb80eec0131a6c2a8dc0fdd78c6711bab4e499b7b7166380c7f4003724de"} Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.759883 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" 
event={"ID":"3af1f0c3-1a92-49f9-beec-dff95561c5dd","Type":"ContainerStarted","Data":"eb951600f906e521d2c215a58ecae086e8409bd458dc9b1ba7e747be45d886e0"} Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.762746 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr" event={"ID":"ef1d29fc-f278-4f20-8362-3c406634d8ff","Type":"ContainerStarted","Data":"067208dd2d05a8f631081581262fd02e620d3152bca9ba1e74aa403cc3cbbfd1"} Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.762789 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr" event={"ID":"ef1d29fc-f278-4f20-8362-3c406634d8ff","Type":"ContainerStarted","Data":"095fa56b3beb4f734f86a3746d97623146bfffe930c63a78d60c59c578ed0242"} Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.762969 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.766702 4816 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-cdscr container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.766748 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr" podUID="ef1d29fc-f278-4f20-8362-3c406634d8ff" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.767432 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/downloads-7954f5f757-dh658" event={"ID":"8c843417-3e01-48f9-b0b6-845fbbbf7eab","Type":"ContainerStarted","Data":"7d4edc05806ccc7dd99c5bfe1808a0dd4314990cfad0ea42ace041972c048777"} Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.767480 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-dh658" event={"ID":"8c843417-3e01-48f9-b0b6-845fbbbf7eab","Type":"ContainerStarted","Data":"db4a27d14ba72b2bcb597e8b4ff67b1e635ed33d4c53964f9c2bd5f7226df206"} Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.768027 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-dh658" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.770021 4816 patch_prober.go:28] interesting pod/downloads-7954f5f757-dh658 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.770069 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dh658" podUID="8c843417-3e01-48f9-b0b6-845fbbbf7eab" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.771562 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-rft5w" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.772563 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x5fc4" event={"ID":"dbdb4690-7503-43ee-9e26-34af04f30235","Type":"ContainerStarted","Data":"20f258f941006420dfef84a373d133fce72f2dc844e52d53a66f60e96f528fab"} Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.774236 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-nv429" event={"ID":"564c2921-e9eb-4a24-a5b7-1a8471d1586b","Type":"ContainerStarted","Data":"35536e0f12f0b360de404d447220e629a214cf40c465fa086e81ea108295ac6b"} Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.774327 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-nv429" event={"ID":"564c2921-e9eb-4a24-a5b7-1a8471d1586b","Type":"ContainerStarted","Data":"306382581adac0ac9b7eb96a682fee969c6c0324fd34514acd435886ca5bcb46"} Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.774519 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-nv429" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.775825 4816 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-nv429 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.775852 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rxljr" 
event={"ID":"f66d48af-027e-448b-9897-9f0c62fbd6c0","Type":"ContainerStarted","Data":"1df3921167d38bf995bc22e8726ca8c5b61c735e26979c6e24183fba2992b175"} Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.775879 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-nv429" podUID="564c2921-e9eb-4a24-a5b7-1a8471d1586b" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.778501 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-t5t6b" event={"ID":"cf7eaa86-2d32-4321-9016-e785320de3e2","Type":"ContainerStarted","Data":"0ca81d83bc445fe476353eaa69afe132c69988440474ffaf798ef0516dc80c8d"} Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.778531 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-t5t6b" event={"ID":"cf7eaa86-2d32-4321-9016-e785320de3e2","Type":"ContainerStarted","Data":"4f20efb5ad790d2fc91aac2f36f1b7923395b253a8e689483cd2fcfc5b686b03"} Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.778543 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-t5t6b" event={"ID":"cf7eaa86-2d32-4321-9016-e785320de3e2","Type":"ContainerStarted","Data":"c8622f1138809cd8e8f26d8decac12051c54c9c0662dfbda911346d18513e58e"} Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.778582 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dlk7\" (UniqueName: \"kubernetes.io/projected/bb999b74-ac20-4e84-b2c7-b16906afbf06-kube-api-access-9dlk7\") pod \"dns-operator-744455d44c-mzkr9\" (UID: \"bb999b74-ac20-4e84-b2c7-b16906afbf06\") " pod="openshift-dns-operator/dns-operator-744455d44c-mzkr9" Mar 11 12:00:23 crc 
kubenswrapper[4816]: I0311 12:00:23.779016 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22d5j\" (UniqueName: \"kubernetes.io/projected/4ea4ac0c-25f8-4dab-ade6-372cd0ec83d0-kube-api-access-22d5j\") pod \"olm-operator-6b444d44fb-9znd7\" (UID: \"4ea4ac0c-25f8-4dab-ade6-372cd0ec83d0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9znd7" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.779071 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1a17171b-c738-4862-a2a0-cbb09219322a-metrics-tls\") pod \"dns-default-wgxgk\" (UID: \"1a17171b-c738-4862-a2a0-cbb09219322a\") " pod="openshift-dns/dns-default-wgxgk" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.779104 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b9c2804-ee65-4a09-9985-d2345aa7f82a-config\") pod \"kube-controller-manager-operator-78b949d7b-t6j7t\" (UID: \"4b9c2804-ee65-4a09-9985-d2345aa7f82a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-t6j7t" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.779168 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxdsn\" (UniqueName: \"kubernetes.io/projected/4b4119d5-f1a1-4d09-83c6-da7decba9ab4-kube-api-access-nxdsn\") pod \"machine-config-controller-84d6567774-gm7t5\" (UID: \"4b4119d5-f1a1-4d09-83c6-da7decba9ab4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gm7t5" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.779236 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a782b5b-9eac-4b5b-8ca8-751111b2459b-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-x96fz\" (UID: 
\"9a782b5b-9eac-4b5b-8ca8-751111b2459b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x96fz" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.779301 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/680978cb-e609-4292-827f-cc8a5b9c1438-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-zln7t\" (UID: \"680978cb-e609-4292-827f-cc8a5b9c1438\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zln7t" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.779345 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/fd35e4e1-eb63-44a5-a8e3-376a87c20de2-node-bootstrap-token\") pod \"machine-config-server-mws5d\" (UID: \"fd35e4e1-eb63-44a5-a8e3-376a87c20de2\") " pod="openshift-machine-config-operator/machine-config-server-mws5d" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.779378 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n5p6\" (UniqueName: \"kubernetes.io/projected/7aff6a5d-2a66-4ab5-ad53-878f5fea4115-kube-api-access-6n5p6\") pod \"package-server-manager-789f6589d5-6t4jp\" (UID: \"7aff6a5d-2a66-4ab5-ad53-878f5fea4115\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6t4jp" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.779426 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7wqx\" (UniqueName: \"kubernetes.io/projected/750d6f55-7cf7-4376-8ead-6d481db69c2d-kube-api-access-g7wqx\") pod \"catalog-operator-68c6474976-256s6\" (UID: \"750d6f55-7cf7-4376-8ead-6d481db69c2d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-256s6" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.779477 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67dd48ce-6361-442d-9552-f06346e4d8d4-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ksn6f\" (UID: \"67dd48ce-6361-442d-9552-f06346e4d8d4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ksn6f" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.779513 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6qxn\" (UniqueName: \"kubernetes.io/projected/fd35e4e1-eb63-44a5-a8e3-376a87c20de2-kube-api-access-g6qxn\") pod \"machine-config-server-mws5d\" (UID: \"fd35e4e1-eb63-44a5-a8e3-376a87c20de2\") " pod="openshift-machine-config-operator/machine-config-server-mws5d" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.779543 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4ea4ac0c-25f8-4dab-ade6-372cd0ec83d0-srv-cert\") pod \"olm-operator-6b444d44fb-9znd7\" (UID: \"4ea4ac0c-25f8-4dab-ade6-372cd0ec83d0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9znd7" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.779586 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3c040a86-9614-48cb-9df7-14c83b046dce-secret-volume\") pod \"collect-profiles-29553840-xpb52\" (UID: \"3c040a86-9614-48cb-9df7-14c83b046dce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553840-xpb52" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.779621 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnlqr\" (UniqueName: \"kubernetes.io/projected/db49f265-44d3-468b-8e2f-2246b02b57be-kube-api-access-nnlqr\") pod \"control-plane-machine-set-operator-78cbb6b69f-ksjm4\" (UID: \"db49f265-44d3-468b-8e2f-2246b02b57be\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ksjm4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.779658 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea-auth-proxy-config\") pod \"machine-config-operator-74547568cd-zdrwx\" (UID: \"74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdrwx" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.779691 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/ba5682ea-6a62-4983-b525-5dc9612ad46d-plugins-dir\") pod \"csi-hostpathplugin-bb6wh\" (UID: \"ba5682ea-6a62-4983-b525-5dc9612ad46d\") " pod="hostpath-provisioner/csi-hostpathplugin-bb6wh" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.779728 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/36999e5d-2e84-4f16-8c9f-4a2c40a34cd4-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-k74wh\" (UID: \"36999e5d-2e84-4f16-8c9f-4a2c40a34cd4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k74wh" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.779784 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36999e5d-2e84-4f16-8c9f-4a2c40a34cd4-config\") pod \"kube-apiserver-operator-766d6c64bb-k74wh\" (UID: \"36999e5d-2e84-4f16-8c9f-4a2c40a34cd4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k74wh" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.779817 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/ba5682ea-6a62-4983-b525-5dc9612ad46d-registration-dir\") pod \"csi-hostpathplugin-bb6wh\" (UID: \"ba5682ea-6a62-4983-b525-5dc9612ad46d\") " pod="hostpath-provisioner/csi-hostpathplugin-bb6wh" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.779851 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjg5g\" (UniqueName: \"kubernetes.io/projected/027b1711-77a0-4359-bd98-246217fdb5f8-kube-api-access-zjg5g\") pod \"router-default-5444994796-6m5gg\" (UID: \"027b1711-77a0-4359-bd98-246217fdb5f8\") " pod="openshift-ingress/router-default-5444994796-6m5gg" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.779860 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b9c2804-ee65-4a09-9985-d2345aa7f82a-config\") pod \"kube-controller-manager-operator-78b949d7b-t6j7t\" (UID: \"4b9c2804-ee65-4a09-9985-d2345aa7f82a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-t6j7t" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.779885 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/67dd48ce-6361-442d-9552-f06346e4d8d4-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ksn6f\" (UID: \"67dd48ce-6361-442d-9552-f06346e4d8d4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ksn6f" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.779934 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea-images\") pod \"machine-config-operator-74547568cd-zdrwx\" (UID: \"74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdrwx" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 
12:00:23.779968 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4b4119d5-f1a1-4d09-83c6-da7decba9ab4-proxy-tls\") pod \"machine-config-controller-84d6567774-gm7t5\" (UID: \"4b4119d5-f1a1-4d09-83c6-da7decba9ab4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gm7t5" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780069 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0acb833f-163a-47e1-8fb7-b9bc97b81fe1-signing-cabundle\") pod \"service-ca-9c57cc56f-tqt25\" (UID: \"0acb833f-163a-47e1-8fb7-b9bc97b81fe1\") " pod="openshift-service-ca/service-ca-9c57cc56f-tqt25" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780110 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/00d6d506-7c84-4fef-9dc9-85f855533c06-webhook-cert\") pod \"packageserver-d55dfcdfc-vll2h\" (UID: \"00d6d506-7c84-4fef-9dc9-85f855533c06\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vll2h" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780147 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7qfb\" (UniqueName: \"kubernetes.io/projected/74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea-kube-api-access-z7qfb\") pod \"machine-config-operator-74547568cd-zdrwx\" (UID: \"74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdrwx" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780180 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghclh\" (UniqueName: \"kubernetes.io/projected/1b74d12c-0a8c-48b1-9931-950ea6e20d4a-kube-api-access-ghclh\") pod \"ingress-canary-2ltv9\" (UID: \"1b74d12c-0a8c-48b1-9931-950ea6e20d4a\") " 
pod="openshift-ingress-canary/ingress-canary-2ltv9" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780212 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a782b5b-9eac-4b5b-8ca8-751111b2459b-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-x96fz\" (UID: \"9a782b5b-9eac-4b5b-8ca8-751111b2459b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x96fz" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780286 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3c040a86-9614-48cb-9df7-14c83b046dce-config-volume\") pod \"collect-profiles-29553840-xpb52\" (UID: \"3c040a86-9614-48cb-9df7-14c83b046dce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553840-xpb52" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780333 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b9c2804-ee65-4a09-9985-d2345aa7f82a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-t6j7t\" (UID: \"4b9c2804-ee65-4a09-9985-d2345aa7f82a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-t6j7t" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780387 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/750d6f55-7cf7-4376-8ead-6d481db69c2d-profile-collector-cert\") pod \"catalog-operator-68c6474976-256s6\" (UID: \"750d6f55-7cf7-4376-8ead-6d481db69c2d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-256s6" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780415 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1f8d6149-c5b0-4088-9db5-eeed2eef6ce6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8gcm4\" (UID: \"1f8d6149-c5b0-4088-9db5-eeed2eef6ce6\") " pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780446 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/546d4851-e1c7-418b-8ba6-5847e5f9efde-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-6bx5p\" (UID: \"546d4851-e1c7-418b-8ba6-5847e5f9efde\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780476 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgkcr\" (UniqueName: \"kubernetes.io/projected/680978cb-e609-4292-827f-cc8a5b9c1438-kube-api-access-bgkcr\") pod \"multus-admission-controller-857f4d67dd-zln7t\" (UID: \"680978cb-e609-4292-827f-cc8a5b9c1438\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zln7t" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780506 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/00d6d506-7c84-4fef-9dc9-85f855533c06-tmpfs\") pod \"packageserver-d55dfcdfc-vll2h\" (UID: \"00d6d506-7c84-4fef-9dc9-85f855533c06\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vll2h" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780534 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/027b1711-77a0-4359-bd98-246217fdb5f8-metrics-certs\") pod \"router-default-5444994796-6m5gg\" (UID: \"027b1711-77a0-4359-bd98-246217fdb5f8\") " pod="openshift-ingress/router-default-5444994796-6m5gg" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 
12:00:23.780564 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0acb833f-163a-47e1-8fb7-b9bc97b81fe1-signing-key\") pod \"service-ca-9c57cc56f-tqt25\" (UID: \"0acb833f-163a-47e1-8fb7-b9bc97b81fe1\") " pod="openshift-service-ca/service-ca-9c57cc56f-tqt25" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780593 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/546d4851-e1c7-418b-8ba6-5847e5f9efde-ready\") pod \"cni-sysctl-allowlist-ds-6bx5p\" (UID: \"546d4851-e1c7-418b-8ba6-5847e5f9efde\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780622 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4ea4ac0c-25f8-4dab-ade6-372cd0ec83d0-profile-collector-cert\") pod \"olm-operator-6b444d44fb-9znd7\" (UID: \"4ea4ac0c-25f8-4dab-ade6-372cd0ec83d0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9znd7" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780655 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hbdh\" (UniqueName: \"kubernetes.io/projected/3c040a86-9614-48cb-9df7-14c83b046dce-kube-api-access-9hbdh\") pod \"collect-profiles-29553840-xpb52\" (UID: \"3c040a86-9614-48cb-9df7-14c83b046dce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553840-xpb52" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780686 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67dd48ce-6361-442d-9552-f06346e4d8d4-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ksn6f\" (UID: \"67dd48ce-6361-442d-9552-f06346e4d8d4\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ksn6f" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780718 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/ba5682ea-6a62-4983-b525-5dc9612ad46d-csi-data-dir\") pod \"csi-hostpathplugin-bb6wh\" (UID: \"ba5682ea-6a62-4983-b525-5dc9612ad46d\") " pod="hostpath-provisioner/csi-hostpathplugin-bb6wh" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780749 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9nvp\" (UniqueName: \"kubernetes.io/projected/1a17171b-c738-4862-a2a0-cbb09219322a-kube-api-access-q9nvp\") pod \"dns-default-wgxgk\" (UID: \"1a17171b-c738-4862-a2a0-cbb09219322a\") " pod="openshift-dns/dns-default-wgxgk" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780779 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/ba5682ea-6a62-4983-b525-5dc9612ad46d-mountpoint-dir\") pod \"csi-hostpathplugin-bb6wh\" (UID: \"ba5682ea-6a62-4983-b525-5dc9612ad46d\") " pod="hostpath-provisioner/csi-hostpathplugin-bb6wh" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780814 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7aff6a5d-2a66-4ab5-ad53-878f5fea4115-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-6t4jp\" (UID: \"7aff6a5d-2a66-4ab5-ad53-878f5fea4115\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6t4jp" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780848 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/4b4119d5-f1a1-4d09-83c6-da7decba9ab4-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-gm7t5\" (UID: \"4b4119d5-f1a1-4d09-83c6-da7decba9ab4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gm7t5" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780880 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/fd35e4e1-eb63-44a5-a8e3-376a87c20de2-certs\") pod \"machine-config-server-mws5d\" (UID: \"fd35e4e1-eb63-44a5-a8e3-376a87c20de2\") " pod="openshift-machine-config-operator/machine-config-server-mws5d" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780908 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36999e5d-2e84-4f16-8c9f-4a2c40a34cd4-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-k74wh\" (UID: \"36999e5d-2e84-4f16-8c9f-4a2c40a34cd4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k74wh" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780938 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a17171b-c738-4862-a2a0-cbb09219322a-config-volume\") pod \"dns-default-wgxgk\" (UID: \"1a17171b-c738-4862-a2a0-cbb09219322a\") " pod="openshift-dns/dns-default-wgxgk" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780976 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.781008 4816 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4b9c2804-ee65-4a09-9985-d2345aa7f82a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-t6j7t\" (UID: \"4b9c2804-ee65-4a09-9985-d2345aa7f82a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-t6j7t" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.781041 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/00d6d506-7c84-4fef-9dc9-85f855533c06-apiservice-cert\") pod \"packageserver-d55dfcdfc-vll2h\" (UID: \"00d6d506-7c84-4fef-9dc9-85f855533c06\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vll2h" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.781078 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b74d12c-0a8c-48b1-9931-950ea6e20d4a-cert\") pod \"ingress-canary-2ltv9\" (UID: \"1b74d12c-0a8c-48b1-9931-950ea6e20d4a\") " pod="openshift-ingress-canary/ingress-canary-2ltv9" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.781224 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slmk9\" (UniqueName: \"kubernetes.io/projected/00d6d506-7c84-4fef-9dc9-85f855533c06-kube-api-access-slmk9\") pod \"packageserver-d55dfcdfc-vll2h\" (UID: \"00d6d506-7c84-4fef-9dc9-85f855533c06\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vll2h" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.781357 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgbd4\" (UniqueName: \"kubernetes.io/projected/9a782b5b-9eac-4b5b-8ca8-751111b2459b-kube-api-access-zgbd4\") pod \"kube-storage-version-migrator-operator-b67b599dd-x96fz\" (UID: \"9a782b5b-9eac-4b5b-8ca8-751111b2459b\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x96fz" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.781404 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57df17b9-73f2-468a-8359-5a07f19a5493-serving-cert\") pod \"service-ca-operator-777779d784-28g7h\" (UID: \"57df17b9-73f2-468a-8359-5a07f19a5493\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-28g7h" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.781437 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtlst\" (UniqueName: \"kubernetes.io/projected/57df17b9-73f2-468a-8359-5a07f19a5493-kube-api-access-rtlst\") pod \"service-ca-operator-777779d784-28g7h\" (UID: \"57df17b9-73f2-468a-8359-5a07f19a5493\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-28g7h" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.783197 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57df17b9-73f2-468a-8359-5a07f19a5493-config\") pod \"service-ca-operator-777779d784-28g7h\" (UID: \"57df17b9-73f2-468a-8359-5a07f19a5493\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-28g7h" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.783271 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ba5682ea-6a62-4983-b525-5dc9612ad46d-socket-dir\") pod \"csi-hostpathplugin-bb6wh\" (UID: \"ba5682ea-6a62-4983-b525-5dc9612ad46d\") " pod="hostpath-provisioner/csi-hostpathplugin-bb6wh" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.783306 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zj2p\" (UniqueName: 
\"kubernetes.io/projected/ba5682ea-6a62-4983-b525-5dc9612ad46d-kube-api-access-5zj2p\") pod \"csi-hostpathplugin-bb6wh\" (UID: \"ba5682ea-6a62-4983-b525-5dc9612ad46d\") " pod="hostpath-provisioner/csi-hostpathplugin-bb6wh" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.783344 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/546d4851-e1c7-418b-8ba6-5847e5f9efde-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-6bx5p\" (UID: \"546d4851-e1c7-418b-8ba6-5847e5f9efde\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.783377 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1f8d6149-c5b0-4088-9db5-eeed2eef6ce6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8gcm4\" (UID: \"1f8d6149-c5b0-4088-9db5-eeed2eef6ce6\") " pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.783471 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/027b1711-77a0-4359-bd98-246217fdb5f8-stats-auth\") pod \"router-default-5444994796-6m5gg\" (UID: \"027b1711-77a0-4359-bd98-246217fdb5f8\") " pod="openshift-ingress/router-default-5444994796-6m5gg" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.783505 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/027b1711-77a0-4359-bd98-246217fdb5f8-default-certificate\") pod \"router-default-5444994796-6m5gg\" (UID: \"027b1711-77a0-4359-bd98-246217fdb5f8\") " pod="openshift-ingress/router-default-5444994796-6m5gg" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.783543 4816 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea-proxy-tls\") pod \"machine-config-operator-74547568cd-zdrwx\" (UID: \"74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdrwx" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.783574 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/750d6f55-7cf7-4376-8ead-6d481db69c2d-srv-cert\") pod \"catalog-operator-68c6474976-256s6\" (UID: \"750d6f55-7cf7-4376-8ead-6d481db69c2d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-256s6" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.783606 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxw9g\" (UniqueName: \"kubernetes.io/projected/546d4851-e1c7-418b-8ba6-5847e5f9efde-kube-api-access-sxw9g\") pod \"cni-sysctl-allowlist-ds-6bx5p\" (UID: \"546d4851-e1c7-418b-8ba6-5847e5f9efde\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.783661 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/027b1711-77a0-4359-bd98-246217fdb5f8-service-ca-bundle\") pod \"router-default-5444994796-6m5gg\" (UID: \"027b1711-77a0-4359-bd98-246217fdb5f8\") " pod="openshift-ingress/router-default-5444994796-6m5gg" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.783696 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86nkz\" (UniqueName: \"kubernetes.io/projected/1f8d6149-c5b0-4088-9db5-eeed2eef6ce6-kube-api-access-86nkz\") pod \"marketplace-operator-79b997595-8gcm4\" (UID: \"1f8d6149-c5b0-4088-9db5-eeed2eef6ce6\") " pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4" Mar 11 12:00:23 crc kubenswrapper[4816]: 
I0311 12:00:23.783733 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/db49f265-44d3-468b-8e2f-2246b02b57be-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-ksjm4\" (UID: \"db49f265-44d3-468b-8e2f-2246b02b57be\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ksjm4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.783770 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89cbz\" (UniqueName: \"kubernetes.io/projected/0acb833f-163a-47e1-8fb7-b9bc97b81fe1-kube-api-access-89cbz\") pod \"service-ca-9c57cc56f-tqt25\" (UID: \"0acb833f-163a-47e1-8fb7-b9bc97b81fe1\") " pod="openshift-service-ca/service-ca-9c57cc56f-tqt25" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.783807 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gr7ks\" (UniqueName: \"kubernetes.io/projected/2eaac3e7-6f80-47da-a6c7-e415a0b8edbd-kube-api-access-gr7ks\") pod \"migrator-59844c95c7-4kd2n\" (UID: \"2eaac3e7-6f80-47da-a6c7-e415a0b8edbd\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4kd2n" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.784065 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/ba5682ea-6a62-4983-b525-5dc9612ad46d-csi-data-dir\") pod \"csi-hostpathplugin-bb6wh\" (UID: \"ba5682ea-6a62-4983-b525-5dc9612ad46d\") " pod="hostpath-provisioner/csi-hostpathplugin-bb6wh" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.784235 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/ba5682ea-6a62-4983-b525-5dc9612ad46d-mountpoint-dir\") pod \"csi-hostpathplugin-bb6wh\" (UID: \"ba5682ea-6a62-4983-b525-5dc9612ad46d\") " 
pod="hostpath-provisioner/csi-hostpathplugin-bb6wh" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.784357 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1a17171b-c738-4862-a2a0-cbb09219322a-metrics-tls\") pod \"dns-default-wgxgk\" (UID: \"1a17171b-c738-4862-a2a0-cbb09219322a\") " pod="openshift-dns/dns-default-wgxgk" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.784809 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/ba5682ea-6a62-4983-b525-5dc9612ad46d-plugins-dir\") pod \"csi-hostpathplugin-bb6wh\" (UID: \"ba5682ea-6a62-4983-b525-5dc9612ad46d\") " pod="hostpath-provisioner/csi-hostpathplugin-bb6wh" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.785743 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36999e5d-2e84-4f16-8c9f-4a2c40a34cd4-config\") pod \"kube-apiserver-operator-766d6c64bb-k74wh\" (UID: \"36999e5d-2e84-4f16-8c9f-4a2c40a34cd4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k74wh" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.782074 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea-auth-proxy-config\") pod \"machine-config-operator-74547568cd-zdrwx\" (UID: \"74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdrwx" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.785954 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ba5682ea-6a62-4983-b525-5dc9612ad46d-registration-dir\") pod \"csi-hostpathplugin-bb6wh\" (UID: \"ba5682ea-6a62-4983-b525-5dc9612ad46d\") " 
pod="hostpath-provisioner/csi-hostpathplugin-bb6wh" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.786377 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a782b5b-9eac-4b5b-8ca8-751111b2459b-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-x96fz\" (UID: \"9a782b5b-9eac-4b5b-8ca8-751111b2459b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x96fz" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.786880 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qwwd\" (UniqueName: \"kubernetes.io/projected/9a7e3709-d407-4679-add6-375a835421be-kube-api-access-6qwwd\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.787065 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4b4119d5-f1a1-4d09-83c6-da7decba9ab4-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-gm7t5\" (UID: \"4b4119d5-f1a1-4d09-83c6-da7decba9ab4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gm7t5" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.787157 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3c040a86-9614-48cb-9df7-14c83b046dce-secret-volume\") pod \"collect-profiles-29553840-xpb52\" (UID: \"3c040a86-9614-48cb-9df7-14c83b046dce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553840-xpb52" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.787309 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/00d6d506-7c84-4fef-9dc9-85f855533c06-tmpfs\") pod \"packageserver-d55dfcdfc-vll2h\" (UID: \"00d6d506-7c84-4fef-9dc9-85f855533c06\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vll2h" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.787459 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/fd35e4e1-eb63-44a5-a8e3-376a87c20de2-node-bootstrap-token\") pod \"machine-config-server-mws5d\" (UID: \"fd35e4e1-eb63-44a5-a8e3-376a87c20de2\") " pod="openshift-machine-config-operator/machine-config-server-mws5d" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.787618 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3c040a86-9614-48cb-9df7-14c83b046dce-config-volume\") pod \"collect-profiles-29553840-xpb52\" (UID: \"3c040a86-9614-48cb-9df7-14c83b046dce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553840-xpb52" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.787830 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/546d4851-e1c7-418b-8ba6-5847e5f9efde-ready\") pod \"cni-sysctl-allowlist-ds-6bx5p\" (UID: \"546d4851-e1c7-418b-8ba6-5847e5f9efde\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.788208 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea-images\") pod \"machine-config-operator-74547568cd-zdrwx\" (UID: \"74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdrwx" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.781808 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/9a782b5b-9eac-4b5b-8ca8-751111b2459b-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-x96fz\" (UID: \"9a782b5b-9eac-4b5b-8ca8-751111b2459b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x96fz" Mar 11 12:00:23 crc kubenswrapper[4816]: E0311 12:00:23.788724 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:24.288704293 +0000 UTC m=+110.879968350 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.789176 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7aff6a5d-2a66-4ab5-ad53-878f5fea4115-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-6t4jp\" (UID: \"7aff6a5d-2a66-4ab5-ad53-878f5fea4115\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6t4jp" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.790467 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b9c2804-ee65-4a09-9985-d2345aa7f82a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-t6j7t\" (UID: \"4b9c2804-ee65-4a09-9985-d2345aa7f82a\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-t6j7t" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.791954 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57df17b9-73f2-468a-8359-5a07f19a5493-serving-cert\") pod \"service-ca-operator-777779d784-28g7h\" (UID: \"57df17b9-73f2-468a-8359-5a07f19a5493\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-28g7h" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.792710 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67dd48ce-6361-442d-9552-f06346e4d8d4-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ksn6f\" (UID: \"67dd48ce-6361-442d-9552-f06346e4d8d4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ksn6f" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.792806 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4ea4ac0c-25f8-4dab-ade6-372cd0ec83d0-profile-collector-cert\") pod \"olm-operator-6b444d44fb-9znd7\" (UID: \"4ea4ac0c-25f8-4dab-ade6-372cd0ec83d0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9znd7" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.793208 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b74d12c-0a8c-48b1-9931-950ea6e20d4a-cert\") pod \"ingress-canary-2ltv9\" (UID: \"1b74d12c-0a8c-48b1-9931-950ea6e20d4a\") " pod="openshift-ingress-canary/ingress-canary-2ltv9" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.793374 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ba5682ea-6a62-4983-b525-5dc9612ad46d-socket-dir\") pod 
\"csi-hostpathplugin-bb6wh\" (UID: \"ba5682ea-6a62-4983-b525-5dc9612ad46d\") " pod="hostpath-provisioner/csi-hostpathplugin-bb6wh" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.782526 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67dd48ce-6361-442d-9552-f06346e4d8d4-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ksn6f\" (UID: \"67dd48ce-6361-442d-9552-f06346e4d8d4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ksn6f" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.793588 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/546d4851-e1c7-418b-8ba6-5847e5f9efde-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-6bx5p\" (UID: \"546d4851-e1c7-418b-8ba6-5847e5f9efde\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.793938 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4ea4ac0c-25f8-4dab-ade6-372cd0ec83d0-srv-cert\") pod \"olm-operator-6b444d44fb-9znd7\" (UID: \"4ea4ac0c-25f8-4dab-ade6-372cd0ec83d0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9znd7" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.794054 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36999e5d-2e84-4f16-8c9f-4a2c40a34cd4-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-k74wh\" (UID: \"36999e5d-2e84-4f16-8c9f-4a2c40a34cd4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k74wh" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.794954 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/546d4851-e1c7-418b-8ba6-5847e5f9efde-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-6bx5p\" (UID: \"546d4851-e1c7-418b-8ba6-5847e5f9efde\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.794967 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a17171b-c738-4862-a2a0-cbb09219322a-config-volume\") pod \"dns-default-wgxgk\" (UID: \"1a17171b-c738-4862-a2a0-cbb09219322a\") " pod="openshift-dns/dns-default-wgxgk" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.795572 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57df17b9-73f2-468a-8359-5a07f19a5493-config\") pod \"service-ca-operator-777779d784-28g7h\" (UID: \"57df17b9-73f2-468a-8359-5a07f19a5493\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-28g7h" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.796634 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/027b1711-77a0-4359-bd98-246217fdb5f8-service-ca-bundle\") pod \"router-default-5444994796-6m5gg\" (UID: \"027b1711-77a0-4359-bd98-246217fdb5f8\") " pod="openshift-ingress/router-default-5444994796-6m5gg" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.796771 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/00d6d506-7c84-4fef-9dc9-85f855533c06-apiservice-cert\") pod \"packageserver-d55dfcdfc-vll2h\" (UID: \"00d6d506-7c84-4fef-9dc9-85f855533c06\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vll2h" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.797058 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-ct9ss" 
event={"ID":"7ec67c73-6257-41dc-b848-ba547368c957","Type":"ContainerStarted","Data":"48eaaf8bde0e4a521556bf18eeb616907cc3beb3e02db6e11a51d124d0e2839b"} Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.797172 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-ct9ss" event={"ID":"7ec67c73-6257-41dc-b848-ba547368c957","Type":"ContainerStarted","Data":"26462429ff0fe4700dd69a2524946294749803dcde6d4034003a12c17da3f2c0"} Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.797598 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1f8d6149-c5b0-4088-9db5-eeed2eef6ce6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8gcm4\" (UID: \"1f8d6149-c5b0-4088-9db5-eeed2eef6ce6\") " pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.798100 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/027b1711-77a0-4359-bd98-246217fdb5f8-default-certificate\") pod \"router-default-5444994796-6m5gg\" (UID: \"027b1711-77a0-4359-bd98-246217fdb5f8\") " pod="openshift-ingress/router-default-5444994796-6m5gg" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.800481 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/00d6d506-7c84-4fef-9dc9-85f855533c06-webhook-cert\") pod \"packageserver-d55dfcdfc-vll2h\" (UID: \"00d6d506-7c84-4fef-9dc9-85f855533c06\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vll2h" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.801106 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea-proxy-tls\") pod 
\"machine-config-operator-74547568cd-zdrwx\" (UID: \"74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdrwx" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.802072 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0acb833f-163a-47e1-8fb7-b9bc97b81fe1-signing-cabundle\") pod \"service-ca-9c57cc56f-tqt25\" (UID: \"0acb833f-163a-47e1-8fb7-b9bc97b81fe1\") " pod="openshift-service-ca/service-ca-9c57cc56f-tqt25" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.802096 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4b4119d5-f1a1-4d09-83c6-da7decba9ab4-proxy-tls\") pod \"machine-config-controller-84d6567774-gm7t5\" (UID: \"4b4119d5-f1a1-4d09-83c6-da7decba9ab4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gm7t5" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.802148 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0acb833f-163a-47e1-8fb7-b9bc97b81fe1-signing-key\") pod \"service-ca-9c57cc56f-tqt25\" (UID: \"0acb833f-163a-47e1-8fb7-b9bc97b81fe1\") " pod="openshift-service-ca/service-ca-9c57cc56f-tqt25" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.802468 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/750d6f55-7cf7-4376-8ead-6d481db69c2d-profile-collector-cert\") pod \"catalog-operator-68c6474976-256s6\" (UID: \"750d6f55-7cf7-4376-8ead-6d481db69c2d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-256s6" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.803005 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/1f8d6149-c5b0-4088-9db5-eeed2eef6ce6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8gcm4\" (UID: \"1f8d6149-c5b0-4088-9db5-eeed2eef6ce6\") " pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.807071 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/680978cb-e609-4292-827f-cc8a5b9c1438-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-zln7t\" (UID: \"680978cb-e609-4292-827f-cc8a5b9c1438\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zln7t" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.807744 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwvvh" event={"ID":"24bf5f7b-1059-487a-95e7-ab72af29801e","Type":"ContainerStarted","Data":"5712611ce3f2b1aa74bbf006f99e5c6f0c92075d2b5f25c92184d1dc6922a9f0"} Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.807919 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/027b1711-77a0-4359-bd98-246217fdb5f8-stats-auth\") pod \"router-default-5444994796-6m5gg\" (UID: \"027b1711-77a0-4359-bd98-246217fdb5f8\") " pod="openshift-ingress/router-default-5444994796-6m5gg" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.807930 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/750d6f55-7cf7-4376-8ead-6d481db69c2d-srv-cert\") pod \"catalog-operator-68c6474976-256s6\" (UID: \"750d6f55-7cf7-4376-8ead-6d481db69c2d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-256s6" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.809384 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/fd35e4e1-eb63-44a5-a8e3-376a87c20de2-certs\") pod \"machine-config-server-mws5d\" (UID: \"fd35e4e1-eb63-44a5-a8e3-376a87c20de2\") " pod="openshift-machine-config-operator/machine-config-server-mws5d" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.809468 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdnwz\" (UniqueName: \"kubernetes.io/projected/2517fd5f-6a0b-4ab4-991d-41ac3c9bd0be-kube-api-access-qdnwz\") pod \"openshift-config-operator-7777fb866f-6n4qc\" (UID: \"2517fd5f-6a0b-4ab4-991d-41ac3c9bd0be\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6n4qc" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.809691 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/027b1711-77a0-4359-bd98-246217fdb5f8-metrics-certs\") pod \"router-default-5444994796-6m5gg\" (UID: \"027b1711-77a0-4359-bd98-246217fdb5f8\") " pod="openshift-ingress/router-default-5444994796-6m5gg" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.812038 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/db49f265-44d3-468b-8e2f-2246b02b57be-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-ksjm4\" (UID: \"db49f265-44d3-468b-8e2f-2246b02b57be\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ksjm4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.819866 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gc7hf" event={"ID":"d2d35dfe-af6d-4c32-9e06-4650a6b1d52d","Type":"ContainerStarted","Data":"e869ba3f6495b26f3b312a5c8e8d5d8453d670e146550e509dd400f901b40af5"} Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.823160 4816 generic.go:334] "Generic (PLEG): container 
finished" podID="17c97aa5-8179-41d7-adcb-c4da341f4cec" containerID="fe3c413ed111e88244d59768575fe66a8310b3c2565efff2138e070edf0ec984" exitCode=0 Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.823545 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn" event={"ID":"17c97aa5-8179-41d7-adcb-c4da341f4cec","Type":"ContainerDied","Data":"fe3c413ed111e88244d59768575fe66a8310b3c2565efff2138e070edf0ec984"} Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.829326 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nngrf\" (UniqueName: \"kubernetes.io/projected/efc988f7-8a1a-4d22-b6bb-b2617c721017-kube-api-access-nngrf\") pod \"console-f9d7485db-blgl4\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " pod="openshift-console/console-f9d7485db-blgl4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.851090 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/da3678d7-b440-44bd-b73b-2b04f1225094-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-qwqzv\" (UID: \"da3678d7-b440-44bd-b73b-2b04f1225094\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qwqzv" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.868663 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7mdm\" (UniqueName: \"kubernetes.io/projected/da3678d7-b440-44bd-b73b-2b04f1225094-kube-api-access-p7mdm\") pod \"cluster-image-registry-operator-dc59b4c8b-qwqzv\" (UID: \"da3678d7-b440-44bd-b73b-2b04f1225094\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qwqzv" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.887627 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:23 crc kubenswrapper[4816]: E0311 12:00:23.888635 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:24.388620962 +0000 UTC m=+110.979884929 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.893717 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfq9m\" (UniqueName: \"kubernetes.io/projected/0503425c-595f-4ff5-a7eb-c73168d939d5-kube-api-access-lfq9m\") pod \"openshift-controller-manager-operator-756b6f6bc6-tgbrn\" (UID: \"0503425c-595f-4ff5-a7eb-c73168d939d5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tgbrn" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.898465 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qwqzv" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.919484 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rng4c\" (UniqueName: \"kubernetes.io/projected/b7e0b0c2-39e9-4aa5-934b-01abfe80d224-kube-api-access-rng4c\") pod \"console-operator-58897d9998-fxsjj\" (UID: \"b7e0b0c2-39e9-4aa5-934b-01abfe80d224\") " pod="openshift-console-operator/console-operator-58897d9998-fxsjj" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.952948 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22d5j\" (UniqueName: \"kubernetes.io/projected/4ea4ac0c-25f8-4dab-ade6-372cd0ec83d0-kube-api-access-22d5j\") pod \"olm-operator-6b444d44fb-9znd7\" (UID: \"4ea4ac0c-25f8-4dab-ade6-372cd0ec83d0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9znd7" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.968138 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tgbrn" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.973211 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7qfb\" (UniqueName: \"kubernetes.io/projected/74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea-kube-api-access-z7qfb\") pod \"machine-config-operator-74547568cd-zdrwx\" (UID: \"74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdrwx" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.990144 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:23 crc kubenswrapper[4816]: E0311 12:00:23.993094 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:24.493081771 +0000 UTC m=+111.084345738 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.996120 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxdsn\" (UniqueName: \"kubernetes.io/projected/4b4119d5-f1a1-4d09-83c6-da7decba9ab4-kube-api-access-nxdsn\") pod \"machine-config-controller-84d6567774-gm7t5\" (UID: \"4b4119d5-f1a1-4d09-83c6-da7decba9ab4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gm7t5" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.009035 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6qxn\" (UniqueName: \"kubernetes.io/projected/fd35e4e1-eb63-44a5-a8e3-376a87c20de2-kube-api-access-g6qxn\") pod \"machine-config-server-mws5d\" (UID: \"fd35e4e1-eb63-44a5-a8e3-376a87c20de2\") " pod="openshift-machine-config-operator/machine-config-server-mws5d" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.018133 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-fxsjj" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.024814 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6n4qc" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.028704 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnlqr\" (UniqueName: \"kubernetes.io/projected/db49f265-44d3-468b-8e2f-2246b02b57be-kube-api-access-nnlqr\") pod \"control-plane-machine-set-operator-78cbb6b69f-ksjm4\" (UID: \"db49f265-44d3-468b-8e2f-2246b02b57be\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ksjm4" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.032317 4816 ???:1] "http: TLS handshake error from 192.168.126.11:58756: no serving certificate available for the kubelet" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.048789 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-blgl4" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.066558 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-mzkr9" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.071941 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.072816 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghclh\" (UniqueName: \"kubernetes.io/projected/1b74d12c-0a8c-48b1-9931-950ea6e20d4a-kube-api-access-ghclh\") pod \"ingress-canary-2ltv9\" (UID: \"1b74d12c-0a8c-48b1-9931-950ea6e20d4a\") " pod="openshift-ingress-canary/ingress-canary-2ltv9" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.085768 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr7ks\" (UniqueName: \"kubernetes.io/projected/2eaac3e7-6f80-47da-a6c7-e415a0b8edbd-kube-api-access-gr7ks\") pod \"migrator-59844c95c7-4kd2n\" (UID: \"2eaac3e7-6f80-47da-a6c7-e415a0b8edbd\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4kd2n" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.097356 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:24 crc kubenswrapper[4816]: E0311 12:00:24.097998 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:24.597982513 +0000 UTC m=+111.189246480 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.099696 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9nvp\" (UniqueName: \"kubernetes.io/projected/1a17171b-c738-4862-a2a0-cbb09219322a-kube-api-access-q9nvp\") pod \"dns-default-wgxgk\" (UID: \"1a17171b-c738-4862-a2a0-cbb09219322a\") " pod="openshift-dns/dns-default-wgxgk" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.101255 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-rft5w"] Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.123773 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/36999e5d-2e84-4f16-8c9f-4a2c40a34cd4-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-k74wh\" (UID: \"36999e5d-2e84-4f16-8c9f-4a2c40a34cd4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k74wh" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.123893 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4kd2n" Mar 11 12:00:24 crc kubenswrapper[4816]: W0311 12:00:24.130598 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0c986ee_b3e9_4bd1_ae9c_7a70b04e1527.slice/crio-c8662d9af8cd0a9090c14a1f5f335228a3b31c7b8b889472214083ecc6cbeaa4 WatchSource:0}: Error finding container c8662d9af8cd0a9090c14a1f5f335228a3b31c7b8b889472214083ecc6cbeaa4: Status 404 returned error can't find the container with id c8662d9af8cd0a9090c14a1f5f335228a3b31c7b8b889472214083ecc6cbeaa4 Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.132715 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdrwx" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.145214 4816 ???:1] "http: TLS handshake error from 192.168.126.11:58764: no serving certificate available for the kubelet" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.145344 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ksjm4" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.153935 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gm7t5" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.154264 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjg5g\" (UniqueName: \"kubernetes.io/projected/027b1711-77a0-4359-bd98-246217fdb5f8-kube-api-access-zjg5g\") pod \"router-default-5444994796-6m5gg\" (UID: \"027b1711-77a0-4359-bd98-246217fdb5f8\") " pod="openshift-ingress/router-default-5444994796-6m5gg" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.156868 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/67dd48ce-6361-442d-9552-f06346e4d8d4-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ksn6f\" (UID: \"67dd48ce-6361-442d-9552-f06346e4d8d4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ksn6f" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.161568 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-6m5gg" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.166497 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9znd7" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.182565 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n5p6\" (UniqueName: \"kubernetes.io/projected/7aff6a5d-2a66-4ab5-ad53-878f5fea4115-kube-api-access-6n5p6\") pod \"package-server-manager-789f6589d5-6t4jp\" (UID: \"7aff6a5d-2a66-4ab5-ad53-878f5fea4115\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6t4jp" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.195577 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-wgxgk" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.196184 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qwqzv"] Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.198421 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7wqx\" (UniqueName: \"kubernetes.io/projected/750d6f55-7cf7-4376-8ead-6d481db69c2d-kube-api-access-g7wqx\") pod \"catalog-operator-68c6474976-256s6\" (UID: \"750d6f55-7cf7-4376-8ead-6d481db69c2d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-256s6" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.199319 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:24 crc kubenswrapper[4816]: E0311 12:00:24.199729 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:24.699716494 +0000 UTC m=+111.290980461 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.207766 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6t4jp" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.208461 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hbdh\" (UniqueName: \"kubernetes.io/projected/3c040a86-9614-48cb-9df7-14c83b046dce-kube-api-access-9hbdh\") pod \"collect-profiles-29553840-xpb52\" (UID: \"3c040a86-9614-48cb-9df7-14c83b046dce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553840-xpb52" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.222454 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553840-xpb52" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.231726 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-256s6" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.235720 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slmk9\" (UniqueName: \"kubernetes.io/projected/00d6d506-7c84-4fef-9dc9-85f855533c06-kube-api-access-slmk9\") pod \"packageserver-d55dfcdfc-vll2h\" (UID: \"00d6d506-7c84-4fef-9dc9-85f855533c06\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vll2h" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.244511 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2ltv9" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.245647 4816 ???:1] "http: TLS handshake error from 192.168.126.11:58768: no serving certificate available for the kubelet" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.259204 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtlst\" (UniqueName: \"kubernetes.io/projected/57df17b9-73f2-468a-8359-5a07f19a5493-kube-api-access-rtlst\") pod \"service-ca-operator-777779d784-28g7h\" (UID: \"57df17b9-73f2-468a-8359-5a07f19a5493\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-28g7h" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.275668 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4b9c2804-ee65-4a09-9985-d2345aa7f82a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-t6j7t\" (UID: \"4b9c2804-ee65-4a09-9985-d2345aa7f82a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-t6j7t" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.276523 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-mws5d" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.278584 4816 ???:1] "http: TLS handshake error from 192.168.126.11:58772: no serving certificate available for the kubelet" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.288857 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgbd4\" (UniqueName: \"kubernetes.io/projected/9a782b5b-9eac-4b5b-8ca8-751111b2459b-kube-api-access-zgbd4\") pod \"kube-storage-version-migrator-operator-b67b599dd-x96fz\" (UID: \"9a782b5b-9eac-4b5b-8ca8-751111b2459b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x96fz" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.299963 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:24 crc kubenswrapper[4816]: E0311 12:00:24.300881 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:24.800862338 +0000 UTC m=+111.392126305 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.319997 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zj2p\" (UniqueName: \"kubernetes.io/projected/ba5682ea-6a62-4983-b525-5dc9612ad46d-kube-api-access-5zj2p\") pod \"csi-hostpathplugin-bb6wh\" (UID: \"ba5682ea-6a62-4983-b525-5dc9612ad46d\") " pod="hostpath-provisioner/csi-hostpathplugin-bb6wh" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.324363 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tgbrn"] Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.346379 4816 ???:1] "http: TLS handshake error from 192.168.126.11:58788: no serving certificate available for the kubelet" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.349056 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgkcr\" (UniqueName: \"kubernetes.io/projected/680978cb-e609-4292-827f-cc8a5b9c1438-kube-api-access-bgkcr\") pod \"multus-admission-controller-857f4d67dd-zln7t\" (UID: \"680978cb-e609-4292-827f-cc8a5b9c1438\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zln7t" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.369696 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86nkz\" (UniqueName: \"kubernetes.io/projected/1f8d6149-c5b0-4088-9db5-eeed2eef6ce6-kube-api-access-86nkz\") pod \"marketplace-operator-79b997595-8gcm4\" (UID: 
\"1f8d6149-c5b0-4088-9db5-eeed2eef6ce6\") " pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.381118 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxw9g\" (UniqueName: \"kubernetes.io/projected/546d4851-e1c7-418b-8ba6-5847e5f9efde-kube-api-access-sxw9g\") pod \"cni-sysctl-allowlist-ds-6bx5p\" (UID: \"546d4851-e1c7-418b-8ba6-5847e5f9efde\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.402100 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:24 crc kubenswrapper[4816]: E0311 12:00:24.402433 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:24.902420053 +0000 UTC m=+111.493684020 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.405567 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-t6j7t" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.408501 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89cbz\" (UniqueName: \"kubernetes.io/projected/0acb833f-163a-47e1-8fb7-b9bc97b81fe1-kube-api-access-89cbz\") pod \"service-ca-9c57cc56f-tqt25\" (UID: \"0acb833f-163a-47e1-8fb7-b9bc97b81fe1\") " pod="openshift-service-ca/service-ca-9c57cc56f-tqt25" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.411320 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k74wh" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.419406 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x96fz" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.438740 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ksn6f" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.458713 4816 ???:1] "http: TLS handshake error from 192.168.126.11:58804: no serving certificate available for the kubelet" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.471717 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-28g7h" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.485580 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-zln7t" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.489485 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-tqt25" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.495733 4816 ???:1] "http: TLS handshake error from 192.168.126.11:58816: no serving certificate available for the kubelet" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.502310 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vll2h" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.502665 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:24 crc kubenswrapper[4816]: E0311 12:00:24.503204 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:25.003182266 +0000 UTC m=+111.594446233 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.514815 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.566593 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-bb6wh" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.581099 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.604563 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:24 crc kubenswrapper[4816]: E0311 12:00:24.604985 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:25.104963888 +0000 UTC m=+111.696227885 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.667442 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-fxsjj"] Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.690800 4816 ???:1] "http: TLS handshake error from 192.168.126.11:54110: no serving certificate available for the kubelet" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.706800 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:24 crc kubenswrapper[4816]: E0311 12:00:24.707474 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:25.207449721 +0000 UTC m=+111.798713688 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.808550 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:24 crc kubenswrapper[4816]: E0311 12:00:24.808906 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:25.308893764 +0000 UTC m=+111.900157731 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:24 crc kubenswrapper[4816]: W0311 12:00:24.862239 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7e0b0c2_39e9_4aa5_934b_01abfe80d224.slice/crio-b8442c962c26f1276e3451ea967fc75e250fb66aedf5b429bd661544990e0297 WatchSource:0}: Error finding container b8442c962c26f1276e3451ea967fc75e250fb66aedf5b429bd661544990e0297: Status 404 returned error can't find the container with id b8442c962c26f1276e3451ea967fc75e250fb66aedf5b429bd661544990e0297 Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.864335 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x5fc4" podStartSLOduration=64.86432045 podStartE2EDuration="1m4.86432045s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:24.863567108 +0000 UTC m=+111.454831075" watchObservedRunningTime="2026-03-11 12:00:24.86432045 +0000 UTC m=+111.455584417" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.865586 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qwqzv" event={"ID":"da3678d7-b440-44bd-b73b-2b04f1225094","Type":"ContainerStarted","Data":"ab2bf8f3508720b262044c8ff124a051028143a2ec8ee0caa1baa21992774819"} Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 
12:00:24.877950 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-mws5d" event={"ID":"fd35e4e1-eb63-44a5-a8e3-376a87c20de2","Type":"ContainerStarted","Data":"9d428a31c3c8922e3578835c8520cb0c46d765311ca84626af2fdff60030e042"} Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.909051 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:24 crc kubenswrapper[4816]: E0311 12:00:24.909203 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:25.409177883 +0000 UTC m=+112.000441850 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.909358 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:24 crc kubenswrapper[4816]: E0311 12:00:24.909667 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:25.409653717 +0000 UTC m=+112.000917684 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.930101 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rxljr" event={"ID":"f66d48af-027e-448b-9897-9f0c62fbd6c0","Type":"ContainerStarted","Data":"83ce7ed2e7cc7dbd752594375f677dfc68b6916731972c64d0ad92023c6e83ac"} Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.930172 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rxljr" event={"ID":"f66d48af-027e-448b-9897-9f0c62fbd6c0","Type":"ContainerStarted","Data":"119048a41fbdab7d4fa34dc3cbc49c3cc316f3d35e003b49f3dc4288bc4b06a9"} Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.966954 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" podStartSLOduration=64.966941296 podStartE2EDuration="1m4.966941296s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:24.965565957 +0000 UTC m=+111.556829924" watchObservedRunningTime="2026-03-11 12:00:24.966941296 +0000 UTC m=+111.558205263" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.967978 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-6m5gg" 
event={"ID":"027b1711-77a0-4359-bd98-246217fdb5f8","Type":"ContainerStarted","Data":"420ade994713469a8d7dc3a9592eadb5bbdb58134bf2d223edb653287faf03e7"} Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.977989 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p" event={"ID":"546d4851-e1c7-418b-8ba6-5847e5f9efde","Type":"ContainerStarted","Data":"bb301579c908efd9a833ba2c76294edf97abc1c238aa669d3b8696cb61fa9a56"} Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.993163 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwvvh" podStartSLOduration=64.993148366 podStartE2EDuration="1m4.993148366s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:24.991108938 +0000 UTC m=+111.582372905" watchObservedRunningTime="2026-03-11 12:00:24.993148366 +0000 UTC m=+111.584412333" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.997740 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" event={"ID":"3af1f0c3-1a92-49f9-beec-dff95561c5dd","Type":"ContainerStarted","Data":"c19a0c4d64f143322e7f9f2a7b2398cffbaaa91830a246a52660246604b33d86"} Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.999054 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tgbrn" event={"ID":"0503425c-595f-4ff5-a7eb-c73168d939d5","Type":"ContainerStarted","Data":"17d4588ae41a0c39d95e19e5f26dfe7571e9fb623bf49b120095b8c404798267"} Mar 11 12:00:25 crc kubenswrapper[4816]: I0311 12:00:25.000495 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn" 
event={"ID":"17c97aa5-8179-41d7-adcb-c4da341f4cec","Type":"ContainerStarted","Data":"f917625c43b93079809067b0f348da470474b0641aee04be57abde0f618f2ab5"} Mar 11 12:00:25 crc kubenswrapper[4816]: I0311 12:00:25.006371 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-rft5w" event={"ID":"c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527","Type":"ContainerStarted","Data":"c8662d9af8cd0a9090c14a1f5f335228a3b31c7b8b889472214083ecc6cbeaa4"} Mar 11 12:00:25 crc kubenswrapper[4816]: I0311 12:00:25.006465 4816 patch_prober.go:28] interesting pod/downloads-7954f5f757-dh658 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 11 12:00:25 crc kubenswrapper[4816]: I0311 12:00:25.006533 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dh658" podUID="8c843417-3e01-48f9-b0b6-845fbbbf7eab" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 11 12:00:25 crc kubenswrapper[4816]: I0311 12:00:25.010571 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:25 crc kubenswrapper[4816]: E0311 12:00:25.011742 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:25.511720327 +0000 UTC m=+112.102984294 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:25 crc kubenswrapper[4816]: I0311 12:00:25.028608 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr" podStartSLOduration=65.02858982 podStartE2EDuration="1m5.02858982s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:25.026730717 +0000 UTC m=+111.617994684" watchObservedRunningTime="2026-03-11 12:00:25.02858982 +0000 UTC m=+111.619853787" Mar 11 12:00:25 crc kubenswrapper[4816]: I0311 12:00:25.069552 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-dh658" podStartSLOduration=65.069536032 podStartE2EDuration="1m5.069536032s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:25.066921357 +0000 UTC m=+111.658185334" watchObservedRunningTime="2026-03-11 12:00:25.069536032 +0000 UTC m=+111.660799999" Mar 11 12:00:25 crc kubenswrapper[4816]: I0311 12:00:25.112182 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: 
\"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:25 crc kubenswrapper[4816]: E0311 12:00:25.112617 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:25.612597854 +0000 UTC m=+112.203861821 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:25 crc kubenswrapper[4816]: I0311 12:00:25.216478 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:25 crc kubenswrapper[4816]: E0311 12:00:25.217433 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:25.717404883 +0000 UTC m=+112.308668850 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:25 crc kubenswrapper[4816]: I0311 12:00:25.218605 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:25 crc kubenswrapper[4816]: E0311 12:00:25.218979 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:25.718964457 +0000 UTC m=+112.310228424 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:25 crc kubenswrapper[4816]: I0311 12:00:25.320598 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr" Mar 11 12:00:25 crc kubenswrapper[4816]: I0311 12:00:25.321444 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:25 crc kubenswrapper[4816]: E0311 12:00:25.321927 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:25.821908193 +0000 UTC m=+112.413172160 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:25 crc kubenswrapper[4816]: I0311 12:00:25.340451 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-nv429" Mar 11 12:00:25 crc kubenswrapper[4816]: I0311 12:00:25.389279 4816 ???:1] "http: TLS handshake error from 192.168.126.11:54114: no serving certificate available for the kubelet" Mar 11 12:00:25 crc kubenswrapper[4816]: I0311 12:00:25.400106 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gc7hf" podStartSLOduration=65.40008708 podStartE2EDuration="1m5.40008708s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:25.398509485 +0000 UTC m=+111.989773452" watchObservedRunningTime="2026-03-11 12:00:25.40008708 +0000 UTC m=+111.991351047" Mar 11 12:00:25 crc kubenswrapper[4816]: I0311 12:00:25.424934 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:25 crc kubenswrapper[4816]: E0311 12:00:25.425325 4816 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:25.925312882 +0000 UTC m=+112.516576849 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:25 crc kubenswrapper[4816]: I0311 12:00:25.466044 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-ct9ss" podStartSLOduration=65.466025587 podStartE2EDuration="1m5.466025587s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:25.432183238 +0000 UTC m=+112.023447205" watchObservedRunningTime="2026-03-11 12:00:25.466025587 +0000 UTC m=+112.057289554" Mar 11 12:00:25 crc kubenswrapper[4816]: I0311 12:00:25.468419 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-blgl4"] Mar 11 12:00:25 crc kubenswrapper[4816]: I0311 12:00:25.469960 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-6n4qc"] Mar 11 12:00:25 crc kubenswrapper[4816]: I0311 12:00:25.525786 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:25 crc kubenswrapper[4816]: E0311 12:00:25.526238 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:26.026221299 +0000 UTC m=+112.617485266 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:25 crc kubenswrapper[4816]: I0311 12:00:25.628027 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:25 crc kubenswrapper[4816]: E0311 12:00:25.628434 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:26.128422024 +0000 UTC m=+112.719685991 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:25 crc kubenswrapper[4816]: I0311 12:00:25.719576 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=26.719538711 podStartE2EDuration="26.719538711s" podCreationTimestamp="2026-03-11 11:59:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:25.696969405 +0000 UTC m=+112.288233372" watchObservedRunningTime="2026-03-11 12:00:25.719538711 +0000 UTC m=+112.310802678" Mar 11 12:00:25 crc kubenswrapper[4816]: I0311 12:00:25.729995 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:25 crc kubenswrapper[4816]: E0311 12:00:25.730649 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:26.230631908 +0000 UTC m=+112.821895875 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:25 crc kubenswrapper[4816]: I0311 12:00:25.831515 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:25 crc kubenswrapper[4816]: E0311 12:00:25.831908 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:26.331890876 +0000 UTC m=+112.923154903 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:25 crc kubenswrapper[4816]: I0311 12:00:25.936939 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:25 crc kubenswrapper[4816]: E0311 12:00:25.937395 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:26.437380784 +0000 UTC m=+113.028644751 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:25 crc kubenswrapper[4816]: I0311 12:00:25.993943 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-nv429" podStartSLOduration=65.993928372 podStartE2EDuration="1m5.993928372s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:25.992234304 +0000 UTC m=+112.583498281" watchObservedRunningTime="2026-03-11 12:00:25.993928372 +0000 UTC m=+112.585192339" Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.011499 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6n4qc" event={"ID":"2517fd5f-6a0b-4ab4-991d-41ac3c9bd0be","Type":"ContainerStarted","Data":"6f23f4498ef808f215d3f7b697aecea792a62bebef118c14d8004705f7301ea3"} Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.011556 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6n4qc" event={"ID":"2517fd5f-6a0b-4ab4-991d-41ac3c9bd0be","Type":"ContainerStarted","Data":"76fdb003be2f42f321b1f84a7b3ff2f9c6c39157b9fe6fe69e9cbf0b8e78ef15"} Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.013949 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-6m5gg" 
event={"ID":"027b1711-77a0-4359-bd98-246217fdb5f8","Type":"ContainerStarted","Data":"fa083920068ccd5688cbd379373974a2b066c0874558fb23ea37e5bac5a67363"} Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.015282 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-fxsjj" event={"ID":"b7e0b0c2-39e9-4aa5-934b-01abfe80d224","Type":"ContainerStarted","Data":"3bca950243eb3b2a32f38d585904143b6c89f6ff8470498644b40b590c645be8"} Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.015308 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-fxsjj" event={"ID":"b7e0b0c2-39e9-4aa5-934b-01abfe80d224","Type":"ContainerStarted","Data":"b8442c962c26f1276e3451ea967fc75e250fb66aedf5b429bd661544990e0297"} Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.016013 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-fxsjj" Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.017145 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qwqzv" event={"ID":"da3678d7-b440-44bd-b73b-2b04f1225094","Type":"ContainerStarted","Data":"c6bea91cb2e3b32b2163418ca593a45df0ed244d08d3e62ca6ba50f125fb7cb5"} Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.018470 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-rft5w" event={"ID":"c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527","Type":"ContainerStarted","Data":"ec44ea2166db26681bb3b2144354ad250006efdac888036b28dd71be6c8b4c11"} Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.023328 4816 patch_prober.go:28] interesting pod/console-operator-58897d9998-fxsjj container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 
10.217.0.15:8443: connect: connection refused" start-of-body= Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.023382 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-fxsjj" podUID="b7e0b0c2-39e9-4aa5-934b-01abfe80d224" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.024433 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tgbrn" event={"ID":"0503425c-595f-4ff5-a7eb-c73168d939d5","Type":"ContainerStarted","Data":"0542d658e2694def8608b55ddc1d8f7873bd2dbfcd0ac44002f00d970538e265"} Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.025685 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-mzkr9"] Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.025878 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-blgl4" event={"ID":"efc988f7-8a1a-4d22-b6bb-b2617c721017","Type":"ContainerStarted","Data":"95ff1646ffa7c04a2ecc19c185617a275f308d36bb7c3ee54b57d9bc7db028ec"} Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.025898 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-blgl4" event={"ID":"efc988f7-8a1a-4d22-b6bb-b2617c721017","Type":"ContainerStarted","Data":"59a99708271969fdd60bd64b8768b6f0fa05af801e0f7d034beaae8d3d4be471"} Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.027002 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p" event={"ID":"546d4851-e1c7-418b-8ba6-5847e5f9efde","Type":"ContainerStarted","Data":"94fc872f9120ae3b6c5bc8d7ce09def109b21a972702c2d063763160e11c44c7"} Mar 11 12:00:26 crc 
kubenswrapper[4816]: I0311 12:00:26.031740 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p" Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.038065 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:26 crc kubenswrapper[4816]: E0311 12:00:26.038439 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:26.538428835 +0000 UTC m=+113.129692802 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.059938 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-mws5d" event={"ID":"fd35e4e1-eb63-44a5-a8e3-376a87c20de2","Type":"ContainerStarted","Data":"59d2f81c6348a3e946f0356ffa9450b30abc22c2e7958ca8c06ed7e142d914bb"} Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.062319 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9znd7"] Mar 11 12:00:26 crc 
kubenswrapper[4816]: I0311 12:00:26.084825 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" event={"ID":"3af1f0c3-1a92-49f9-beec-dff95561c5dd","Type":"ContainerStarted","Data":"f21741ee3ba7efac50919ebcbe24192aebd758333c328e95af61a037b6dd42f4"} Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.087830 4816 patch_prober.go:28] interesting pod/downloads-7954f5f757-dh658 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.087905 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dh658" podUID="8c843417-3e01-48f9-b0b6-845fbbbf7eab" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.089462 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p" Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.139699 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:26 crc kubenswrapper[4816]: E0311 12:00:26.159216 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:26.659195431 +0000 UTC m=+113.250459398 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.177327 4816 patch_prober.go:28] interesting pod/router-default-5444994796-6m5gg container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.199757 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6m5gg" podUID="027b1711-77a0-4359-bd98-246217fdb5f8" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.191773 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-6m5gg" Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.224656 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-zdrwx"] Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.251012 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 
12:00:26 crc kubenswrapper[4816]: E0311 12:00:26.256497 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:26.756483235 +0000 UTC m=+113.347747202 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.302715 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553840-xpb52"] Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.322135 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-t5t6b" podStartSLOduration=66.322118363 podStartE2EDuration="1m6.322118363s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:26.319983022 +0000 UTC m=+112.911246989" watchObservedRunningTime="2026-03-11 12:00:26.322118363 +0000 UTC m=+112.913382330" Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.330348 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-4kd2n"] Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.358939 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:26 crc kubenswrapper[4816]: E0311 12:00:26.359424 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:26.85940845 +0000 UTC m=+113.450672417 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:26 crc kubenswrapper[4816]: W0311 12:00:26.390582 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c040a86_9614_48cb_9df7_14c83b046dce.slice/crio-dbe4724e3bb10a60d2bcdde00ce0cce01eb1e6f17e7c5b379625a2f60d27762d WatchSource:0}: Error finding container dbe4724e3bb10a60d2bcdde00ce0cce01eb1e6f17e7c5b379625a2f60d27762d: Status 404 returned error can't find the container with id dbe4724e3bb10a60d2bcdde00ce0cce01eb1e6f17e7c5b379625a2f60d27762d Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.402079 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2ltv9"] Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.418562 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-fxsjj" podStartSLOduration=66.418542972 
podStartE2EDuration="1m6.418542972s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:26.417847602 +0000 UTC m=+113.009111569" watchObservedRunningTime="2026-03-11 12:00:26.418542972 +0000 UTC m=+113.009806929" Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.460083 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:26 crc kubenswrapper[4816]: E0311 12:00:26.460476 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:26.960462811 +0000 UTC m=+113.551726778 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.461652 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-blgl4" podStartSLOduration=66.461636045 podStartE2EDuration="1m6.461636045s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:26.459842874 +0000 UTC m=+113.051106851" watchObservedRunningTime="2026-03-11 12:00:26.461636045 +0000 UTC m=+113.052900012" Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.509562 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-6m5gg" podStartSLOduration=66.509544926 podStartE2EDuration="1m6.509544926s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:26.507456256 +0000 UTC m=+113.098720223" watchObservedRunningTime="2026-03-11 12:00:26.509544926 +0000 UTC m=+113.100808893" Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.510420 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k74wh"] Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.527334 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-gm7t5"] Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.548120 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-28g7h"] Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.561164 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:26 crc kubenswrapper[4816]: E0311 12:00:26.561514 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:27.061500083 +0000 UTC m=+113.652764050 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.562913 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ksjm4"] Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.653827 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6t4jp"] Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.655200 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tgbrn" podStartSLOduration=66.655179493 podStartE2EDuration="1m6.655179493s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:26.584720757 +0000 UTC m=+113.175984724" watchObservedRunningTime="2026-03-11 12:00:26.655179493 +0000 UTC m=+113.246443460" Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.665199 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:26 crc kubenswrapper[4816]: E0311 12:00:26.668287 4816 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:27.168269918 +0000 UTC m=+113.759533885 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.696525 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wgxgk"] Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.723975 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-256s6"] Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.739533 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-zln7t"] Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.752852 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-t6j7t"] Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.754789 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-mws5d" podStartSLOduration=5.754770373 podStartE2EDuration="5.754770373s" podCreationTimestamp="2026-03-11 12:00:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:26.752730554 +0000 UTC 
m=+113.343994521" watchObservedRunningTime="2026-03-11 12:00:26.754770373 +0000 UTC m=+113.346034340" Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.767435 4816 ???:1] "http: TLS handshake error from 192.168.126.11:54128: no serving certificate available for the kubelet" Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.776115 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:26 crc kubenswrapper[4816]: E0311 12:00:26.776591 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:27.276574567 +0000 UTC m=+113.867838534 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.827729 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-rft5w" podStartSLOduration=66.82771432 podStartE2EDuration="1m6.82771432s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:26.825640131 +0000 UTC m=+113.416904098" watchObservedRunningTime="2026-03-11 12:00:26.82771432 +0000 UTC m=+113.418978287" Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.842420 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8gcm4"] Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.851624 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-tqt25"] Mar 11 12:00:26 crc kubenswrapper[4816]: W0311 12:00:26.853034 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod680978cb_e609_4292_827f_cc8a5b9c1438.slice/crio-f5975ccfec851804c1a3ba6815d3df83aae82adb150bc99b31c85cc254a88d97 WatchSource:0}: Error finding container f5975ccfec851804c1a3ba6815d3df83aae82adb150bc99b31c85cc254a88d97: Status 404 returned error can't find the container with id f5975ccfec851804c1a3ba6815d3df83aae82adb150bc99b31c85cc254a88d97 Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.853092 4816 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ksn6f"] Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.854973 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vll2h"] Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.858658 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x96fz"] Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.870218 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-bb6wh"] Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.880401 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:26 crc kubenswrapper[4816]: E0311 12:00:26.880911 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:27.380899672 +0000 UTC m=+113.972163639 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.919529 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" podStartSLOduration=66.919495236 podStartE2EDuration="1m6.919495236s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:26.912950619 +0000 UTC m=+113.504214586" watchObservedRunningTime="2026-03-11 12:00:26.919495236 +0000 UTC m=+113.510759203" Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.981367 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:26 crc kubenswrapper[4816]: E0311 12:00:26.981866 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:27.4818467 +0000 UTC m=+114.073110667 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:26 crc kubenswrapper[4816]: W0311 12:00:26.981957 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0acb833f_163a_47e1_8fb7_b9bc97b81fe1.slice/crio-e6b2a4078d81ca62b8207d6362521735fd66adf6d36d1b61a71c0e7dcd8d79d5 WatchSource:0}: Error finding container e6b2a4078d81ca62b8207d6362521735fd66adf6d36d1b61a71c0e7dcd8d79d5: Status 404 returned error can't find the container with id e6b2a4078d81ca62b8207d6362521735fd66adf6d36d1b61a71c0e7dcd8d79d5 Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.083539 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:27 crc kubenswrapper[4816]: E0311 12:00:27.083871 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:27.583859669 +0000 UTC m=+114.175123636 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.090050 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qwqzv" podStartSLOduration=67.090032436 podStartE2EDuration="1m7.090032436s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:27.089560823 +0000 UTC m=+113.680824790" watchObservedRunningTime="2026-03-11 12:00:27.090032436 +0000 UTC m=+113.681296403" Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.103565 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gm7t5" event={"ID":"4b4119d5-f1a1-4d09-83c6-da7decba9ab4","Type":"ContainerStarted","Data":"4716f35279b6c16f2fb81d600a0f46e378f55f57dc8771d2eeab60a0abb74dab"} Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.104616 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-tqt25" event={"ID":"0acb833f-163a-47e1-8fb7-b9bc97b81fe1","Type":"ContainerStarted","Data":"e6b2a4078d81ca62b8207d6362521735fd66adf6d36d1b61a71c0e7dcd8d79d5"} Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.120916 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-mzkr9" 
event={"ID":"bb999b74-ac20-4e84-b2c7-b16906afbf06","Type":"ContainerStarted","Data":"54da1f4160bc4846c3fcb5e2cae8ad60f73f88af6d98fed0b3efebf2636bdcca"} Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.120959 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-mzkr9" event={"ID":"bb999b74-ac20-4e84-b2c7-b16906afbf06","Type":"ContainerStarted","Data":"db231c364b2be3f1307b4f97cf1b96dd6b5a6a88202b97d4333636332a49c671"} Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.121701 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn" podStartSLOduration=67.121691952 podStartE2EDuration="1m7.121691952s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:27.121341482 +0000 UTC m=+113.712605449" watchObservedRunningTime="2026-03-11 12:00:27.121691952 +0000 UTC m=+113.712955919" Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.128081 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wgxgk" event={"ID":"1a17171b-c738-4862-a2a0-cbb09219322a","Type":"ContainerStarted","Data":"496699594abf773dc472d3e220c37a77b73baafcf4f43f773056457b28faa7e5"} Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.142613 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-28g7h" event={"ID":"57df17b9-73f2-468a-8359-5a07f19a5493","Type":"ContainerStarted","Data":"16e9a23dc26af901821b3fa1119ea56533e472b0b7cc64924fa1a7c9b408adab"} Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.147368 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rxljr" podStartSLOduration=67.147351036 podStartE2EDuration="1m7.147351036s" 
podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:27.145609876 +0000 UTC m=+113.736873833" watchObservedRunningTime="2026-03-11 12:00:27.147351036 +0000 UTC m=+113.738615003" Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.153286 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553840-xpb52" event={"ID":"3c040a86-9614-48cb-9df7-14c83b046dce","Type":"ContainerStarted","Data":"f3bda5d4e49a815a926b2f32c60f3932a76a7181a017078bc20f79926bfbf6a6"} Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.153338 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553840-xpb52" event={"ID":"3c040a86-9614-48cb-9df7-14c83b046dce","Type":"ContainerStarted","Data":"dbe4724e3bb10a60d2bcdde00ce0cce01eb1e6f17e7c5b379625a2f60d27762d"} Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.179827 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p" podStartSLOduration=6.179808745 podStartE2EDuration="6.179808745s" podCreationTimestamp="2026-03-11 12:00:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:27.165450644 +0000 UTC m=+113.756714611" watchObservedRunningTime="2026-03-11 12:00:27.179808745 +0000 UTC m=+113.771072772" Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.180388 4816 patch_prober.go:28] interesting pod/router-default-5444994796-6m5gg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 11 12:00:27 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld Mar 11 12:00:27 crc 
kubenswrapper[4816]: [+]process-running ok Mar 11 12:00:27 crc kubenswrapper[4816]: healthz check failed Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.180430 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6m5gg" podUID="027b1711-77a0-4359-bd98-246217fdb5f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.192355 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:27 crc kubenswrapper[4816]: E0311 12:00:27.197427 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:27.697402768 +0000 UTC m=+114.288666745 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.202585 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-256s6" event={"ID":"750d6f55-7cf7-4376-8ead-6d481db69c2d","Type":"ContainerStarted","Data":"400513d5fe5aa4eb59bd7bf80a19508c62a58ad61159b9a9c51bf8cb9d9f7bf6"} Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.217538 4816 generic.go:334] "Generic (PLEG): container finished" podID="2517fd5f-6a0b-4ab4-991d-41ac3c9bd0be" containerID="6f23f4498ef808f215d3f7b697aecea792a62bebef118c14d8004705f7301ea3" exitCode=0 Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.217611 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6n4qc" event={"ID":"2517fd5f-6a0b-4ab4-991d-41ac3c9bd0be","Type":"ContainerDied","Data":"6f23f4498ef808f215d3f7b697aecea792a62bebef118c14d8004705f7301ea3"} Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.219518 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k74wh" event={"ID":"36999e5d-2e84-4f16-8c9f-4a2c40a34cd4","Type":"ContainerStarted","Data":"4a6887167d206555e40cdc1b2ac3119254cfcc32db5af672ba93068c79875718"} Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.222848 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2ltv9" 
event={"ID":"1b74d12c-0a8c-48b1-9931-950ea6e20d4a","Type":"ContainerStarted","Data":"f68fbe554cc10cad015092a6618e898210fbdc88cbd226329328f1aeb3aaa473"} Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.222881 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2ltv9" event={"ID":"1b74d12c-0a8c-48b1-9931-950ea6e20d4a","Type":"ContainerStarted","Data":"bbc911364a81eb26f1c0447ebf813d8828856fa7dd7a0bd30bc6f870ce0705c8"} Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.224524 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4" event={"ID":"1f8d6149-c5b0-4088-9db5-eeed2eef6ce6","Type":"ContainerStarted","Data":"18da590f53c2a68db8ccc3639b30699431b029db82a4def3280157c1b87bba73"} Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.225582 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29553840-xpb52" podStartSLOduration=27.225555254 podStartE2EDuration="27.225555254s" podCreationTimestamp="2026-03-11 12:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:27.21283774 +0000 UTC m=+113.804101707" watchObservedRunningTime="2026-03-11 12:00:27.225555254 +0000 UTC m=+113.816819221" Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.260175 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-zln7t" event={"ID":"680978cb-e609-4292-827f-cc8a5b9c1438","Type":"ContainerStarted","Data":"f5975ccfec851804c1a3ba6815d3df83aae82adb150bc99b31c85cc254a88d97"} Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.288142 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x96fz" 
event={"ID":"9a782b5b-9eac-4b5b-8ca8-751111b2459b","Type":"ContainerStarted","Data":"9ef7a7bd1419b2085d1576d9a7b67686bdd53f8154c42e3d42c23a1ce00008f3"} Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.297185 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:27 crc kubenswrapper[4816]: E0311 12:00:27.298381 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:27.798370467 +0000 UTC m=+114.389634434 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.298623 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ksn6f" event={"ID":"67dd48ce-6361-442d-9552-f06346e4d8d4","Type":"ContainerStarted","Data":"d53c0f93e8679b10c3ffcfb6e3d167bda109cd94e97318cb36ef5707be22a822"} Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.301377 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn" Mar 11 12:00:27 crc 
kubenswrapper[4816]: I0311 12:00:27.301421 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6t4jp" event={"ID":"7aff6a5d-2a66-4ab5-ad53-878f5fea4115","Type":"ContainerStarted","Data":"34b11f77470b1637a3485ad195e041ef69dc18d563a73ae4582797cb787baa83"} Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.301446 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn" Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.335841 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bb6wh" event={"ID":"ba5682ea-6a62-4983-b525-5dc9612ad46d","Type":"ContainerStarted","Data":"5536eee87ca3b26c68c1a05aab8edad334cd285012e57c031d6889b3862a5654"} Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.357499 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn" Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.385599 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4kd2n" event={"ID":"2eaac3e7-6f80-47da-a6c7-e415a0b8edbd","Type":"ContainerStarted","Data":"0b5288ecacf11c9a7b38d572188af36741fc424b7fa7a544bd248cb7b4083cfa"} Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.385723 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4kd2n" event={"ID":"2eaac3e7-6f80-47da-a6c7-e415a0b8edbd","Type":"ContainerStarted","Data":"c4e5735cb75a4c02b5b1e71d4c701f4a1cb48064e14db1dbd61397d5c1842e47"} Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.395433 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-2ltv9" podStartSLOduration=6.395416474 podStartE2EDuration="6.395416474s" 
podCreationTimestamp="2026-03-11 12:00:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:27.322432896 +0000 UTC m=+113.913696863" watchObservedRunningTime="2026-03-11 12:00:27.395416474 +0000 UTC m=+113.986680441" Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.401103 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:27 crc kubenswrapper[4816]: E0311 12:00:27.401164 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:27.901148468 +0000 UTC m=+114.492412435 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.403368 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:27 crc kubenswrapper[4816]: E0311 12:00:27.403773 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:27.903755873 +0000 UTC m=+114.495019830 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.421530 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-t6j7t" event={"ID":"4b9c2804-ee65-4a09-9985-d2345aa7f82a","Type":"ContainerStarted","Data":"311ac96d4d356a7358644c9e2578d6c1fa20ef45622ac37069872540965f4bb7"} Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.430749 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vll2h" event={"ID":"00d6d506-7c84-4fef-9dc9-85f855533c06","Type":"ContainerStarted","Data":"1dcb8225761fe22e5903f2fc7e08b1eeb5c154530d31e4174f16177fd54aca84"} Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.461544 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9znd7" event={"ID":"4ea4ac0c-25f8-4dab-ade6-372cd0ec83d0","Type":"ContainerStarted","Data":"625e9f5d0b2fd0bfae955ffd2e1aa115be2eaf2540a1f99a6a612612eed4193f"} Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.461594 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9znd7" event={"ID":"4ea4ac0c-25f8-4dab-ade6-372cd0ec83d0","Type":"ContainerStarted","Data":"2d4bfe343a06974e898e59211c3213d04af9d8bda390419771b22629d9ef366a"} Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.462694 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9znd7" Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.467022 4816 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-9znd7 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.467109 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9znd7" podUID="4ea4ac0c-25f8-4dab-ade6-372cd0ec83d0" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.512682 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:27 crc kubenswrapper[4816]: E0311 12:00:27.513374 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:28.013359129 +0000 UTC m=+114.604623096 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.514134 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ksjm4" event={"ID":"db49f265-44d3-468b-8e2f-2246b02b57be","Type":"ContainerStarted","Data":"9de55b662b2bf4787bdafff13f7749488700cec65a809df12b9b8647839057ea"} Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.514168 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ksjm4" event={"ID":"db49f265-44d3-468b-8e2f-2246b02b57be","Type":"ContainerStarted","Data":"214e8fc54c8b54257321732922c5740805d1f1b722e1e4edbb6c4d1803062fd4"} Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.529806 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdrwx" event={"ID":"74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea","Type":"ContainerStarted","Data":"20fd568f45bd6095124b7be4e165bad17936ef91eff1dd848b5ee022ef1f11e5"} Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.529851 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdrwx" event={"ID":"74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea","Type":"ContainerStarted","Data":"da70fe91baa5d29aa10ff0569fbce93a2f818c73f5f48d81b341eb82840a9409"} Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.532491 4816 patch_prober.go:28] interesting pod/console-operator-58897d9998-fxsjj 
container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.532549 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-fxsjj" podUID="b7e0b0c2-39e9-4aa5-934b-01abfe80d224" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.545748 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ksjm4" podStartSLOduration=67.545728205 podStartE2EDuration="1m7.545728205s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:27.534469813 +0000 UTC m=+114.125733780" watchObservedRunningTime="2026-03-11 12:00:27.545728205 +0000 UTC m=+114.136992172" Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.546278 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9znd7" podStartSLOduration=67.546272451 podStartE2EDuration="1m7.546272451s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:27.511945809 +0000 UTC m=+114.103209776" watchObservedRunningTime="2026-03-11 12:00:27.546272451 +0000 UTC m=+114.137536418" Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.560546 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn" Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.610185 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.610447 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.616361 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:27 crc kubenswrapper[4816]: E0311 12:00:27.624435 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:28.124405866 +0000 UTC m=+114.715669913 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.722516 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:27 crc kubenswrapper[4816]: E0311 12:00:27.723176 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:28.223157942 +0000 UTC m=+114.814421919 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.827632 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:27 crc kubenswrapper[4816]: E0311 12:00:27.827993 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:28.327973331 +0000 UTC m=+114.919237298 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.886819 4816 patch_prober.go:28] interesting pod/apiserver-76f77b778f-pjsgk container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 11 12:00:27 crc kubenswrapper[4816]: [+]log ok Mar 11 12:00:27 crc kubenswrapper[4816]: [+]etcd ok Mar 11 12:00:27 crc kubenswrapper[4816]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 11 12:00:27 crc kubenswrapper[4816]: [+]poststarthook/generic-apiserver-start-informers ok Mar 11 12:00:27 crc kubenswrapper[4816]: [+]poststarthook/max-in-flight-filter ok Mar 11 12:00:27 crc kubenswrapper[4816]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 11 12:00:27 crc kubenswrapper[4816]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 11 12:00:27 crc kubenswrapper[4816]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 11 12:00:27 crc kubenswrapper[4816]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Mar 11 12:00:27 crc kubenswrapper[4816]: [+]poststarthook/project.openshift.io-projectcache ok Mar 11 12:00:27 crc kubenswrapper[4816]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 11 12:00:27 crc kubenswrapper[4816]: [+]poststarthook/openshift.io-startinformers ok Mar 11 12:00:27 crc kubenswrapper[4816]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 11 12:00:27 crc 
kubenswrapper[4816]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 11 12:00:27 crc kubenswrapper[4816]: livez check failed Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.887111 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" podUID="3af1f0c3-1a92-49f9-beec-dff95561c5dd" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.922103 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-6bx5p"] Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.929040 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:27 crc kubenswrapper[4816]: E0311 12:00:27.929523 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:28.429490776 +0000 UTC m=+115.020754743 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.030214 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:28 crc kubenswrapper[4816]: E0311 12:00:28.030523 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:28.530510466 +0000 UTC m=+115.121774433 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.131224 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:28 crc kubenswrapper[4816]: E0311 12:00:28.131436 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:28.631419353 +0000 UTC m=+115.222683320 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.131640 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:28 crc kubenswrapper[4816]: E0311 12:00:28.131965 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:28.631953748 +0000 UTC m=+115.223217715 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.168591 4816 patch_prober.go:28] interesting pod/router-default-5444994796-6m5gg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 11 12:00:28 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld Mar 11 12:00:28 crc kubenswrapper[4816]: [+]process-running ok Mar 11 12:00:28 crc kubenswrapper[4816]: healthz check failed Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.168651 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6m5gg" podUID="027b1711-77a0-4359-bd98-246217fdb5f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.232434 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:28 crc kubenswrapper[4816]: E0311 12:00:28.232728 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-11 12:00:28.732702831 +0000 UTC m=+115.323966798 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.232884 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:28 crc kubenswrapper[4816]: E0311 12:00:28.233196 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:28.733182985 +0000 UTC m=+115.324446952 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.333821 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:28 crc kubenswrapper[4816]: E0311 12:00:28.333976 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:28.833946778 +0000 UTC m=+115.425210755 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.334034 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/91b59d67-b771-4a57-b2a8-84303ec4d9bd-metrics-certs\") pod \"network-metrics-daemon-tt4rv\" (UID: \"91b59d67-b771-4a57-b2a8-84303ec4d9bd\") " pod="openshift-multus/network-metrics-daemon-tt4rv" Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.334119 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:28 crc kubenswrapper[4816]: E0311 12:00:28.334399 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:28.834388171 +0000 UTC m=+115.425652138 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.342871 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/91b59d67-b771-4a57-b2a8-84303ec4d9bd-metrics-certs\") pod \"network-metrics-daemon-tt4rv\" (UID: \"91b59d67-b771-4a57-b2a8-84303ec4d9bd\") " pod="openshift-multus/network-metrics-daemon-tt4rv" Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.435562 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:28 crc kubenswrapper[4816]: E0311 12:00:28.435763 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:28.935738601 +0000 UTC m=+115.527002568 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.435888 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:28 crc kubenswrapper[4816]: E0311 12:00:28.436242 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:28.936229375 +0000 UTC m=+115.527493342 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.503902 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tt4rv" Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.536837 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:28 crc kubenswrapper[4816]: E0311 12:00:28.537480 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:29.037461282 +0000 UTC m=+115.628725239 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.562267 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vll2h" event={"ID":"00d6d506-7c84-4fef-9dc9-85f855533c06","Type":"ContainerStarted","Data":"e818fd582cfa1c624d153ed346d8d2561dc858f09a93c614502316139405f7df"} Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.563524 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vll2h" Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.571678 4816 patch_prober.go:28] interesting 
pod/packageserver-d55dfcdfc-vll2h container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:5443/healthz\": dial tcp 10.217.0.30:5443: connect: connection refused" start-of-body= Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.571733 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vll2h" podUID="00d6d506-7c84-4fef-9dc9-85f855533c06" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.30:5443/healthz\": dial tcp 10.217.0.30:5443: connect: connection refused" Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.598311 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vll2h" podStartSLOduration=68.598298132 podStartE2EDuration="1m8.598298132s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:28.596643525 +0000 UTC m=+115.187907492" watchObservedRunningTime="2026-03-11 12:00:28.598298132 +0000 UTC m=+115.189562099" Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.604174 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-zln7t" event={"ID":"680978cb-e609-4292-827f-cc8a5b9c1438","Type":"ContainerStarted","Data":"d8e0162229bd3db73cb4b0eeb62e87748a2ab24c3c120f1ce05953a811943f3a"} Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.604215 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-zln7t" event={"ID":"680978cb-e609-4292-827f-cc8a5b9c1438","Type":"ContainerStarted","Data":"b0aa7d78792eb323f1a29568eaf840ba018bb032b7729b69eab01f3e53bcb74f"} Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.625185 4816 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-zln7t" podStartSLOduration=68.625170451 podStartE2EDuration="1m8.625170451s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:28.623481613 +0000 UTC m=+115.214745580" watchObservedRunningTime="2026-03-11 12:00:28.625170451 +0000 UTC m=+115.216434418" Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.625707 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gm7t5" event={"ID":"4b4119d5-f1a1-4d09-83c6-da7decba9ab4","Type":"ContainerStarted","Data":"addfd347014ed52df7794035d7b9a4debe54f61c6d6f028766184333e5e1dc2c"} Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.625749 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gm7t5" event={"ID":"4b4119d5-f1a1-4d09-83c6-da7decba9ab4","Type":"ContainerStarted","Data":"f32ffe01e1a0fcc6604baceed20d820272e5a68932e345d2492cce4a42a677b4"} Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.640074 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:28 crc kubenswrapper[4816]: E0311 12:00:28.642332 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-11 12:00:29.142319322 +0000 UTC m=+115.733583289 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.646982 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4kd2n" event={"ID":"2eaac3e7-6f80-47da-a6c7-e415a0b8edbd","Type":"ContainerStarted","Data":"4f8b0832076daae74be53cca6035926e435c7c3a7f318de1ef6e3199667b54b2"} Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.658824 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gm7t5" podStartSLOduration=68.658809234 podStartE2EDuration="1m8.658809234s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:28.657000212 +0000 UTC m=+115.248264179" watchObservedRunningTime="2026-03-11 12:00:28.658809234 +0000 UTC m=+115.250073201" Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.671106 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k74wh" event={"ID":"36999e5d-2e84-4f16-8c9f-4a2c40a34cd4","Type":"ContainerStarted","Data":"7ce3d816fcccc907bf29f351448d8897be42efa04eaa3896a9e32c00c6103b6f"} Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.693943 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca/service-ca-9c57cc56f-tqt25" event={"ID":"0acb833f-163a-47e1-8fb7-b9bc97b81fe1","Type":"ContainerStarted","Data":"5b923cb3ffe100c411947fc416ed4c455c8878af0d16f2650fa92108bbb1f053"} Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.696079 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4" event={"ID":"1f8d6149-c5b0-4088-9db5-eeed2eef6ce6","Type":"ContainerStarted","Data":"8656da7afe12612a591590b5842a75afd40668d9dd72d7b01fcb55c35787a0e1"} Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.696721 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4" Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.697783 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ksn6f" event={"ID":"67dd48ce-6361-442d-9552-f06346e4d8d4","Type":"ContainerStarted","Data":"fd6b49190e602fec1028c6aa848dd26559f95b874b7a6531d2fd8b5cd2571187"} Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.699316 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6t4jp" event={"ID":"7aff6a5d-2a66-4ab5-ad53-878f5fea4115","Type":"ContainerStarted","Data":"0bede3b68dc0f0441392fc57baf54d9c3af3b0d7760a2f7b93b4cb6aeab6c1ef"} Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.699348 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6t4jp" event={"ID":"7aff6a5d-2a66-4ab5-ad53-878f5fea4115","Type":"ContainerStarted","Data":"a8d8db370a9b41f32c69ce59576c35b65b768e3a327b6f66e777f1f0e6393cc9"} Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.699795 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6t4jp" Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.701365 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdrwx" event={"ID":"74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea","Type":"ContainerStarted","Data":"7db71036cde4280388b996ebdddbfc73d6c04f34960ba91294120ca7c453323a"} Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.714757 4816 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-8gcm4 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.714813 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4" podUID="1f8d6149-c5b0-4088-9db5-eeed2eef6ce6" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.722142 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k74wh" podStartSLOduration=68.722126306 podStartE2EDuration="1m8.722126306s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:28.720654934 +0000 UTC m=+115.311918901" watchObservedRunningTime="2026-03-11 12:00:28.722126306 +0000 UTC m=+115.313390273" Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.723281 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4kd2n" 
podStartSLOduration=68.723275149 podStartE2EDuration="1m8.723275149s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:28.694710491 +0000 UTC m=+115.285974458" watchObservedRunningTime="2026-03-11 12:00:28.723275149 +0000 UTC m=+115.314539116" Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.734314 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-mzkr9" event={"ID":"bb999b74-ac20-4e84-b2c7-b16906afbf06","Type":"ContainerStarted","Data":"854754aef75c5dd4fda6c4b5f198214b9610019973f5ebb2ae801d50d4cf7929"} Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.741393 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:28 crc kubenswrapper[4816]: E0311 12:00:28.741632 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:29.241604173 +0000 UTC m=+115.832868130 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.741725 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:28 crc kubenswrapper[4816]: E0311 12:00:28.743201 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:29.243185808 +0000 UTC m=+115.834449865 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.747303 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-28g7h" event={"ID":"57df17b9-73f2-468a-8359-5a07f19a5493","Type":"ContainerStarted","Data":"0475dc3f14cdd309eba8548944ac45b9a037b6703c4f0cfe3f54656702d6fdde"} Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.754576 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wgxgk" event={"ID":"1a17171b-c738-4862-a2a0-cbb09219322a","Type":"ContainerStarted","Data":"cb948f85958844a1fa9a2001754abbfab8ccf9b60a31973e68c3b5136fc4cf7f"} Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.754636 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wgxgk" event={"ID":"1a17171b-c738-4862-a2a0-cbb09219322a","Type":"ContainerStarted","Data":"9690d0e3275c28410c9a526af6e05325e0184ce4b4f91da89997623600412eaf"} Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.755292 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-wgxgk" Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.756851 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-tqt25" podStartSLOduration=68.756834109 podStartE2EDuration="1m8.756834109s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-03-11 12:00:28.754183733 +0000 UTC m=+115.345447700" watchObservedRunningTime="2026-03-11 12:00:28.756834109 +0000 UTC m=+115.348098076" Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.762779 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-256s6" event={"ID":"750d6f55-7cf7-4376-8ead-6d481db69c2d","Type":"ContainerStarted","Data":"8a2503a1dd8cbc73f797464d5558e356105fc109fc34860490050a8dca4e4e5e"} Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.763829 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-256s6" Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.774723 4816 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-256s6 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" start-of-body= Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.774782 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-256s6" podUID="750d6f55-7cf7-4376-8ead-6d481db69c2d" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.775788 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x96fz" event={"ID":"9a782b5b-9eac-4b5b-8ca8-751111b2459b","Type":"ContainerStarted","Data":"e23b0d7be7a104e344a22c0c1f176758057c7f2b7edb7f10d52760db7c90e4e7"} Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.797355 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-t6j7t" event={"ID":"4b9c2804-ee65-4a09-9985-d2345aa7f82a","Type":"ContainerStarted","Data":"5fbf9043b40ffe58902cbb5afea9100285e835fb24b1122454ed3494446e1a5d"} Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.800838 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ksn6f" podStartSLOduration=68.800799907 podStartE2EDuration="1m8.800799907s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:28.778855099 +0000 UTC m=+115.370119066" watchObservedRunningTime="2026-03-11 12:00:28.800799907 +0000 UTC m=+115.392063874" Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.801278 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-mzkr9" podStartSLOduration=68.80127205 podStartE2EDuration="1m8.80127205s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:28.800456797 +0000 UTC m=+115.391720754" watchObservedRunningTime="2026-03-11 12:00:28.80127205 +0000 UTC m=+115.392536017" Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.812550 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6n4qc" event={"ID":"2517fd5f-6a0b-4ab4-991d-41ac3c9bd0be","Type":"ContainerStarted","Data":"c6d542e698cd1c920cc240a3c13d380658d0156d62a503c40e848230eda9b6e3"} Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.812725 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-6n4qc" Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.824412 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdrwx" podStartSLOduration=68.824395882 podStartE2EDuration="1m8.824395882s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:28.823718313 +0000 UTC m=+115.414982280" watchObservedRunningTime="2026-03-11 12:00:28.824395882 +0000 UTC m=+115.415659849" Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.825585 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-fxsjj" Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.840144 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9znd7" Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.844300 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:28 crc kubenswrapper[4816]: E0311 12:00:28.844457 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:29.344439665 +0000 UTC m=+115.935703632 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.844602 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:28 crc kubenswrapper[4816]: E0311 12:00:28.847060 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:29.34705034 +0000 UTC m=+115.938314307 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.869408 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-28g7h" podStartSLOduration=68.869376309 podStartE2EDuration="1m8.869376309s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:28.860544056 +0000 UTC m=+115.451808013" watchObservedRunningTime="2026-03-11 12:00:28.869376309 +0000 UTC m=+115.460640276" Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.889800 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4" podStartSLOduration=68.889783703 podStartE2EDuration="1m8.889783703s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:28.888314671 +0000 UTC m=+115.479578638" watchObservedRunningTime="2026-03-11 12:00:28.889783703 +0000 UTC m=+115.481047670" Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.922016 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6t4jp" podStartSLOduration=68.921999335 podStartE2EDuration="1m8.921999335s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:28.920829711 +0000 UTC m=+115.512093678" watchObservedRunningTime="2026-03-11 12:00:28.921999335 +0000 UTC m=+115.513263302" Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.945775 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.945996 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.946059 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.946146 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 12:00:28 crc 
kubenswrapper[4816]: I0311 12:00:28.946347 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 12:00:28 crc kubenswrapper[4816]: E0311 12:00:28.947124 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:29.447110683 +0000 UTC m=+116.038374650 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.957352 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.959125 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: 
\"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.964916 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.976534 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.986327 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nv429"] Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.986576 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-nv429" podUID="564c2921-e9eb-4a24-a5b7-1a8471d1586b" containerName="controller-manager" containerID="cri-o://35536e0f12f0b360de404d447220e629a214cf40c465fa086e81ea108295ac6b" gracePeriod=30 Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.988824 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-256s6" podStartSLOduration=68.988803266 podStartE2EDuration="1m8.988803266s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-03-11 12:00:28.981322012 +0000 UTC m=+115.572585979" watchObservedRunningTime="2026-03-11 12:00:28.988803266 +0000 UTC m=+115.580067233" Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.993837 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr"] Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.994015 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr" podUID="ef1d29fc-f278-4f20-8362-3c406634d8ff" containerName="route-controller-manager" containerID="cri-o://067208dd2d05a8f631081581262fd02e620d3152bca9ba1e74aa403cc3cbbfd1" gracePeriod=30 Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.047971 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:29 crc kubenswrapper[4816]: E0311 12:00:29.048424 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:29.548408112 +0000 UTC m=+116.139672079 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.109663 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-tt4rv"] Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.135495 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.141716 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.148876 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:29 crc kubenswrapper[4816]: E0311 12:00:29.149202 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:29.649184965 +0000 UTC m=+116.240448932 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.158184 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.170418 4816 patch_prober.go:28] interesting pod/router-default-5444994796-6m5gg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 11 12:00:29 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld Mar 11 12:00:29 crc kubenswrapper[4816]: [+]process-running ok Mar 11 12:00:29 crc kubenswrapper[4816]: healthz check failed Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.170472 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6m5gg" podUID="027b1711-77a0-4359-bd98-246217fdb5f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 11 12:00:29 crc kubenswrapper[4816]: W0311 12:00:29.170687 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91b59d67_b771_4a57_b2a8_84303ec4d9bd.slice/crio-1fa2bca17353aa45b616c5d370c66b2256c1be892bee9030feaf0182219847a5 WatchSource:0}: Error finding container 1fa2bca17353aa45b616c5d370c66b2256c1be892bee9030feaf0182219847a5: Status 404 returned error can't find the container with id 
1fa2bca17353aa45b616c5d370c66b2256c1be892bee9030feaf0182219847a5 Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.178720 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-wgxgk" podStartSLOduration=8.1787037 podStartE2EDuration="8.1787037s" podCreationTimestamp="2026-03-11 12:00:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:29.145773898 +0000 UTC m=+115.737037865" watchObservedRunningTime="2026-03-11 12:00:29.1787037 +0000 UTC m=+115.769967657" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.178845 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-t6j7t" podStartSLOduration=69.178842794 podStartE2EDuration="1m9.178842794s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:29.173553223 +0000 UTC m=+115.764817180" watchObservedRunningTime="2026-03-11 12:00:29.178842794 +0000 UTC m=+115.770106761" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.225020 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x96fz" podStartSLOduration=69.225000665 podStartE2EDuration="1m9.225000665s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:29.221737661 +0000 UTC m=+115.813001629" watchObservedRunningTime="2026-03-11 12:00:29.225000665 +0000 UTC m=+115.816264632" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.259154 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:29 crc kubenswrapper[4816]: E0311 12:00:29.259467 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:29.759456081 +0000 UTC m=+116.350720048 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.260631 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6n4qc" podStartSLOduration=69.260616604 podStartE2EDuration="1m9.260616604s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:29.258122943 +0000 UTC m=+115.849386910" watchObservedRunningTime="2026-03-11 12:00:29.260616604 +0000 UTC m=+115.851880571" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.362512 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:29 crc kubenswrapper[4816]: E0311 12:00:29.362868 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:29.862854199 +0000 UTC m=+116.454118166 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.429501 4816 ???:1] "http: TLS handshake error from 192.168.126.11:54132: no serving certificate available for the kubelet" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.463967 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:29 crc kubenswrapper[4816]: E0311 12:00:29.464376 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-11 12:00:29.964364514 +0000 UTC m=+116.555628481 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.568547 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:29 crc kubenswrapper[4816]: E0311 12:00:29.569137 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:30.069123272 +0000 UTC m=+116.660387239 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.671008 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:29 crc kubenswrapper[4816]: E0311 12:00:29.671309 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:30.171297775 +0000 UTC m=+116.762561742 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.709056 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr" Mar 11 12:00:29 crc kubenswrapper[4816]: W0311 12:00:29.767169 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-c7c215c47e2ebf54d6c6dd174e46aaf1963e96116392bf7a714f25e370ba78ce WatchSource:0}: Error finding container c7c215c47e2ebf54d6c6dd174e46aaf1963e96116392bf7a714f25e370ba78ce: Status 404 returned error can't find the container with id c7c215c47e2ebf54d6c6dd174e46aaf1963e96116392bf7a714f25e370ba78ce Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.771456 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.771561 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef1d29fc-f278-4f20-8362-3c406634d8ff-serving-cert\") pod \"ef1d29fc-f278-4f20-8362-3c406634d8ff\" (UID: \"ef1d29fc-f278-4f20-8362-3c406634d8ff\") " Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.771604 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef1d29fc-f278-4f20-8362-3c406634d8ff-config\") pod \"ef1d29fc-f278-4f20-8362-3c406634d8ff\" (UID: \"ef1d29fc-f278-4f20-8362-3c406634d8ff\") " Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.771632 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef1d29fc-f278-4f20-8362-3c406634d8ff-client-ca\") pod 
\"ef1d29fc-f278-4f20-8362-3c406634d8ff\" (UID: \"ef1d29fc-f278-4f20-8362-3c406634d8ff\") " Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.771685 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzblg\" (UniqueName: \"kubernetes.io/projected/ef1d29fc-f278-4f20-8362-3c406634d8ff-kube-api-access-fzblg\") pod \"ef1d29fc-f278-4f20-8362-3c406634d8ff\" (UID: \"ef1d29fc-f278-4f20-8362-3c406634d8ff\") " Mar 11 12:00:29 crc kubenswrapper[4816]: E0311 12:00:29.774722 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:30.274695444 +0000 UTC m=+116.865959411 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.779053 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef1d29fc-f278-4f20-8362-3c406634d8ff-config" (OuterVolumeSpecName: "config") pod "ef1d29fc-f278-4f20-8362-3c406634d8ff" (UID: "ef1d29fc-f278-4f20-8362-3c406634d8ff"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.779728 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-nv429" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.781018 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef1d29fc-f278-4f20-8362-3c406634d8ff-client-ca" (OuterVolumeSpecName: "client-ca") pod "ef1d29fc-f278-4f20-8362-3c406634d8ff" (UID: "ef1d29fc-f278-4f20-8362-3c406634d8ff"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.804795 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef1d29fc-f278-4f20-8362-3c406634d8ff-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ef1d29fc-f278-4f20-8362-3c406634d8ff" (UID: "ef1d29fc-f278-4f20-8362-3c406634d8ff"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.806046 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef1d29fc-f278-4f20-8362-3c406634d8ff-kube-api-access-fzblg" (OuterVolumeSpecName: "kube-api-access-fzblg") pod "ef1d29fc-f278-4f20-8362-3c406634d8ff" (UID: "ef1d29fc-f278-4f20-8362-3c406634d8ff"). InnerVolumeSpecName "kube-api-access-fzblg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.872851 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/564c2921-e9eb-4a24-a5b7-1a8471d1586b-proxy-ca-bundles\") pod \"564c2921-e9eb-4a24-a5b7-1a8471d1586b\" (UID: \"564c2921-e9eb-4a24-a5b7-1a8471d1586b\") " Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.873162 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/564c2921-e9eb-4a24-a5b7-1a8471d1586b-config\") pod \"564c2921-e9eb-4a24-a5b7-1a8471d1586b\" (UID: \"564c2921-e9eb-4a24-a5b7-1a8471d1586b\") " Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.873206 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/564c2921-e9eb-4a24-a5b7-1a8471d1586b-client-ca\") pod \"564c2921-e9eb-4a24-a5b7-1a8471d1586b\" (UID: \"564c2921-e9eb-4a24-a5b7-1a8471d1586b\") " Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.873266 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/564c2921-e9eb-4a24-a5b7-1a8471d1586b-serving-cert\") pod \"564c2921-e9eb-4a24-a5b7-1a8471d1586b\" (UID: \"564c2921-e9eb-4a24-a5b7-1a8471d1586b\") " Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.873326 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xw5bq\" (UniqueName: \"kubernetes.io/projected/564c2921-e9eb-4a24-a5b7-1a8471d1586b-kube-api-access-xw5bq\") pod \"564c2921-e9eb-4a24-a5b7-1a8471d1586b\" (UID: \"564c2921-e9eb-4a24-a5b7-1a8471d1586b\") " Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.873637 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.873692 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef1d29fc-f278-4f20-8362-3c406634d8ff-client-ca\") on node \"crc\" DevicePath \"\"" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.873707 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzblg\" (UniqueName: \"kubernetes.io/projected/ef1d29fc-f278-4f20-8362-3c406634d8ff-kube-api-access-fzblg\") on node \"crc\" DevicePath \"\"" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.873720 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef1d29fc-f278-4f20-8362-3c406634d8ff-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.873733 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef1d29fc-f278-4f20-8362-3c406634d8ff-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:00:29 crc kubenswrapper[4816]: E0311 12:00:29.874000 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:30.373986625 +0000 UTC m=+116.965250592 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.874310 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/564c2921-e9eb-4a24-a5b7-1a8471d1586b-config" (OuterVolumeSpecName: "config") pod "564c2921-e9eb-4a24-a5b7-1a8471d1586b" (UID: "564c2921-e9eb-4a24-a5b7-1a8471d1586b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.874715 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tt4rv" event={"ID":"91b59d67-b771-4a57-b2a8-84303ec4d9bd","Type":"ContainerStarted","Data":"6040a1d894051c4055c719c0a52de1f81cd096adc443a2192b997078f04868fa"} Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.874755 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tt4rv" event={"ID":"91b59d67-b771-4a57-b2a8-84303ec4d9bd","Type":"ContainerStarted","Data":"1fa2bca17353aa45b616c5d370c66b2256c1be892bee9030feaf0182219847a5"} Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.875416 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/564c2921-e9eb-4a24-a5b7-1a8471d1586b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "564c2921-e9eb-4a24-a5b7-1a8471d1586b" (UID: "564c2921-e9eb-4a24-a5b7-1a8471d1586b"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.882650 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/564c2921-e9eb-4a24-a5b7-1a8471d1586b-client-ca" (OuterVolumeSpecName: "client-ca") pod "564c2921-e9eb-4a24-a5b7-1a8471d1586b" (UID: "564c2921-e9eb-4a24-a5b7-1a8471d1586b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.893884 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/564c2921-e9eb-4a24-a5b7-1a8471d1586b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "564c2921-e9eb-4a24-a5b7-1a8471d1586b" (UID: "564c2921-e9eb-4a24-a5b7-1a8471d1586b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.897441 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/564c2921-e9eb-4a24-a5b7-1a8471d1586b-kube-api-access-xw5bq" (OuterVolumeSpecName: "kube-api-access-xw5bq") pod "564c2921-e9eb-4a24-a5b7-1a8471d1586b" (UID: "564c2921-e9eb-4a24-a5b7-1a8471d1586b"). InnerVolumeSpecName "kube-api-access-xw5bq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.904486 4816 generic.go:334] "Generic (PLEG): container finished" podID="564c2921-e9eb-4a24-a5b7-1a8471d1586b" containerID="35536e0f12f0b360de404d447220e629a214cf40c465fa086e81ea108295ac6b" exitCode=0 Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.904573 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-nv429" event={"ID":"564c2921-e9eb-4a24-a5b7-1a8471d1586b","Type":"ContainerDied","Data":"35536e0f12f0b360de404d447220e629a214cf40c465fa086e81ea108295ac6b"} Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.904605 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-nv429" event={"ID":"564c2921-e9eb-4a24-a5b7-1a8471d1586b","Type":"ContainerDied","Data":"306382581adac0ac9b7eb96a682fee969c6c0324fd34514acd435886ca5bcb46"} Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.904621 4816 scope.go:117] "RemoveContainer" containerID="35536e0f12f0b360de404d447220e629a214cf40c465fa086e81ea108295ac6b" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.904732 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-nv429" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.950526 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"c7c215c47e2ebf54d6c6dd174e46aaf1963e96116392bf7a714f25e370ba78ce"} Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.971406 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bb6wh" event={"ID":"ba5682ea-6a62-4983-b525-5dc9612ad46d","Type":"ContainerStarted","Data":"4b1b82ea95db10b44cfdd3575432186e33d5528e7acf08d32f9607876280b08f"} Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.975265 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.975537 4816 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/564c2921-e9eb-4a24-a5b7-1a8471d1586b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.975548 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/564c2921-e9eb-4a24-a5b7-1a8471d1586b-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.975571 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/564c2921-e9eb-4a24-a5b7-1a8471d1586b-client-ca\") on node \"crc\" DevicePath \"\"" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.975580 4816 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/564c2921-e9eb-4a24-a5b7-1a8471d1586b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.975589 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xw5bq\" (UniqueName: \"kubernetes.io/projected/564c2921-e9eb-4a24-a5b7-1a8471d1586b-kube-api-access-xw5bq\") on node \"crc\" DevicePath \"\"" Mar 11 12:00:29 crc kubenswrapper[4816]: E0311 12:00:29.975662 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:30.475631813 +0000 UTC m=+117.066895780 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.980199 4816 scope.go:117] "RemoveContainer" containerID="35536e0f12f0b360de404d447220e629a214cf40c465fa086e81ea108295ac6b" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.992770 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nv429"] Mar 11 12:00:29 crc kubenswrapper[4816]: E0311 12:00:29.993005 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35536e0f12f0b360de404d447220e629a214cf40c465fa086e81ea108295ac6b\": container with ID starting with 
35536e0f12f0b360de404d447220e629a214cf40c465fa086e81ea108295ac6b not found: ID does not exist" containerID="35536e0f12f0b360de404d447220e629a214cf40c465fa086e81ea108295ac6b" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.993045 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35536e0f12f0b360de404d447220e629a214cf40c465fa086e81ea108295ac6b"} err="failed to get container status \"35536e0f12f0b360de404d447220e629a214cf40c465fa086e81ea108295ac6b\": rpc error: code = NotFound desc = could not find container \"35536e0f12f0b360de404d447220e629a214cf40c465fa086e81ea108295ac6b\": container with ID starting with 35536e0f12f0b360de404d447220e629a214cf40c465fa086e81ea108295ac6b not found: ID does not exist" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.993204 4816 generic.go:334] "Generic (PLEG): container finished" podID="ef1d29fc-f278-4f20-8362-3c406634d8ff" containerID="067208dd2d05a8f631081581262fd02e620d3152bca9ba1e74aa403cc3cbbfd1" exitCode=0 Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.995080 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.995099 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr" event={"ID":"ef1d29fc-f278-4f20-8362-3c406634d8ff","Type":"ContainerDied","Data":"067208dd2d05a8f631081581262fd02e620d3152bca9ba1e74aa403cc3cbbfd1"} Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.995124 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nv429"] Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.995140 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr" event={"ID":"ef1d29fc-f278-4f20-8362-3c406634d8ff","Type":"ContainerDied","Data":"095fa56b3beb4f734f86a3746d97623146bfffe930c63a78d60c59c578ed0242"} Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.995154 4816 scope.go:117] "RemoveContainer" containerID="067208dd2d05a8f631081581262fd02e620d3152bca9ba1e74aa403cc3cbbfd1" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.998150 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p" podUID="546d4851-e1c7-418b-8ba6-5847e5f9efde" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://94fc872f9120ae3b6c5bc8d7ce09def109b21a972702c2d063763160e11c44c7" gracePeriod=30 Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.014066 4816 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-8gcm4 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.014115 4816 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4" podUID="1f8d6149-c5b0-4088-9db5-eeed2eef6ce6" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.017393 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-256s6" Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.070380 4816 scope.go:117] "RemoveContainer" containerID="067208dd2d05a8f631081581262fd02e620d3152bca9ba1e74aa403cc3cbbfd1" Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.083544 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:30 crc kubenswrapper[4816]: E0311 12:00:30.085679 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:30.585664272 +0000 UTC m=+117.176928239 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.108046 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr"] Mar 11 12:00:30 crc kubenswrapper[4816]: E0311 12:00:30.108616 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"067208dd2d05a8f631081581262fd02e620d3152bca9ba1e74aa403cc3cbbfd1\": container with ID starting with 067208dd2d05a8f631081581262fd02e620d3152bca9ba1e74aa403cc3cbbfd1 not found: ID does not exist" containerID="067208dd2d05a8f631081581262fd02e620d3152bca9ba1e74aa403cc3cbbfd1" Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.108660 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"067208dd2d05a8f631081581262fd02e620d3152bca9ba1e74aa403cc3cbbfd1"} err="failed to get container status \"067208dd2d05a8f631081581262fd02e620d3152bca9ba1e74aa403cc3cbbfd1\": rpc error: code = NotFound desc = could not find container \"067208dd2d05a8f631081581262fd02e620d3152bca9ba1e74aa403cc3cbbfd1\": container with ID starting with 067208dd2d05a8f631081581262fd02e620d3152bca9ba1e74aa403cc3cbbfd1 not found: ID does not exist" Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.110310 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr"] Mar 11 12:00:30 crc kubenswrapper[4816]: W0311 12:00:30.131376 4816 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-2b95c9ae7fa4598a347a3a69f4fa35abe4a733441a2b058dc15b0ea079136ebd WatchSource:0}: Error finding container 2b95c9ae7fa4598a347a3a69f4fa35abe4a733441a2b058dc15b0ea079136ebd: Status 404 returned error can't find the container with id 2b95c9ae7fa4598a347a3a69f4fa35abe4a733441a2b058dc15b0ea079136ebd Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.181056 4816 patch_prober.go:28] interesting pod/router-default-5444994796-6m5gg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 11 12:00:30 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld Mar 11 12:00:30 crc kubenswrapper[4816]: [+]process-running ok Mar 11 12:00:30 crc kubenswrapper[4816]: healthz check failed Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.181099 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6m5gg" podUID="027b1711-77a0-4359-bd98-246217fdb5f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.181802 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="564c2921-e9eb-4a24-a5b7-1a8471d1586b" path="/var/lib/kubelet/pods/564c2921-e9eb-4a24-a5b7-1a8471d1586b/volumes" Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.182356 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef1d29fc-f278-4f20-8362-3c406634d8ff" path="/var/lib/kubelet/pods/ef1d29fc-f278-4f20-8362-3c406634d8ff/volumes" Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.185375 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:30 crc kubenswrapper[4816]: E0311 12:00:30.185660 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:30.685634312 +0000 UTC m=+117.276898279 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.185968 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:30 crc kubenswrapper[4816]: E0311 12:00:30.186488 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:30.686472606 +0000 UTC m=+117.277736573 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.287609 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:30 crc kubenswrapper[4816]: E0311 12:00:30.287775 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:30.787748454 +0000 UTC m=+117.379012421 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.288147 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:30 crc kubenswrapper[4816]: E0311 12:00:30.288563 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:30.788545307 +0000 UTC m=+117.379809274 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.388728 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:30 crc kubenswrapper[4816]: E0311 12:00:30.388910 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:30.888884938 +0000 UTC m=+117.480148905 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.389949 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:30 crc kubenswrapper[4816]: E0311 12:00:30.390293 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:30.890284668 +0000 UTC m=+117.481548635 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.425752 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vll2h" Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.495276 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:30 crc kubenswrapper[4816]: E0311 12:00:30.495501 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:30.995470398 +0000 UTC m=+117.586734375 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.496037 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:30 crc kubenswrapper[4816]: E0311 12:00:30.496472 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:30.996463176 +0000 UTC m=+117.587727143 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.597584 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:30 crc kubenswrapper[4816]: E0311 12:00:30.597685 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:31.097670132 +0000 UTC m=+117.688934099 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.597847 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:30 crc kubenswrapper[4816]: E0311 12:00:30.598116 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:31.098107525 +0000 UTC m=+117.689371492 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.642302 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9fv28"] Mar 11 12:00:30 crc kubenswrapper[4816]: E0311 12:00:30.642549 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef1d29fc-f278-4f20-8362-3c406634d8ff" containerName="route-controller-manager" Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.642569 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef1d29fc-f278-4f20-8362-3c406634d8ff" containerName="route-controller-manager" Mar 11 12:00:30 crc kubenswrapper[4816]: E0311 12:00:30.642591 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="564c2921-e9eb-4a24-a5b7-1a8471d1586b" containerName="controller-manager" Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.642600 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="564c2921-e9eb-4a24-a5b7-1a8471d1586b" containerName="controller-manager" Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.642717 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="564c2921-e9eb-4a24-a5b7-1a8471d1586b" containerName="controller-manager" Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.642741 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef1d29fc-f278-4f20-8362-3c406634d8ff" containerName="route-controller-manager" Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.643583 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9fv28" Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.646181 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.694433 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9fv28"] Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.699013 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.699214 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d6e662d-8633-4e55-baf3-50a2c4d179a1-utilities\") pod \"certified-operators-9fv28\" (UID: \"8d6e662d-8633-4e55-baf3-50a2c4d179a1\") " pod="openshift-marketplace/certified-operators-9fv28" Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.699275 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d6e662d-8633-4e55-baf3-50a2c4d179a1-catalog-content\") pod \"certified-operators-9fv28\" (UID: \"8d6e662d-8633-4e55-baf3-50a2c4d179a1\") " pod="openshift-marketplace/certified-operators-9fv28" Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.699321 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz9fg\" (UniqueName: \"kubernetes.io/projected/8d6e662d-8633-4e55-baf3-50a2c4d179a1-kube-api-access-fz9fg\") pod \"certified-operators-9fv28\" (UID: 
\"8d6e662d-8633-4e55-baf3-50a2c4d179a1\") " pod="openshift-marketplace/certified-operators-9fv28" Mar 11 12:00:30 crc kubenswrapper[4816]: E0311 12:00:30.699418 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:31.199402993 +0000 UTC m=+117.790666960 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.800940 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fz9fg\" (UniqueName: \"kubernetes.io/projected/8d6e662d-8633-4e55-baf3-50a2c4d179a1-kube-api-access-fz9fg\") pod \"certified-operators-9fv28\" (UID: \"8d6e662d-8633-4e55-baf3-50a2c4d179a1\") " pod="openshift-marketplace/certified-operators-9fv28" Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.801016 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d6e662d-8633-4e55-baf3-50a2c4d179a1-utilities\") pod \"certified-operators-9fv28\" (UID: \"8d6e662d-8633-4e55-baf3-50a2c4d179a1\") " pod="openshift-marketplace/certified-operators-9fv28" Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.801071 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d6e662d-8633-4e55-baf3-50a2c4d179a1-catalog-content\") 
pod \"certified-operators-9fv28\" (UID: \"8d6e662d-8633-4e55-baf3-50a2c4d179a1\") " pod="openshift-marketplace/certified-operators-9fv28" Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.801114 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:30 crc kubenswrapper[4816]: E0311 12:00:30.801469 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:31.301454893 +0000 UTC m=+117.892718860 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.801543 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d6e662d-8633-4e55-baf3-50a2c4d179a1-utilities\") pod \"certified-operators-9fv28\" (UID: \"8d6e662d-8633-4e55-baf3-50a2c4d179a1\") " pod="openshift-marketplace/certified-operators-9fv28" Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.801622 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/8d6e662d-8633-4e55-baf3-50a2c4d179a1-catalog-content\") pod \"certified-operators-9fv28\" (UID: \"8d6e662d-8633-4e55-baf3-50a2c4d179a1\") " pod="openshift-marketplace/certified-operators-9fv28" Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.838838 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz9fg\" (UniqueName: \"kubernetes.io/projected/8d6e662d-8633-4e55-baf3-50a2c4d179a1-kube-api-access-fz9fg\") pod \"certified-operators-9fv28\" (UID: \"8d6e662d-8633-4e55-baf3-50a2c4d179a1\") " pod="openshift-marketplace/certified-operators-9fv28" Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.844604 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jwq6f"] Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.845746 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jwq6f" Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.846993 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.854532 4816 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.860862 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jwq6f"] Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.901684 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 
12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.901837 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xchpf\" (UniqueName: \"kubernetes.io/projected/fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e-kube-api-access-xchpf\") pod \"community-operators-jwq6f\" (UID: \"fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e\") " pod="openshift-marketplace/community-operators-jwq6f" Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.901920 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e-catalog-content\") pod \"community-operators-jwq6f\" (UID: \"fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e\") " pod="openshift-marketplace/community-operators-jwq6f" Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.901961 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e-utilities\") pod \"community-operators-jwq6f\" (UID: \"fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e\") " pod="openshift-marketplace/community-operators-jwq6f" Mar 11 12:00:30 crc kubenswrapper[4816]: E0311 12:00:30.902064 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:31.402049322 +0000 UTC m=+117.993313279 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.960546 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9fv28" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.003078 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e-catalog-content\") pod \"community-operators-jwq6f\" (UID: \"fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e\") " pod="openshift-marketplace/community-operators-jwq6f" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.003147 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e-utilities\") pod \"community-operators-jwq6f\" (UID: \"fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e\") " pod="openshift-marketplace/community-operators-jwq6f" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.003195 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xchpf\" (UniqueName: \"kubernetes.io/projected/fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e-kube-api-access-xchpf\") pod \"community-operators-jwq6f\" (UID: \"fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e\") " pod="openshift-marketplace/community-operators-jwq6f" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.003225 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:31 crc kubenswrapper[4816]: E0311 12:00:31.003523 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:31.503511705 +0000 UTC m=+118.094775672 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.004014 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e-catalog-content\") pod \"community-operators-jwq6f\" (UID: \"fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e\") " pod="openshift-marketplace/community-operators-jwq6f" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.004211 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e-utilities\") pod \"community-operators-jwq6f\" (UID: \"fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e\") " pod="openshift-marketplace/community-operators-jwq6f" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.009907 4816 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-multus/network-metrics-daemon-tt4rv" event={"ID":"91b59d67-b771-4a57-b2a8-84303ec4d9bd","Type":"ContainerStarted","Data":"108155b2e0d568f79222b0c35c65ac8628e7eff006cbd9a71937c52d317b6c79"} Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.011556 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"f536214d2d7e45a70f2551a23896f1f27009e74b06c936b5f1274110830510f7"} Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.011584 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"93b0d41aa9711271966eea402d33238cc42a3c444f85d7a083746c26838ae715"} Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.019340 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"275927a2ce16db15a1f7379ebd602e23fb3f5b46bb7a7ad8b9739ad525d8b6c5"} Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.019486 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.022146 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bb6wh" event={"ID":"ba5682ea-6a62-4983-b525-5dc9612ad46d","Type":"ContainerStarted","Data":"c499589a765355dee4120b42d30d815f1b0331b591fde949ecf5e9b984eb905f"} Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.022172 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bb6wh" 
event={"ID":"ba5682ea-6a62-4983-b525-5dc9612ad46d","Type":"ContainerStarted","Data":"7d45b1c0f4524e501322e6b17e727c0a896418327d0a26e8846d2bf9ac2ae2c7"} Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.026242 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"967c9268c14141b6c3f2c9f2dc4498d9ae6f96d221f70dbf1c7dc1457f590425"} Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.026322 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"2b95c9ae7fa4598a347a3a69f4fa35abe4a733441a2b058dc15b0ea079136ebd"} Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.027657 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-tt4rv" podStartSLOduration=71.027642685 podStartE2EDuration="1m11.027642685s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:31.025963567 +0000 UTC m=+117.617227534" watchObservedRunningTime="2026-03-11 12:00:31.027642685 +0000 UTC m=+117.618906662" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.036269 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.044290 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-s2dh2"] Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.044935 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xchpf\" (UniqueName: 
\"kubernetes.io/projected/fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e-kube-api-access-xchpf\") pod \"community-operators-jwq6f\" (UID: \"fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e\") " pod="openshift-marketplace/community-operators-jwq6f" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.045228 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s2dh2" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.056471 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx"] Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.057267 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.057536 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7f7578748c-p527z"] Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.058118 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7f7578748c-p527z" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.064013 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s2dh2"] Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.066719 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.066750 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.066896 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.066934 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.068424 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.069650 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.069756 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.069878 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.070065 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 
12:00:31.070201 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.074563 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.078528 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.082867 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7f7578748c-p527z"] Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.092072 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx"] Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.102872 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.106972 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.107200 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/756dd25b-5375-48bc-8578-a9585ef49e6c-catalog-content\") pod \"certified-operators-s2dh2\" (UID: \"756dd25b-5375-48bc-8578-a9585ef49e6c\") " pod="openshift-marketplace/certified-operators-s2dh2" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.107238 4816 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsxwz\" (UniqueName: \"kubernetes.io/projected/756dd25b-5375-48bc-8578-a9585ef49e6c-kube-api-access-vsxwz\") pod \"certified-operators-s2dh2\" (UID: \"756dd25b-5375-48bc-8578-a9585ef49e6c\") " pod="openshift-marketplace/certified-operators-s2dh2" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.107276 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gmfx\" (UniqueName: \"kubernetes.io/projected/1d5c9149-6a85-4e50-9569-6cc828e55a11-kube-api-access-2gmfx\") pod \"route-controller-manager-6fb4858c9f-v88nx\" (UID: \"1d5c9149-6a85-4e50-9569-6cc828e55a11\") " pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.107381 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c904faa8-338a-4f9c-80fc-bad9d60139a0-serving-cert\") pod \"controller-manager-7f7578748c-p527z\" (UID: \"c904faa8-338a-4f9c-80fc-bad9d60139a0\") " pod="openshift-controller-manager/controller-manager-7f7578748c-p527z" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.107400 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c904faa8-338a-4f9c-80fc-bad9d60139a0-config\") pod \"controller-manager-7f7578748c-p527z\" (UID: \"c904faa8-338a-4f9c-80fc-bad9d60139a0\") " pod="openshift-controller-manager/controller-manager-7f7578748c-p527z" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.107420 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/756dd25b-5375-48bc-8578-a9585ef49e6c-utilities\") pod \"certified-operators-s2dh2\" (UID: 
\"756dd25b-5375-48bc-8578-a9585ef49e6c\") " pod="openshift-marketplace/certified-operators-s2dh2" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.107434 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d5c9149-6a85-4e50-9569-6cc828e55a11-client-ca\") pod \"route-controller-manager-6fb4858c9f-v88nx\" (UID: \"1d5c9149-6a85-4e50-9569-6cc828e55a11\") " pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.107461 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d5c9149-6a85-4e50-9569-6cc828e55a11-serving-cert\") pod \"route-controller-manager-6fb4858c9f-v88nx\" (UID: \"1d5c9149-6a85-4e50-9569-6cc828e55a11\") " pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.107489 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c904faa8-338a-4f9c-80fc-bad9d60139a0-client-ca\") pod \"controller-manager-7f7578748c-p527z\" (UID: \"c904faa8-338a-4f9c-80fc-bad9d60139a0\") " pod="openshift-controller-manager/controller-manager-7f7578748c-p527z" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.107524 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c904faa8-338a-4f9c-80fc-bad9d60139a0-proxy-ca-bundles\") pod \"controller-manager-7f7578748c-p527z\" (UID: \"c904faa8-338a-4f9c-80fc-bad9d60139a0\") " pod="openshift-controller-manager/controller-manager-7f7578748c-p527z" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.107552 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d5c9149-6a85-4e50-9569-6cc828e55a11-config\") pod \"route-controller-manager-6fb4858c9f-v88nx\" (UID: \"1d5c9149-6a85-4e50-9569-6cc828e55a11\") " pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.107586 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpvql\" (UniqueName: \"kubernetes.io/projected/c904faa8-338a-4f9c-80fc-bad9d60139a0-kube-api-access-gpvql\") pod \"controller-manager-7f7578748c-p527z\" (UID: \"c904faa8-338a-4f9c-80fc-bad9d60139a0\") " pod="openshift-controller-manager/controller-manager-7f7578748c-p527z" Mar 11 12:00:31 crc kubenswrapper[4816]: E0311 12:00:31.108355 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:31.608341245 +0000 UTC m=+118.199605212 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.154952 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.172632 4816 patch_prober.go:28] interesting pod/router-default-5444994796-6m5gg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 11 12:00:31 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld Mar 11 12:00:31 crc kubenswrapper[4816]: [+]process-running ok Mar 11 12:00:31 crc kubenswrapper[4816]: healthz check failed Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.172707 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6m5gg" podUID="027b1711-77a0-4359-bd98-246217fdb5f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.183095 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jwq6f" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.209492 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.209533 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/756dd25b-5375-48bc-8578-a9585ef49e6c-catalog-content\") pod \"certified-operators-s2dh2\" (UID: \"756dd25b-5375-48bc-8578-a9585ef49e6c\") " pod="openshift-marketplace/certified-operators-s2dh2" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.209554 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsxwz\" (UniqueName: \"kubernetes.io/projected/756dd25b-5375-48bc-8578-a9585ef49e6c-kube-api-access-vsxwz\") pod \"certified-operators-s2dh2\" (UID: \"756dd25b-5375-48bc-8578-a9585ef49e6c\") " pod="openshift-marketplace/certified-operators-s2dh2" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.209572 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gmfx\" (UniqueName: \"kubernetes.io/projected/1d5c9149-6a85-4e50-9569-6cc828e55a11-kube-api-access-2gmfx\") pod \"route-controller-manager-6fb4858c9f-v88nx\" (UID: \"1d5c9149-6a85-4e50-9569-6cc828e55a11\") " pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.209598 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/c904faa8-338a-4f9c-80fc-bad9d60139a0-serving-cert\") pod \"controller-manager-7f7578748c-p527z\" (UID: \"c904faa8-338a-4f9c-80fc-bad9d60139a0\") " pod="openshift-controller-manager/controller-manager-7f7578748c-p527z" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.209614 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c904faa8-338a-4f9c-80fc-bad9d60139a0-config\") pod \"controller-manager-7f7578748c-p527z\" (UID: \"c904faa8-338a-4f9c-80fc-bad9d60139a0\") " pod="openshift-controller-manager/controller-manager-7f7578748c-p527z" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.209631 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/756dd25b-5375-48bc-8578-a9585ef49e6c-utilities\") pod \"certified-operators-s2dh2\" (UID: \"756dd25b-5375-48bc-8578-a9585ef49e6c\") " pod="openshift-marketplace/certified-operators-s2dh2" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.209647 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d5c9149-6a85-4e50-9569-6cc828e55a11-client-ca\") pod \"route-controller-manager-6fb4858c9f-v88nx\" (UID: \"1d5c9149-6a85-4e50-9569-6cc828e55a11\") " pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.209664 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d5c9149-6a85-4e50-9569-6cc828e55a11-serving-cert\") pod \"route-controller-manager-6fb4858c9f-v88nx\" (UID: \"1d5c9149-6a85-4e50-9569-6cc828e55a11\") " pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.209682 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c904faa8-338a-4f9c-80fc-bad9d60139a0-client-ca\") pod \"controller-manager-7f7578748c-p527z\" (UID: \"c904faa8-338a-4f9c-80fc-bad9d60139a0\") " pod="openshift-controller-manager/controller-manager-7f7578748c-p527z" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.209709 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c904faa8-338a-4f9c-80fc-bad9d60139a0-proxy-ca-bundles\") pod \"controller-manager-7f7578748c-p527z\" (UID: \"c904faa8-338a-4f9c-80fc-bad9d60139a0\") " pod="openshift-controller-manager/controller-manager-7f7578748c-p527z" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.209740 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d5c9149-6a85-4e50-9569-6cc828e55a11-config\") pod \"route-controller-manager-6fb4858c9f-v88nx\" (UID: \"1d5c9149-6a85-4e50-9569-6cc828e55a11\") " pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.209769 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpvql\" (UniqueName: \"kubernetes.io/projected/c904faa8-338a-4f9c-80fc-bad9d60139a0-kube-api-access-gpvql\") pod \"controller-manager-7f7578748c-p527z\" (UID: \"c904faa8-338a-4f9c-80fc-bad9d60139a0\") " pod="openshift-controller-manager/controller-manager-7f7578748c-p527z" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.210533 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/756dd25b-5375-48bc-8578-a9585ef49e6c-utilities\") pod \"certified-operators-s2dh2\" (UID: \"756dd25b-5375-48bc-8578-a9585ef49e6c\") " pod="openshift-marketplace/certified-operators-s2dh2" Mar 11 12:00:31 
crc kubenswrapper[4816]: I0311 12:00:31.211232 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d5c9149-6a85-4e50-9569-6cc828e55a11-client-ca\") pod \"route-controller-manager-6fb4858c9f-v88nx\" (UID: \"1d5c9149-6a85-4e50-9569-6cc828e55a11\") " pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.211481 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c904faa8-338a-4f9c-80fc-bad9d60139a0-config\") pod \"controller-manager-7f7578748c-p527z\" (UID: \"c904faa8-338a-4f9c-80fc-bad9d60139a0\") " pod="openshift-controller-manager/controller-manager-7f7578748c-p527z" Mar 11 12:00:31 crc kubenswrapper[4816]: E0311 12:00:31.211755 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:31.711742353 +0000 UTC m=+118.303006320 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.212113 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/756dd25b-5375-48bc-8578-a9585ef49e6c-catalog-content\") pod \"certified-operators-s2dh2\" (UID: \"756dd25b-5375-48bc-8578-a9585ef49e6c\") " pod="openshift-marketplace/certified-operators-s2dh2" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.230891 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d5c9149-6a85-4e50-9569-6cc828e55a11-config\") pod \"route-controller-manager-6fb4858c9f-v88nx\" (UID: \"1d5c9149-6a85-4e50-9569-6cc828e55a11\") " pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.234538 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c904faa8-338a-4f9c-80fc-bad9d60139a0-proxy-ca-bundles\") pod \"controller-manager-7f7578748c-p527z\" (UID: \"c904faa8-338a-4f9c-80fc-bad9d60139a0\") " pod="openshift-controller-manager/controller-manager-7f7578748c-p527z" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.235169 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c904faa8-338a-4f9c-80fc-bad9d60139a0-serving-cert\") pod \"controller-manager-7f7578748c-p527z\" (UID: 
\"c904faa8-338a-4f9c-80fc-bad9d60139a0\") " pod="openshift-controller-manager/controller-manager-7f7578748c-p527z" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.240609 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c904faa8-338a-4f9c-80fc-bad9d60139a0-client-ca\") pod \"controller-manager-7f7578748c-p527z\" (UID: \"c904faa8-338a-4f9c-80fc-bad9d60139a0\") " pod="openshift-controller-manager/controller-manager-7f7578748c-p527z" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.248138 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpvql\" (UniqueName: \"kubernetes.io/projected/c904faa8-338a-4f9c-80fc-bad9d60139a0-kube-api-access-gpvql\") pod \"controller-manager-7f7578748c-p527z\" (UID: \"c904faa8-338a-4f9c-80fc-bad9d60139a0\") " pod="openshift-controller-manager/controller-manager-7f7578748c-p527z" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.252797 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d5c9149-6a85-4e50-9569-6cc828e55a11-serving-cert\") pod \"route-controller-manager-6fb4858c9f-v88nx\" (UID: \"1d5c9149-6a85-4e50-9569-6cc828e55a11\") " pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.253946 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gmfx\" (UniqueName: \"kubernetes.io/projected/1d5c9149-6a85-4e50-9569-6cc828e55a11-kube-api-access-2gmfx\") pod \"route-controller-manager-6fb4858c9f-v88nx\" (UID: \"1d5c9149-6a85-4e50-9569-6cc828e55a11\") " pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.263111 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsxwz\" (UniqueName: 
\"kubernetes.io/projected/756dd25b-5375-48bc-8578-a9585ef49e6c-kube-api-access-vsxwz\") pod \"certified-operators-s2dh2\" (UID: \"756dd25b-5375-48bc-8578-a9585ef49e6c\") " pod="openshift-marketplace/certified-operators-s2dh2" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.269873 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ndrbx"] Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.271658 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ndrbx" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.276375 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ndrbx"] Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.292549 4816 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-11T12:00:30.854557873Z","Handler":null,"Name":""} Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.299833 4816 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.300034 4816 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.311048 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9fv28"] Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.325852 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.326091 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgdtg\" (UniqueName: \"kubernetes.io/projected/ffe46307-0d92-4864-9aa4-b0ca2fc641d0-kube-api-access-tgdtg\") pod \"community-operators-ndrbx\" (UID: \"ffe46307-0d92-4864-9aa4-b0ca2fc641d0\") " pod="openshift-marketplace/community-operators-ndrbx" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.326133 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffe46307-0d92-4864-9aa4-b0ca2fc641d0-catalog-content\") pod \"community-operators-ndrbx\" (UID: \"ffe46307-0d92-4864-9aa4-b0ca2fc641d0\") " pod="openshift-marketplace/community-operators-ndrbx" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.326148 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffe46307-0d92-4864-9aa4-b0ca2fc641d0-utilities\") pod \"community-operators-ndrbx\" (UID: \"ffe46307-0d92-4864-9aa4-b0ca2fc641d0\") " pod="openshift-marketplace/community-operators-ndrbx" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.351028 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=0.351003578 podStartE2EDuration="351.003578ms" podCreationTimestamp="2026-03-11 12:00:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:31.3332348 +0000 UTC m=+117.924498767" watchObservedRunningTime="2026-03-11 12:00:31.351003578 +0000 
UTC m=+117.942267545" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.363039 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s2dh2" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.378582 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.385511 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.399297 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7f7578748c-p527z" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.427681 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgdtg\" (UniqueName: \"kubernetes.io/projected/ffe46307-0d92-4864-9aa4-b0ca2fc641d0-kube-api-access-tgdtg\") pod \"community-operators-ndrbx\" (UID: \"ffe46307-0d92-4864-9aa4-b0ca2fc641d0\") " pod="openshift-marketplace/community-operators-ndrbx" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.428089 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffe46307-0d92-4864-9aa4-b0ca2fc641d0-catalog-content\") pod \"community-operators-ndrbx\" (UID: \"ffe46307-0d92-4864-9aa4-b0ca2fc641d0\") " pod="openshift-marketplace/community-operators-ndrbx" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.428113 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffe46307-0d92-4864-9aa4-b0ca2fc641d0-utilities\") pod \"community-operators-ndrbx\" (UID: \"ffe46307-0d92-4864-9aa4-b0ca2fc641d0\") " pod="openshift-marketplace/community-operators-ndrbx" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.428557 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffe46307-0d92-4864-9aa4-b0ca2fc641d0-catalog-content\") pod \"community-operators-ndrbx\" (UID: \"ffe46307-0d92-4864-9aa4-b0ca2fc641d0\") " pod="openshift-marketplace/community-operators-ndrbx" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.428648 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.428740 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffe46307-0d92-4864-9aa4-b0ca2fc641d0-utilities\") pod \"community-operators-ndrbx\" (UID: \"ffe46307-0d92-4864-9aa4-b0ca2fc641d0\") " pod="openshift-marketplace/community-operators-ndrbx" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.434125 4816 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.434168 4816 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.451795 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgdtg\" (UniqueName: \"kubernetes.io/projected/ffe46307-0d92-4864-9aa4-b0ca2fc641d0-kube-api-access-tgdtg\") pod \"community-operators-ndrbx\" (UID: \"ffe46307-0d92-4864-9aa4-b0ca2fc641d0\") " pod="openshift-marketplace/community-operators-ndrbx" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.458274 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.494647 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jwq6f"] Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.586355 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.648036 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ndrbx" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.837848 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx"] Mar 11 12:00:31 crc kubenswrapper[4816]: W0311 12:00:31.845455 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d5c9149_6a85_4e50_9569_6cc828e55a11.slice/crio-93986dc36dc0ff7dd563d707eb2a02368f044d4c0b1cecd71fb46a82e85f624f WatchSource:0}: Error finding container 93986dc36dc0ff7dd563d707eb2a02368f044d4c0b1cecd71fb46a82e85f624f: Status 404 returned error can't find the container with id 93986dc36dc0ff7dd563d707eb2a02368f044d4c0b1cecd71fb46a82e85f624f Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.864802 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s2dh2"] Mar 11 12:00:31 crc kubenswrapper[4816]: W0311 12:00:31.877052 4816 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod756dd25b_5375_48bc_8578_a9585ef49e6c.slice/crio-6c14169c95d372913c35e21942a8624231ab375807c0764494279b189e642cec WatchSource:0}: Error finding container 6c14169c95d372913c35e21942a8624231ab375807c0764494279b189e642cec: Status 404 returned error can't find the container with id 6c14169c95d372913c35e21942a8624231ab375807c0764494279b189e642cec Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.910308 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7f7578748c-p527z"] Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.960657 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-p426k"] Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.079719 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2dh2" event={"ID":"756dd25b-5375-48bc-8578-a9585ef49e6c","Type":"ContainerStarted","Data":"6c14169c95d372913c35e21942a8624231ab375807c0764494279b189e642cec"} Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.103654 4816 generic.go:334] "Generic (PLEG): container finished" podID="fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e" containerID="3f3e0d0db447a1ebe4e030b046c6446226abeed16eb18f4857b3f5d5fca2fdbd" exitCode=0 Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.103750 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jwq6f" event={"ID":"fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e","Type":"ContainerDied","Data":"3f3e0d0db447a1ebe4e030b046c6446226abeed16eb18f4857b3f5d5fca2fdbd"} Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.103780 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jwq6f" 
event={"ID":"fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e","Type":"ContainerStarted","Data":"499f7962c1697f289517091d9831d7c624088927518036ee83a281ffd5b62905"} Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.110713 4816 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.113733 4816 generic.go:334] "Generic (PLEG): container finished" podID="8d6e662d-8633-4e55-baf3-50a2c4d179a1" containerID="1fff89a6c486a4c24e56579ae8348f4ab713b43ed67023000094d7aea36a80cb" exitCode=0 Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.113815 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9fv28" event={"ID":"8d6e662d-8633-4e55-baf3-50a2c4d179a1","Type":"ContainerDied","Data":"1fff89a6c486a4c24e56579ae8348f4ab713b43ed67023000094d7aea36a80cb"} Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.113842 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9fv28" event={"ID":"8d6e662d-8633-4e55-baf3-50a2c4d179a1","Type":"ContainerStarted","Data":"e00a61b1b339e0c135f2f8629c96ed94976ec15fddfa98352c7a50768117327d"} Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.122282 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-p426k" event={"ID":"9a7e3709-d407-4679-add6-375a835421be","Type":"ContainerStarted","Data":"ec23157cec86a7144fad1cf7ce6f1de12230714b1e857a2199a9972f099db0a1"} Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.139442 4816 patch_prober.go:28] interesting pod/route-controller-manager-6fb4858c9f-v88nx container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.47:8443/healthz\": dial tcp 10.217.0.47:8443: connect: connection refused" start-of-body= Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.139517 4816 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx" podUID="1d5c9149-6a85-4e50-9569-6cc828e55a11" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.47:8443/healthz\": dial tcp 10.217.0.47:8443: connect: connection refused" Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.157628 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.158726 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx" event={"ID":"1d5c9149-6a85-4e50-9569-6cc828e55a11","Type":"ContainerStarted","Data":"93986dc36dc0ff7dd563d707eb2a02368f044d4c0b1cecd71fb46a82e85f624f"} Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.158783 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx" Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.158798 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f7578748c-p527z" event={"ID":"c904faa8-338a-4f9c-80fc-bad9d60139a0","Type":"ContainerStarted","Data":"868efa371ca880139b71d36617be11ed32ba7747cf0e6a8180c51bf10cbc179c"} Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.161410 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bb6wh" event={"ID":"ba5682ea-6a62-4983-b525-5dc9612ad46d","Type":"ContainerStarted","Data":"96b041a36f6aff46055c454f884cc9dfdcf1e7340fff69a4ab5d8c17f750bc64"} Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.161474 4816 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.183916 4816 patch_prober.go:28] interesting pod/router-default-5444994796-6m5gg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 11 12:00:32 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld Mar 11 12:00:32 crc kubenswrapper[4816]: [+]process-running ok Mar 11 12:00:32 crc kubenswrapper[4816]: healthz check failed Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.183969 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6m5gg" podUID="027b1711-77a0-4359-bd98-246217fdb5f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.184750 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=0.184731483 podStartE2EDuration="184.731483ms" podCreationTimestamp="2026-03-11 12:00:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:32.181847221 +0000 UTC m=+118.773111188" watchObservedRunningTime="2026-03-11 12:00:32.184731483 +0000 UTC m=+118.775995450" Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.200972 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx" podStartSLOduration=3.200956718 podStartE2EDuration="3.200956718s" podCreationTimestamp="2026-03-11 12:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:32.200713161 +0000 UTC 
m=+118.791977118" watchObservedRunningTime="2026-03-11 12:00:32.200956718 +0000 UTC m=+118.792220685" Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.219412 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ndrbx"] Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.510915 4816 patch_prober.go:28] interesting pod/downloads-7954f5f757-dh658 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.510982 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dh658" podUID="8c843417-3e01-48f9-b0b6-845fbbbf7eab" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.511410 4816 patch_prober.go:28] interesting pod/downloads-7954f5f757-dh658 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.511459 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-dh658" podUID="8c843417-3e01-48f9-b0b6-845fbbbf7eab" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.605296 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.609987 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.631290 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-bb6wh" podStartSLOduration=11.63127035 podStartE2EDuration="11.63127035s" podCreationTimestamp="2026-03-11 12:00:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:32.260554723 +0000 UTC m=+118.851818690" watchObservedRunningTime="2026-03-11 12:00:32.63127035 +0000 UTC m=+119.222534317" Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.653544 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rlvrz"] Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.654868 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rlvrz" Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.657282 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.674062 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rlvrz"] Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.751608 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e94af1b5-09ef-433f-91e6-7b352836273d-catalog-content\") pod \"redhat-marketplace-rlvrz\" (UID: \"e94af1b5-09ef-433f-91e6-7b352836273d\") " pod="openshift-marketplace/redhat-marketplace-rlvrz" Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.751966 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e94af1b5-09ef-433f-91e6-7b352836273d-utilities\") pod \"redhat-marketplace-rlvrz\" (UID: \"e94af1b5-09ef-433f-91e6-7b352836273d\") " pod="openshift-marketplace/redhat-marketplace-rlvrz" Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.752010 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lvwq\" (UniqueName: \"kubernetes.io/projected/e94af1b5-09ef-433f-91e6-7b352836273d-kube-api-access-5lvwq\") pod \"redhat-marketplace-rlvrz\" (UID: \"e94af1b5-09ef-433f-91e6-7b352836273d\") " pod="openshift-marketplace/redhat-marketplace-rlvrz" Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.853102 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e94af1b5-09ef-433f-91e6-7b352836273d-utilities\") pod \"redhat-marketplace-rlvrz\" (UID: \"e94af1b5-09ef-433f-91e6-7b352836273d\") " pod="openshift-marketplace/redhat-marketplace-rlvrz" Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.853207 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lvwq\" (UniqueName: \"kubernetes.io/projected/e94af1b5-09ef-433f-91e6-7b352836273d-kube-api-access-5lvwq\") pod \"redhat-marketplace-rlvrz\" (UID: \"e94af1b5-09ef-433f-91e6-7b352836273d\") " pod="openshift-marketplace/redhat-marketplace-rlvrz" Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.853599 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e94af1b5-09ef-433f-91e6-7b352836273d-catalog-content\") pod \"redhat-marketplace-rlvrz\" (UID: \"e94af1b5-09ef-433f-91e6-7b352836273d\") " pod="openshift-marketplace/redhat-marketplace-rlvrz" Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.853846 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e94af1b5-09ef-433f-91e6-7b352836273d-utilities\") pod \"redhat-marketplace-rlvrz\" (UID: \"e94af1b5-09ef-433f-91e6-7b352836273d\") " pod="openshift-marketplace/redhat-marketplace-rlvrz" Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.853951 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e94af1b5-09ef-433f-91e6-7b352836273d-catalog-content\") pod \"redhat-marketplace-rlvrz\" (UID: \"e94af1b5-09ef-433f-91e6-7b352836273d\") " pod="openshift-marketplace/redhat-marketplace-rlvrz" Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.885267 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lvwq\" (UniqueName: \"kubernetes.io/projected/e94af1b5-09ef-433f-91e6-7b352836273d-kube-api-access-5lvwq\") pod \"redhat-marketplace-rlvrz\" (UID: \"e94af1b5-09ef-433f-91e6-7b352836273d\") " pod="openshift-marketplace/redhat-marketplace-rlvrz" Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.970829 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rlvrz" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.035835 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6n4qc" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.041714 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cg4jl"] Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.042703 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cg4jl" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.060267 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cg4jl"] Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.164179 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34f226df-3352-4423-822c-67891ad3a398-utilities\") pod \"redhat-marketplace-cg4jl\" (UID: \"34f226df-3352-4423-822c-67891ad3a398\") " pod="openshift-marketplace/redhat-marketplace-cg4jl" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.165962 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34f226df-3352-4423-822c-67891ad3a398-catalog-content\") pod \"redhat-marketplace-cg4jl\" (UID: \"34f226df-3352-4423-822c-67891ad3a398\") " pod="openshift-marketplace/redhat-marketplace-cg4jl" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.166060 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp48p\" (UniqueName: \"kubernetes.io/projected/34f226df-3352-4423-822c-67891ad3a398-kube-api-access-zp48p\") pod \"redhat-marketplace-cg4jl\" (UID: \"34f226df-3352-4423-822c-67891ad3a398\") " pod="openshift-marketplace/redhat-marketplace-cg4jl" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.166858 4816 patch_prober.go:28] interesting pod/router-default-5444994796-6m5gg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 11 12:00:33 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld Mar 11 12:00:33 crc kubenswrapper[4816]: [+]process-running ok Mar 11 12:00:33 crc kubenswrapper[4816]: healthz 
check failed Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.166893 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6m5gg" podUID="027b1711-77a0-4359-bd98-246217fdb5f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.204713 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-p426k" event={"ID":"9a7e3709-d407-4679-add6-375a835421be","Type":"ContainerStarted","Data":"29a5575a4698992467da37317f3822ef73493ffefd321500908342f4c01a8fdf"} Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.204946 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.210854 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx" event={"ID":"1d5c9149-6a85-4e50-9569-6cc828e55a11","Type":"ContainerStarted","Data":"25f0be79390049105752d95c1c8523ffd3475271c1e00a0aa23883ae8aa13fa1"} Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.214315 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f7578748c-p527z" event={"ID":"c904faa8-338a-4f9c-80fc-bad9d60139a0","Type":"ContainerStarted","Data":"b15503803ae7b01b4347bf2f0cc032c1e2e36293189d891e7329a3636d682710"} Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.214512 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7f7578748c-p527z" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.224485 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx" Mar 11 12:00:33 crc 
kubenswrapper[4816]: I0311 12:00:33.224883 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7f7578748c-p527z" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.230842 4816 generic.go:334] "Generic (PLEG): container finished" podID="756dd25b-5375-48bc-8578-a9585ef49e6c" containerID="c124465b3e74b2d23eadfb691622e8fedf745d36a173b247bc81030f4e6053ac" exitCode=0 Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.231518 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2dh2" event={"ID":"756dd25b-5375-48bc-8578-a9585ef49e6c","Type":"ContainerDied","Data":"c124465b3e74b2d23eadfb691622e8fedf745d36a173b247bc81030f4e6053ac"} Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.238051 4816 generic.go:334] "Generic (PLEG): container finished" podID="ffe46307-0d92-4864-9aa4-b0ca2fc641d0" containerID="db5556901c7abc5e94f121924c25248aaeceba682acde63ea94811a5a4dd7b80" exitCode=0 Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.238862 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ndrbx" event={"ID":"ffe46307-0d92-4864-9aa4-b0ca2fc641d0","Type":"ContainerDied","Data":"db5556901c7abc5e94f121924c25248aaeceba682acde63ea94811a5a4dd7b80"} Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.239591 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ndrbx" event={"ID":"ffe46307-0d92-4864-9aa4-b0ca2fc641d0","Type":"ContainerStarted","Data":"496964d22446ecfb6c504cae509de586a0cc99c038e5375b83e1db6c09ad3706"} Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.240347 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-p426k" podStartSLOduration=73.240314048 podStartE2EDuration="1m13.240314048s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:33.239991688 +0000 UTC m=+119.831255665" watchObservedRunningTime="2026-03-11 12:00:33.240314048 +0000 UTC m=+119.831578015" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.268178 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34f226df-3352-4423-822c-67891ad3a398-utilities\") pod \"redhat-marketplace-cg4jl\" (UID: \"34f226df-3352-4423-822c-67891ad3a398\") " pod="openshift-marketplace/redhat-marketplace-cg4jl" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.268234 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34f226df-3352-4423-822c-67891ad3a398-catalog-content\") pod \"redhat-marketplace-cg4jl\" (UID: \"34f226df-3352-4423-822c-67891ad3a398\") " pod="openshift-marketplace/redhat-marketplace-cg4jl" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.268286 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zp48p\" (UniqueName: \"kubernetes.io/projected/34f226df-3352-4423-822c-67891ad3a398-kube-api-access-zp48p\") pod \"redhat-marketplace-cg4jl\" (UID: \"34f226df-3352-4423-822c-67891ad3a398\") " pod="openshift-marketplace/redhat-marketplace-cg4jl" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.272468 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7f7578748c-p527z" podStartSLOduration=4.272447447 podStartE2EDuration="4.272447447s" podCreationTimestamp="2026-03-11 12:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:33.270834401 +0000 UTC m=+119.862098388" watchObservedRunningTime="2026-03-11 
12:00:33.272447447 +0000 UTC m=+119.863711414" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.293538 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34f226df-3352-4423-822c-67891ad3a398-utilities\") pod \"redhat-marketplace-cg4jl\" (UID: \"34f226df-3352-4423-822c-67891ad3a398\") " pod="openshift-marketplace/redhat-marketplace-cg4jl" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.293970 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34f226df-3352-4423-822c-67891ad3a398-catalog-content\") pod \"redhat-marketplace-cg4jl\" (UID: \"34f226df-3352-4423-822c-67891ad3a398\") " pod="openshift-marketplace/redhat-marketplace-cg4jl" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.328865 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rlvrz"] Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.347149 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp48p\" (UniqueName: \"kubernetes.io/projected/34f226df-3352-4423-822c-67891ad3a398-kube-api-access-zp48p\") pod \"redhat-marketplace-cg4jl\" (UID: \"34f226df-3352-4423-822c-67891ad3a398\") " pod="openshift-marketplace/redhat-marketplace-cg4jl" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.367488 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cg4jl" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.599039 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.658419 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cg4jl"] Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.702441 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.703609 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.707027 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.708542 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.709382 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.778303 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d26fa831-2257-478d-a4dd-9b33c6a59198-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d26fa831-2257-478d-a4dd-9b33c6a59198\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.778383 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/d26fa831-2257-478d-a4dd-9b33c6a59198-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d26fa831-2257-478d-a4dd-9b33c6a59198\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.849811 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jtm2c"] Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.851118 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jtm2c" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.852855 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.853191 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jtm2c"] Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.879862 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d26fa831-2257-478d-a4dd-9b33c6a59198-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d26fa831-2257-478d-a4dd-9b33c6a59198\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.879905 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d26fa831-2257-478d-a4dd-9b33c6a59198-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d26fa831-2257-478d-a4dd-9b33c6a59198\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.879944 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce281163-d6c0-444b-ba55-b488dd77b853-utilities\") pod 
\"redhat-operators-jtm2c\" (UID: \"ce281163-d6c0-444b-ba55-b488dd77b853\") " pod="openshift-marketplace/redhat-operators-jtm2c" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.879962 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce281163-d6c0-444b-ba55-b488dd77b853-catalog-content\") pod \"redhat-operators-jtm2c\" (UID: \"ce281163-d6c0-444b-ba55-b488dd77b853\") " pod="openshift-marketplace/redhat-operators-jtm2c" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.879989 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thth9\" (UniqueName: \"kubernetes.io/projected/ce281163-d6c0-444b-ba55-b488dd77b853-kube-api-access-thth9\") pod \"redhat-operators-jtm2c\" (UID: \"ce281163-d6c0-444b-ba55-b488dd77b853\") " pod="openshift-marketplace/redhat-operators-jtm2c" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.880390 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d26fa831-2257-478d-a4dd-9b33c6a59198-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d26fa831-2257-478d-a4dd-9b33c6a59198\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.902545 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d26fa831-2257-478d-a4dd-9b33c6a59198-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d26fa831-2257-478d-a4dd-9b33c6a59198\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.981079 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce281163-d6c0-444b-ba55-b488dd77b853-catalog-content\") 
pod \"redhat-operators-jtm2c\" (UID: \"ce281163-d6c0-444b-ba55-b488dd77b853\") " pod="openshift-marketplace/redhat-operators-jtm2c" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.981123 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce281163-d6c0-444b-ba55-b488dd77b853-utilities\") pod \"redhat-operators-jtm2c\" (UID: \"ce281163-d6c0-444b-ba55-b488dd77b853\") " pod="openshift-marketplace/redhat-operators-jtm2c" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.981153 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thth9\" (UniqueName: \"kubernetes.io/projected/ce281163-d6c0-444b-ba55-b488dd77b853-kube-api-access-thth9\") pod \"redhat-operators-jtm2c\" (UID: \"ce281163-d6c0-444b-ba55-b488dd77b853\") " pod="openshift-marketplace/redhat-operators-jtm2c" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.981915 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce281163-d6c0-444b-ba55-b488dd77b853-catalog-content\") pod \"redhat-operators-jtm2c\" (UID: \"ce281163-d6c0-444b-ba55-b488dd77b853\") " pod="openshift-marketplace/redhat-operators-jtm2c" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.983138 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce281163-d6c0-444b-ba55-b488dd77b853-utilities\") pod \"redhat-operators-jtm2c\" (UID: \"ce281163-d6c0-444b-ba55-b488dd77b853\") " pod="openshift-marketplace/redhat-operators-jtm2c" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.997611 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thth9\" (UniqueName: \"kubernetes.io/projected/ce281163-d6c0-444b-ba55-b488dd77b853-kube-api-access-thth9\") pod \"redhat-operators-jtm2c\" (UID: 
\"ce281163-d6c0-444b-ba55-b488dd77b853\") " pod="openshift-marketplace/redhat-operators-jtm2c" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.038194 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.046627 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.049798 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-blgl4" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.049823 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-blgl4" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.051678 4816 patch_prober.go:28] interesting pod/console-f9d7485db-blgl4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.26:8443/health\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.051728 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-blgl4" podUID="efc988f7-8a1a-4d22-b6bb-b2617c721017" containerName="console" probeResult="failure" output="Get \"https://10.217.0.26:8443/health\": dial tcp 10.217.0.26:8443: connect: connection refused" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.167339 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-6m5gg" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.188540 4816 patch_prober.go:28] interesting pod/router-default-5444994796-6m5gg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Mar 11 12:00:34 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld Mar 11 12:00:34 crc kubenswrapper[4816]: [+]process-running ok Mar 11 12:00:34 crc kubenswrapper[4816]: healthz check failed Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.188584 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6m5gg" podUID="027b1711-77a0-4359-bd98-246217fdb5f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.192880 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.201540 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jtm2c" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.246793 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8r7jt"] Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.248158 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8r7jt" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.258001 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8r7jt"] Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.285492 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d06617bd-ff11-42b8-9b84-e856c8c3c9eb-utilities\") pod \"redhat-operators-8r7jt\" (UID: \"d06617bd-ff11-42b8-9b84-e856c8c3c9eb\") " pod="openshift-marketplace/redhat-operators-8r7jt" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.285570 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgms4\" (UniqueName: \"kubernetes.io/projected/d06617bd-ff11-42b8-9b84-e856c8c3c9eb-kube-api-access-dgms4\") pod \"redhat-operators-8r7jt\" (UID: \"d06617bd-ff11-42b8-9b84-e856c8c3c9eb\") " pod="openshift-marketplace/redhat-operators-8r7jt" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.285666 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d06617bd-ff11-42b8-9b84-e856c8c3c9eb-catalog-content\") pod \"redhat-operators-8r7jt\" (UID: \"d06617bd-ff11-42b8-9b84-e856c8c3c9eb\") " pod="openshift-marketplace/redhat-operators-8r7jt" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.291523 4816 generic.go:334] "Generic (PLEG): container finished" podID="e94af1b5-09ef-433f-91e6-7b352836273d" containerID="d8fabca9b2997290f11fbf07232f8d58b3654ac767d5341fa694844063002fdc" exitCode=0 Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.292911 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rlvrz" 
event={"ID":"e94af1b5-09ef-433f-91e6-7b352836273d","Type":"ContainerDied","Data":"d8fabca9b2997290f11fbf07232f8d58b3654ac767d5341fa694844063002fdc"} Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.292952 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rlvrz" event={"ID":"e94af1b5-09ef-433f-91e6-7b352836273d","Type":"ContainerStarted","Data":"a8cafecc50e94d07fe579d21307c54f31a39be731f47fedd9b733a84b5d89387"} Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.338476 4816 generic.go:334] "Generic (PLEG): container finished" podID="34f226df-3352-4423-822c-67891ad3a398" containerID="8d741bef2acf64d21213a84551d7a6097e01d2bf4cbc277a3b9b65411d993384" exitCode=0 Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.339537 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cg4jl" event={"ID":"34f226df-3352-4423-822c-67891ad3a398","Type":"ContainerDied","Data":"8d741bef2acf64d21213a84551d7a6097e01d2bf4cbc277a3b9b65411d993384"} Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.339560 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cg4jl" event={"ID":"34f226df-3352-4423-822c-67891ad3a398","Type":"ContainerStarted","Data":"d0c2bd9596db386896db7ff2b9f9f3f47d22ce171a97edaa6f0b88c2da2cae3b"} Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.389999 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgms4\" (UniqueName: \"kubernetes.io/projected/d06617bd-ff11-42b8-9b84-e856c8c3c9eb-kube-api-access-dgms4\") pod \"redhat-operators-8r7jt\" (UID: \"d06617bd-ff11-42b8-9b84-e856c8c3c9eb\") " pod="openshift-marketplace/redhat-operators-8r7jt" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.390054 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/d06617bd-ff11-42b8-9b84-e856c8c3c9eb-catalog-content\") pod \"redhat-operators-8r7jt\" (UID: \"d06617bd-ff11-42b8-9b84-e856c8c3c9eb\") " pod="openshift-marketplace/redhat-operators-8r7jt" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.390203 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d06617bd-ff11-42b8-9b84-e856c8c3c9eb-utilities\") pod \"redhat-operators-8r7jt\" (UID: \"d06617bd-ff11-42b8-9b84-e856c8c3c9eb\") " pod="openshift-marketplace/redhat-operators-8r7jt" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.390732 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d06617bd-ff11-42b8-9b84-e856c8c3c9eb-utilities\") pod \"redhat-operators-8r7jt\" (UID: \"d06617bd-ff11-42b8-9b84-e856c8c3c9eb\") " pod="openshift-marketplace/redhat-operators-8r7jt" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.398487 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d06617bd-ff11-42b8-9b84-e856c8c3c9eb-catalog-content\") pod \"redhat-operators-8r7jt\" (UID: \"d06617bd-ff11-42b8-9b84-e856c8c3c9eb\") " pod="openshift-marketplace/redhat-operators-8r7jt" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.432034 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgms4\" (UniqueName: \"kubernetes.io/projected/d06617bd-ff11-42b8-9b84-e856c8c3c9eb-kube-api-access-dgms4\") pod \"redhat-operators-8r7jt\" (UID: \"d06617bd-ff11-42b8-9b84-e856c8c3c9eb\") " pod="openshift-marketplace/redhat-operators-8r7jt" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.448795 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.450026 4816 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.456202 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.456320 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.465093 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.491872 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/729acc42-ae45-498b-8b45-a0307fa7951e-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"729acc42-ae45-498b-8b45-a0307fa7951e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.492229 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/729acc42-ae45-498b-8b45-a0307fa7951e-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"729acc42-ae45-498b-8b45-a0307fa7951e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.569816 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8r7jt" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.580674 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.593937 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/729acc42-ae45-498b-8b45-a0307fa7951e-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"729acc42-ae45-498b-8b45-a0307fa7951e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.594015 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/729acc42-ae45-498b-8b45-a0307fa7951e-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"729acc42-ae45-498b-8b45-a0307fa7951e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.594150 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/729acc42-ae45-498b-8b45-a0307fa7951e-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"729acc42-ae45-498b-8b45-a0307fa7951e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.604212 4816 ???:1] "http: TLS handshake error from 192.168.126.11:54146: no serving certificate available for the kubelet" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.615677 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/729acc42-ae45-498b-8b45-a0307fa7951e-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"729acc42-ae45-498b-8b45-a0307fa7951e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 11 12:00:34 crc 
kubenswrapper[4816]: E0311 12:00:34.616404 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="94fc872f9120ae3b6c5bc8d7ce09def109b21a972702c2d063763160e11c44c7" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 11 12:00:34 crc kubenswrapper[4816]: E0311 12:00:34.617970 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="94fc872f9120ae3b6c5bc8d7ce09def109b21a972702c2d063763160e11c44c7" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 11 12:00:34 crc kubenswrapper[4816]: E0311 12:00:34.632073 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="94fc872f9120ae3b6c5bc8d7ce09def109b21a972702c2d063763160e11c44c7" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 11 12:00:34 crc kubenswrapper[4816]: E0311 12:00:34.632145 4816 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p" podUID="546d4851-e1c7-418b-8ba6-5847e5f9efde" containerName="kube-multus-additional-cni-plugins" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.791353 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.818267 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jtm2c"] Mar 11 12:00:34 crc kubenswrapper[4816]: W0311 12:00:34.883909 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce281163_d6c0_444b_ba55_b488dd77b853.slice/crio-7e759bdc79b60bb3704deb1a705f04eeaaf47ec7245e831ed02fd21393a8ffe0 WatchSource:0}: Error finding container 7e759bdc79b60bb3704deb1a705f04eeaaf47ec7245e831ed02fd21393a8ffe0: Status 404 returned error can't find the container with id 7e759bdc79b60bb3704deb1a705f04eeaaf47ec7245e831ed02fd21393a8ffe0 Mar 11 12:00:35 crc kubenswrapper[4816]: I0311 12:00:35.057234 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8r7jt"] Mar 11 12:00:35 crc kubenswrapper[4816]: I0311 12:00:35.079680 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 11 12:00:35 crc kubenswrapper[4816]: W0311 12:00:35.090815 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod729acc42_ae45_498b_8b45_a0307fa7951e.slice/crio-3c3778da4fe076eac5dfb343e6905d7832d37f3e3f3b5af3c80125789356c195 WatchSource:0}: Error finding container 3c3778da4fe076eac5dfb343e6905d7832d37f3e3f3b5af3c80125789356c195: Status 404 returned error can't find the container with id 3c3778da4fe076eac5dfb343e6905d7832d37f3e3f3b5af3c80125789356c195 Mar 11 12:00:35 crc kubenswrapper[4816]: I0311 12:00:35.167668 4816 patch_prober.go:28] interesting pod/router-default-5444994796-6m5gg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 11 12:00:35 crc kubenswrapper[4816]: 
[-]has-synced failed: reason withheld Mar 11 12:00:35 crc kubenswrapper[4816]: [+]process-running ok Mar 11 12:00:35 crc kubenswrapper[4816]: healthz check failed Mar 11 12:00:35 crc kubenswrapper[4816]: I0311 12:00:35.167938 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6m5gg" podUID="027b1711-77a0-4359-bd98-246217fdb5f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 11 12:00:35 crc kubenswrapper[4816]: I0311 12:00:35.357329 4816 generic.go:334] "Generic (PLEG): container finished" podID="3c040a86-9614-48cb-9df7-14c83b046dce" containerID="f3bda5d4e49a815a926b2f32c60f3932a76a7181a017078bc20f79926bfbf6a6" exitCode=0 Mar 11 12:00:35 crc kubenswrapper[4816]: I0311 12:00:35.357454 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553840-xpb52" event={"ID":"3c040a86-9614-48cb-9df7-14c83b046dce","Type":"ContainerDied","Data":"f3bda5d4e49a815a926b2f32c60f3932a76a7181a017078bc20f79926bfbf6a6"} Mar 11 12:00:35 crc kubenswrapper[4816]: I0311 12:00:35.360345 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"729acc42-ae45-498b-8b45-a0307fa7951e","Type":"ContainerStarted","Data":"3c3778da4fe076eac5dfb343e6905d7832d37f3e3f3b5af3c80125789356c195"} Mar 11 12:00:35 crc kubenswrapper[4816]: I0311 12:00:35.367289 4816 generic.go:334] "Generic (PLEG): container finished" podID="ce281163-d6c0-444b-ba55-b488dd77b853" containerID="3ab1f4b901f51b92d05dc18c4be8f53411d27fe11dfae52c40ff6b519e7e0cea" exitCode=0 Mar 11 12:00:35 crc kubenswrapper[4816]: I0311 12:00:35.367358 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtm2c" event={"ID":"ce281163-d6c0-444b-ba55-b488dd77b853","Type":"ContainerDied","Data":"3ab1f4b901f51b92d05dc18c4be8f53411d27fe11dfae52c40ff6b519e7e0cea"} Mar 11 12:00:35 crc 
kubenswrapper[4816]: I0311 12:00:35.367389 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtm2c" event={"ID":"ce281163-d6c0-444b-ba55-b488dd77b853","Type":"ContainerStarted","Data":"7e759bdc79b60bb3704deb1a705f04eeaaf47ec7245e831ed02fd21393a8ffe0"} Mar 11 12:00:35 crc kubenswrapper[4816]: I0311 12:00:35.401445 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d26fa831-2257-478d-a4dd-9b33c6a59198","Type":"ContainerStarted","Data":"58707dc69b9de00c9ab7464906274475a3e993f4c7902adf0157e977c06dc9c3"} Mar 11 12:00:35 crc kubenswrapper[4816]: I0311 12:00:35.401530 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d26fa831-2257-478d-a4dd-9b33c6a59198","Type":"ContainerStarted","Data":"3cc4bff0538c91690233bfb9bb26cddd773478266affdc0cbf69df64ec4f1cf1"} Mar 11 12:00:35 crc kubenswrapper[4816]: I0311 12:00:35.406509 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8r7jt" event={"ID":"d06617bd-ff11-42b8-9b84-e856c8c3c9eb","Type":"ContainerStarted","Data":"955eae43fa10fa99fdb4e5d4b56b2dccf5fef672fa575257d946cf0938c80e99"} Mar 11 12:00:35 crc kubenswrapper[4816]: I0311 12:00:35.621888 4816 ???:1] "http: TLS handshake error from 192.168.126.11:53740: no serving certificate available for the kubelet" Mar 11 12:00:36 crc kubenswrapper[4816]: I0311 12:00:36.165984 4816 patch_prober.go:28] interesting pod/router-default-5444994796-6m5gg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 11 12:00:36 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld Mar 11 12:00:36 crc kubenswrapper[4816]: [+]process-running ok Mar 11 12:00:36 crc kubenswrapper[4816]: healthz check failed Mar 11 12:00:36 
crc kubenswrapper[4816]: I0311 12:00:36.166328 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6m5gg" podUID="027b1711-77a0-4359-bd98-246217fdb5f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 11 12:00:36 crc kubenswrapper[4816]: I0311 12:00:36.436913 4816 generic.go:334] "Generic (PLEG): container finished" podID="d26fa831-2257-478d-a4dd-9b33c6a59198" containerID="58707dc69b9de00c9ab7464906274475a3e993f4c7902adf0157e977c06dc9c3" exitCode=0 Mar 11 12:00:36 crc kubenswrapper[4816]: I0311 12:00:36.437023 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d26fa831-2257-478d-a4dd-9b33c6a59198","Type":"ContainerDied","Data":"58707dc69b9de00c9ab7464906274475a3e993f4c7902adf0157e977c06dc9c3"} Mar 11 12:00:36 crc kubenswrapper[4816]: I0311 12:00:36.444837 4816 generic.go:334] "Generic (PLEG): container finished" podID="d06617bd-ff11-42b8-9b84-e856c8c3c9eb" containerID="380c937635389798d03c93252e524044723f38b74947be3bf57ebddd7b48d757" exitCode=0 Mar 11 12:00:36 crc kubenswrapper[4816]: I0311 12:00:36.444954 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8r7jt" event={"ID":"d06617bd-ff11-42b8-9b84-e856c8c3c9eb","Type":"ContainerDied","Data":"380c937635389798d03c93252e524044723f38b74947be3bf57ebddd7b48d757"} Mar 11 12:00:36 crc kubenswrapper[4816]: I0311 12:00:36.456566 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"729acc42-ae45-498b-8b45-a0307fa7951e","Type":"ContainerStarted","Data":"2e0d42da96a573093e348b66293d58e2cc0ebf0eda6f03cabc520124bd4d6901"} Mar 11 12:00:36 crc kubenswrapper[4816]: I0311 12:00:36.483871 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.483850067 
podStartE2EDuration="2.483850067s" podCreationTimestamp="2026-03-11 12:00:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:36.477130345 +0000 UTC m=+123.068394312" watchObservedRunningTime="2026-03-11 12:00:36.483850067 +0000 UTC m=+123.075114034" Mar 11 12:00:37 crc kubenswrapper[4816]: I0311 12:00:37.165009 4816 patch_prober.go:28] interesting pod/router-default-5444994796-6m5gg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 11 12:00:37 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld Mar 11 12:00:37 crc kubenswrapper[4816]: [+]process-running ok Mar 11 12:00:37 crc kubenswrapper[4816]: healthz check failed Mar 11 12:00:37 crc kubenswrapper[4816]: I0311 12:00:37.165072 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6m5gg" podUID="027b1711-77a0-4359-bd98-246217fdb5f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 11 12:00:37 crc kubenswrapper[4816]: I0311 12:00:37.480400 4816 generic.go:334] "Generic (PLEG): container finished" podID="729acc42-ae45-498b-8b45-a0307fa7951e" containerID="2e0d42da96a573093e348b66293d58e2cc0ebf0eda6f03cabc520124bd4d6901" exitCode=0 Mar 11 12:00:37 crc kubenswrapper[4816]: I0311 12:00:37.480490 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"729acc42-ae45-498b-8b45-a0307fa7951e","Type":"ContainerDied","Data":"2e0d42da96a573093e348b66293d58e2cc0ebf0eda6f03cabc520124bd4d6901"} Mar 11 12:00:38 crc kubenswrapper[4816]: I0311 12:00:38.178650 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-6m5gg" Mar 11 12:00:38 crc kubenswrapper[4816]: 
I0311 12:00:38.181304 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-6m5gg" Mar 11 12:00:39 crc kubenswrapper[4816]: I0311 12:00:39.199032 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-wgxgk" Mar 11 12:00:39 crc kubenswrapper[4816]: I0311 12:00:39.905158 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:42 crc kubenswrapper[4816]: I0311 12:00:42.516893 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-dh658" Mar 11 12:00:42 crc kubenswrapper[4816]: I0311 12:00:42.678760 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 11 12:00:42 crc kubenswrapper[4816]: I0311 12:00:42.751784 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/729acc42-ae45-498b-8b45-a0307fa7951e-kubelet-dir\") pod \"729acc42-ae45-498b-8b45-a0307fa7951e\" (UID: \"729acc42-ae45-498b-8b45-a0307fa7951e\") " Mar 11 12:00:42 crc kubenswrapper[4816]: I0311 12:00:42.751881 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/729acc42-ae45-498b-8b45-a0307fa7951e-kube-api-access\") pod \"729acc42-ae45-498b-8b45-a0307fa7951e\" (UID: \"729acc42-ae45-498b-8b45-a0307fa7951e\") " Mar 11 12:00:42 crc kubenswrapper[4816]: I0311 12:00:42.751900 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/729acc42-ae45-498b-8b45-a0307fa7951e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "729acc42-ae45-498b-8b45-a0307fa7951e" (UID: "729acc42-ae45-498b-8b45-a0307fa7951e"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:00:42 crc kubenswrapper[4816]: I0311 12:00:42.752109 4816 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/729acc42-ae45-498b-8b45-a0307fa7951e-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 11 12:00:42 crc kubenswrapper[4816]: I0311 12:00:42.759853 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/729acc42-ae45-498b-8b45-a0307fa7951e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "729acc42-ae45-498b-8b45-a0307fa7951e" (UID: "729acc42-ae45-498b-8b45-a0307fa7951e"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:00:42 crc kubenswrapper[4816]: I0311 12:00:42.854850 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/729acc42-ae45-498b-8b45-a0307fa7951e-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 11 12:00:43 crc kubenswrapper[4816]: I0311 12:00:43.530427 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"729acc42-ae45-498b-8b45-a0307fa7951e","Type":"ContainerDied","Data":"3c3778da4fe076eac5dfb343e6905d7832d37f3e3f3b5af3c80125789356c195"} Mar 11 12:00:43 crc kubenswrapper[4816]: I0311 12:00:43.530465 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c3778da4fe076eac5dfb343e6905d7832d37f3e3f3b5af3c80125789356c195" Mar 11 12:00:43 crc kubenswrapper[4816]: I0311 12:00:43.530472 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 11 12:00:43 crc kubenswrapper[4816]: I0311 12:00:43.966802 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553840-xpb52" Mar 11 12:00:43 crc kubenswrapper[4816]: I0311 12:00:43.973544 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3c040a86-9614-48cb-9df7-14c83b046dce-config-volume\") pod \"3c040a86-9614-48cb-9df7-14c83b046dce\" (UID: \"3c040a86-9614-48cb-9df7-14c83b046dce\") " Mar 11 12:00:43 crc kubenswrapper[4816]: I0311 12:00:43.973606 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3c040a86-9614-48cb-9df7-14c83b046dce-secret-volume\") pod \"3c040a86-9614-48cb-9df7-14c83b046dce\" (UID: \"3c040a86-9614-48cb-9df7-14c83b046dce\") " Mar 11 12:00:43 crc kubenswrapper[4816]: I0311 12:00:43.973666 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hbdh\" (UniqueName: \"kubernetes.io/projected/3c040a86-9614-48cb-9df7-14c83b046dce-kube-api-access-9hbdh\") pod \"3c040a86-9614-48cb-9df7-14c83b046dce\" (UID: \"3c040a86-9614-48cb-9df7-14c83b046dce\") " Mar 11 12:00:43 crc kubenswrapper[4816]: I0311 12:00:43.974488 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c040a86-9614-48cb-9df7-14c83b046dce-config-volume" (OuterVolumeSpecName: "config-volume") pod "3c040a86-9614-48cb-9df7-14c83b046dce" (UID: "3c040a86-9614-48cb-9df7-14c83b046dce"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:00:43 crc kubenswrapper[4816]: I0311 12:00:43.976150 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 11 12:00:43 crc kubenswrapper[4816]: I0311 12:00:43.979825 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c040a86-9614-48cb-9df7-14c83b046dce-kube-api-access-9hbdh" (OuterVolumeSpecName: "kube-api-access-9hbdh") pod "3c040a86-9614-48cb-9df7-14c83b046dce" (UID: "3c040a86-9614-48cb-9df7-14c83b046dce"). InnerVolumeSpecName "kube-api-access-9hbdh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:00:43 crc kubenswrapper[4816]: I0311 12:00:43.980488 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c040a86-9614-48cb-9df7-14c83b046dce-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3c040a86-9614-48cb-9df7-14c83b046dce" (UID: "3c040a86-9614-48cb-9df7-14c83b046dce"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:00:44 crc kubenswrapper[4816]: I0311 12:00:44.074821 4816 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3c040a86-9614-48cb-9df7-14c83b046dce-config-volume\") on node \"crc\" DevicePath \"\"" Mar 11 12:00:44 crc kubenswrapper[4816]: I0311 12:00:44.074853 4816 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3c040a86-9614-48cb-9df7-14c83b046dce-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 11 12:00:44 crc kubenswrapper[4816]: I0311 12:00:44.074863 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hbdh\" (UniqueName: \"kubernetes.io/projected/3c040a86-9614-48cb-9df7-14c83b046dce-kube-api-access-9hbdh\") on node \"crc\" DevicePath \"\"" Mar 11 12:00:44 crc kubenswrapper[4816]: I0311 12:00:44.081361 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-blgl4" Mar 11 12:00:44 crc 
kubenswrapper[4816]: I0311 12:00:44.089082 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-blgl4" Mar 11 12:00:44 crc kubenswrapper[4816]: I0311 12:00:44.176982 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d26fa831-2257-478d-a4dd-9b33c6a59198-kubelet-dir\") pod \"d26fa831-2257-478d-a4dd-9b33c6a59198\" (UID: \"d26fa831-2257-478d-a4dd-9b33c6a59198\") " Mar 11 12:00:44 crc kubenswrapper[4816]: I0311 12:00:44.177300 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d26fa831-2257-478d-a4dd-9b33c6a59198-kube-api-access\") pod \"d26fa831-2257-478d-a4dd-9b33c6a59198\" (UID: \"d26fa831-2257-478d-a4dd-9b33c6a59198\") " Mar 11 12:00:44 crc kubenswrapper[4816]: I0311 12:00:44.180001 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d26fa831-2257-478d-a4dd-9b33c6a59198-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d26fa831-2257-478d-a4dd-9b33c6a59198" (UID: "d26fa831-2257-478d-a4dd-9b33c6a59198"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:00:44 crc kubenswrapper[4816]: I0311 12:00:44.189706 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d26fa831-2257-478d-a4dd-9b33c6a59198-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d26fa831-2257-478d-a4dd-9b33c6a59198" (UID: "d26fa831-2257-478d-a4dd-9b33c6a59198"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:00:44 crc kubenswrapper[4816]: I0311 12:00:44.284974 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d26fa831-2257-478d-a4dd-9b33c6a59198-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 11 12:00:44 crc kubenswrapper[4816]: I0311 12:00:44.285002 4816 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d26fa831-2257-478d-a4dd-9b33c6a59198-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 11 12:00:44 crc kubenswrapper[4816]: I0311 12:00:44.545834 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553840-xpb52" Mar 11 12:00:44 crc kubenswrapper[4816]: I0311 12:00:44.546425 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553840-xpb52" event={"ID":"3c040a86-9614-48cb-9df7-14c83b046dce","Type":"ContainerDied","Data":"dbe4724e3bb10a60d2bcdde00ce0cce01eb1e6f17e7c5b379625a2f60d27762d"} Mar 11 12:00:44 crc kubenswrapper[4816]: I0311 12:00:44.546503 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbe4724e3bb10a60d2bcdde00ce0cce01eb1e6f17e7c5b379625a2f60d27762d" Mar 11 12:00:44 crc kubenswrapper[4816]: I0311 12:00:44.557143 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 11 12:00:44 crc kubenswrapper[4816]: I0311 12:00:44.557174 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d26fa831-2257-478d-a4dd-9b33c6a59198","Type":"ContainerDied","Data":"3cc4bff0538c91690233bfb9bb26cddd773478266affdc0cbf69df64ec4f1cf1"} Mar 11 12:00:44 crc kubenswrapper[4816]: I0311 12:00:44.557222 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3cc4bff0538c91690233bfb9bb26cddd773478266affdc0cbf69df64ec4f1cf1" Mar 11 12:00:44 crc kubenswrapper[4816]: E0311 12:00:44.583970 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="94fc872f9120ae3b6c5bc8d7ce09def109b21a972702c2d063763160e11c44c7" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 11 12:00:44 crc kubenswrapper[4816]: E0311 12:00:44.585636 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="94fc872f9120ae3b6c5bc8d7ce09def109b21a972702c2d063763160e11c44c7" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 11 12:00:44 crc kubenswrapper[4816]: E0311 12:00:44.587485 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="94fc872f9120ae3b6c5bc8d7ce09def109b21a972702c2d063763160e11c44c7" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 11 12:00:44 crc kubenswrapper[4816]: E0311 12:00:44.587548 4816 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, 
stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p" podUID="546d4851-e1c7-418b-8ba6-5847e5f9efde" containerName="kube-multus-additional-cni-plugins" Mar 11 12:00:44 crc kubenswrapper[4816]: I0311 12:00:44.873959 4816 ???:1] "http: TLS handshake error from 192.168.126.11:52352: no serving certificate available for the kubelet" Mar 11 12:00:48 crc kubenswrapper[4816]: I0311 12:00:48.518816 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7f7578748c-p527z"] Mar 11 12:00:48 crc kubenswrapper[4816]: I0311 12:00:48.519829 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7f7578748c-p527z" podUID="c904faa8-338a-4f9c-80fc-bad9d60139a0" containerName="controller-manager" containerID="cri-o://b15503803ae7b01b4347bf2f0cc032c1e2e36293189d891e7329a3636d682710" gracePeriod=30 Mar 11 12:00:48 crc kubenswrapper[4816]: I0311 12:00:48.533180 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx"] Mar 11 12:00:48 crc kubenswrapper[4816]: I0311 12:00:48.533499 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx" podUID="1d5c9149-6a85-4e50-9569-6cc828e55a11" containerName="route-controller-manager" containerID="cri-o://25f0be79390049105752d95c1c8523ffd3475271c1e00a0aa23883ae8aa13fa1" gracePeriod=30 Mar 11 12:00:49 crc kubenswrapper[4816]: I0311 12:00:49.600380 4816 generic.go:334] "Generic (PLEG): container finished" podID="1d5c9149-6a85-4e50-9569-6cc828e55a11" containerID="25f0be79390049105752d95c1c8523ffd3475271c1e00a0aa23883ae8aa13fa1" exitCode=0 Mar 11 12:00:49 crc kubenswrapper[4816]: I0311 12:00:49.600412 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx" event={"ID":"1d5c9149-6a85-4e50-9569-6cc828e55a11","Type":"ContainerDied","Data":"25f0be79390049105752d95c1c8523ffd3475271c1e00a0aa23883ae8aa13fa1"} Mar 11 12:00:49 crc kubenswrapper[4816]: I0311 12:00:49.602635 4816 generic.go:334] "Generic (PLEG): container finished" podID="c904faa8-338a-4f9c-80fc-bad9d60139a0" containerID="b15503803ae7b01b4347bf2f0cc032c1e2e36293189d891e7329a3636d682710" exitCode=0 Mar 11 12:00:49 crc kubenswrapper[4816]: I0311 12:00:49.602664 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f7578748c-p527z" event={"ID":"c904faa8-338a-4f9c-80fc-bad9d60139a0","Type":"ContainerDied","Data":"b15503803ae7b01b4347bf2f0cc032c1e2e36293189d891e7329a3636d682710"} Mar 11 12:00:51 crc kubenswrapper[4816]: I0311 12:00:51.387024 4816 patch_prober.go:28] interesting pod/route-controller-manager-6fb4858c9f-v88nx container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.47:8443/healthz\": dial tcp 10.217.0.47:8443: connect: connection refused" start-of-body= Mar 11 12:00:51 crc kubenswrapper[4816]: I0311 12:00:51.387383 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx" podUID="1d5c9149-6a85-4e50-9569-6cc828e55a11" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.47:8443/healthz\": dial tcp 10.217.0.47:8443: connect: connection refused" Mar 11 12:00:51 crc kubenswrapper[4816]: I0311 12:00:51.592223 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:52 crc kubenswrapper[4816]: I0311 12:00:52.400632 4816 patch_prober.go:28] interesting pod/controller-manager-7f7578748c-p527z container/controller-manager 
namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.48:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 11 12:00:52 crc kubenswrapper[4816]: I0311 12:00:52.400711 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7f7578748c-p527z" podUID="c904faa8-338a-4f9c-80fc-bad9d60139a0" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.48:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 11 12:00:54 crc kubenswrapper[4816]: E0311 12:00:54.584687 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="94fc872f9120ae3b6c5bc8d7ce09def109b21a972702c2d063763160e11c44c7" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 11 12:00:54 crc kubenswrapper[4816]: E0311 12:00:54.586751 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="94fc872f9120ae3b6c5bc8d7ce09def109b21a972702c2d063763160e11c44c7" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 11 12:00:54 crc kubenswrapper[4816]: E0311 12:00:54.588333 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="94fc872f9120ae3b6c5bc8d7ce09def109b21a972702c2d063763160e11c44c7" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 11 12:00:54 crc kubenswrapper[4816]: E0311 12:00:54.588366 4816 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command 
error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p" podUID="546d4851-e1c7-418b-8ba6-5847e5f9efde" containerName="kube-multus-additional-cni-plugins" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.361376 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7f7578748c-p527z" Mar 11 12:00:58 crc kubenswrapper[4816]: E0311 12:00:58.369111 4816 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 11 12:00:58 crc kubenswrapper[4816]: E0311 12:00:58.369372 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fz9fg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-9fv28_openshift-marketplace(8d6e662d-8633-4e55-baf3-50a2c4d179a1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 11 12:00:58 crc kubenswrapper[4816]: E0311 12:00:58.371007 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-9fv28" podUID="8d6e662d-8633-4e55-baf3-50a2c4d179a1" Mar 11 12:00:58 crc 
kubenswrapper[4816]: I0311 12:00:58.390584 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr"] Mar 11 12:00:58 crc kubenswrapper[4816]: E0311 12:00:58.391004 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c904faa8-338a-4f9c-80fc-bad9d60139a0" containerName="controller-manager" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.391102 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="c904faa8-338a-4f9c-80fc-bad9d60139a0" containerName="controller-manager" Mar 11 12:00:58 crc kubenswrapper[4816]: E0311 12:00:58.391171 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="729acc42-ae45-498b-8b45-a0307fa7951e" containerName="pruner" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.391233 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="729acc42-ae45-498b-8b45-a0307fa7951e" containerName="pruner" Mar 11 12:00:58 crc kubenswrapper[4816]: E0311 12:00:58.391333 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c040a86-9614-48cb-9df7-14c83b046dce" containerName="collect-profiles" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.391439 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c040a86-9614-48cb-9df7-14c83b046dce" containerName="collect-profiles" Mar 11 12:00:58 crc kubenswrapper[4816]: E0311 12:00:58.391510 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d26fa831-2257-478d-a4dd-9b33c6a59198" containerName="pruner" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.391569 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="d26fa831-2257-478d-a4dd-9b33c6a59198" containerName="pruner" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.391739 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="c904faa8-338a-4f9c-80fc-bad9d60139a0" containerName="controller-manager" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.401817 
4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="d26fa831-2257-478d-a4dd-9b33c6a59198" containerName="pruner" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.401867 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="729acc42-ae45-498b-8b45-a0307fa7951e" containerName="pruner" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.401886 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c040a86-9614-48cb-9df7-14c83b046dce" containerName="collect-profiles" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.402608 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.407682 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr"] Mar 11 12:00:58 crc kubenswrapper[4816]: E0311 12:00:58.459124 4816 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 11 12:00:58 crc kubenswrapper[4816]: E0311 12:00:58.459380 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vsxwz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-s2dh2_openshift-marketplace(756dd25b-5375-48bc-8578-a9585ef49e6c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 11 12:00:58 crc kubenswrapper[4816]: E0311 12:00:58.460540 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-s2dh2" podUID="756dd25b-5375-48bc-8578-a9585ef49e6c" Mar 11 12:00:58 crc 
kubenswrapper[4816]: E0311 12:00:58.485362 4816 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 11 12:00:58 crc kubenswrapper[4816]: E0311 12:00:58.485526 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xchpf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-jwq6f_openshift-marketplace(fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 11 12:00:58 crc kubenswrapper[4816]: E0311 12:00:58.486611 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-jwq6f" podUID="fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.541862 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c904faa8-338a-4f9c-80fc-bad9d60139a0-client-ca\") pod \"c904faa8-338a-4f9c-80fc-bad9d60139a0\" (UID: \"c904faa8-338a-4f9c-80fc-bad9d60139a0\") " Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.541919 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c904faa8-338a-4f9c-80fc-bad9d60139a0-proxy-ca-bundles\") pod \"c904faa8-338a-4f9c-80fc-bad9d60139a0\" (UID: \"c904faa8-338a-4f9c-80fc-bad9d60139a0\") " Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.541966 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c904faa8-338a-4f9c-80fc-bad9d60139a0-config\") pod \"c904faa8-338a-4f9c-80fc-bad9d60139a0\" (UID: \"c904faa8-338a-4f9c-80fc-bad9d60139a0\") " Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.542012 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpvql\" (UniqueName: \"kubernetes.io/projected/c904faa8-338a-4f9c-80fc-bad9d60139a0-kube-api-access-gpvql\") pod 
\"c904faa8-338a-4f9c-80fc-bad9d60139a0\" (UID: \"c904faa8-338a-4f9c-80fc-bad9d60139a0\") " Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.542771 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c904faa8-338a-4f9c-80fc-bad9d60139a0-client-ca" (OuterVolumeSpecName: "client-ca") pod "c904faa8-338a-4f9c-80fc-bad9d60139a0" (UID: "c904faa8-338a-4f9c-80fc-bad9d60139a0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.542805 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c904faa8-338a-4f9c-80fc-bad9d60139a0-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "c904faa8-338a-4f9c-80fc-bad9d60139a0" (UID: "c904faa8-338a-4f9c-80fc-bad9d60139a0"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.542964 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c904faa8-338a-4f9c-80fc-bad9d60139a0-config" (OuterVolumeSpecName: "config") pod "c904faa8-338a-4f9c-80fc-bad9d60139a0" (UID: "c904faa8-338a-4f9c-80fc-bad9d60139a0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.543017 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c904faa8-338a-4f9c-80fc-bad9d60139a0-serving-cert\") pod \"c904faa8-338a-4f9c-80fc-bad9d60139a0\" (UID: \"c904faa8-338a-4f9c-80fc-bad9d60139a0\") " Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.543239 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27a893ee-c824-4b3a-a1a7-270040291753-config\") pod \"controller-manager-5c8cff94b6-x9hdr\" (UID: \"27a893ee-c824-4b3a-a1a7-270040291753\") " pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.543365 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/27a893ee-c824-4b3a-a1a7-270040291753-serving-cert\") pod \"controller-manager-5c8cff94b6-x9hdr\" (UID: \"27a893ee-c824-4b3a-a1a7-270040291753\") " pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.543394 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgx8p\" (UniqueName: \"kubernetes.io/projected/27a893ee-c824-4b3a-a1a7-270040291753-kube-api-access-fgx8p\") pod \"controller-manager-5c8cff94b6-x9hdr\" (UID: \"27a893ee-c824-4b3a-a1a7-270040291753\") " pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.543430 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/27a893ee-c824-4b3a-a1a7-270040291753-client-ca\") pod 
\"controller-manager-5c8cff94b6-x9hdr\" (UID: \"27a893ee-c824-4b3a-a1a7-270040291753\") " pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.543477 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/27a893ee-c824-4b3a-a1a7-270040291753-proxy-ca-bundles\") pod \"controller-manager-5c8cff94b6-x9hdr\" (UID: \"27a893ee-c824-4b3a-a1a7-270040291753\") " pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.543532 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c904faa8-338a-4f9c-80fc-bad9d60139a0-client-ca\") on node \"crc\" DevicePath \"\"" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.543604 4816 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c904faa8-338a-4f9c-80fc-bad9d60139a0-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.543638 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c904faa8-338a-4f9c-80fc-bad9d60139a0-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.550535 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c904faa8-338a-4f9c-80fc-bad9d60139a0-kube-api-access-gpvql" (OuterVolumeSpecName: "kube-api-access-gpvql") pod "c904faa8-338a-4f9c-80fc-bad9d60139a0" (UID: "c904faa8-338a-4f9c-80fc-bad9d60139a0"). InnerVolumeSpecName "kube-api-access-gpvql". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.551361 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c904faa8-338a-4f9c-80fc-bad9d60139a0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c904faa8-338a-4f9c-80fc-bad9d60139a0" (UID: "c904faa8-338a-4f9c-80fc-bad9d60139a0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.645979 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/27a893ee-c824-4b3a-a1a7-270040291753-proxy-ca-bundles\") pod \"controller-manager-5c8cff94b6-x9hdr\" (UID: \"27a893ee-c824-4b3a-a1a7-270040291753\") " pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.646066 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27a893ee-c824-4b3a-a1a7-270040291753-config\") pod \"controller-manager-5c8cff94b6-x9hdr\" (UID: \"27a893ee-c824-4b3a-a1a7-270040291753\") " pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.646120 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/27a893ee-c824-4b3a-a1a7-270040291753-serving-cert\") pod \"controller-manager-5c8cff94b6-x9hdr\" (UID: \"27a893ee-c824-4b3a-a1a7-270040291753\") " pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.646219 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgx8p\" (UniqueName: 
\"kubernetes.io/projected/27a893ee-c824-4b3a-a1a7-270040291753-kube-api-access-fgx8p\") pod \"controller-manager-5c8cff94b6-x9hdr\" (UID: \"27a893ee-c824-4b3a-a1a7-270040291753\") " pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.646287 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/27a893ee-c824-4b3a-a1a7-270040291753-client-ca\") pod \"controller-manager-5c8cff94b6-x9hdr\" (UID: \"27a893ee-c824-4b3a-a1a7-270040291753\") " pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.646347 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpvql\" (UniqueName: \"kubernetes.io/projected/c904faa8-338a-4f9c-80fc-bad9d60139a0-kube-api-access-gpvql\") on node \"crc\" DevicePath \"\"" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.646393 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c904faa8-338a-4f9c-80fc-bad9d60139a0-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.647337 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/27a893ee-c824-4b3a-a1a7-270040291753-client-ca\") pod \"controller-manager-5c8cff94b6-x9hdr\" (UID: \"27a893ee-c824-4b3a-a1a7-270040291753\") " pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.647637 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/27a893ee-c824-4b3a-a1a7-270040291753-proxy-ca-bundles\") pod \"controller-manager-5c8cff94b6-x9hdr\" (UID: \"27a893ee-c824-4b3a-a1a7-270040291753\") " 
pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.648088 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27a893ee-c824-4b3a-a1a7-270040291753-config\") pod \"controller-manager-5c8cff94b6-x9hdr\" (UID: \"27a893ee-c824-4b3a-a1a7-270040291753\") " pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.651154 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/27a893ee-c824-4b3a-a1a7-270040291753-serving-cert\") pod \"controller-manager-5c8cff94b6-x9hdr\" (UID: \"27a893ee-c824-4b3a-a1a7-270040291753\") " pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.668661 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgx8p\" (UniqueName: \"kubernetes.io/projected/27a893ee-c824-4b3a-a1a7-270040291753-kube-api-access-fgx8p\") pod \"controller-manager-5c8cff94b6-x9hdr\" (UID: \"27a893ee-c824-4b3a-a1a7-270040291753\") " pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.703758 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7f7578748c-p527z" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.705216 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f7578748c-p527z" event={"ID":"c904faa8-338a-4f9c-80fc-bad9d60139a0","Type":"ContainerDied","Data":"868efa371ca880139b71d36617be11ed32ba7747cf0e6a8180c51bf10cbc179c"} Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.705339 4816 scope.go:117] "RemoveContainer" containerID="b15503803ae7b01b4347bf2f0cc032c1e2e36293189d891e7329a3636d682710" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.724152 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.793135 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7f7578748c-p527z"] Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.796768 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7f7578748c-p527z"] Mar 11 12:01:00 crc kubenswrapper[4816]: I0311 12:01:00.136762 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c904faa8-338a-4f9c-80fc-bad9d60139a0" path="/var/lib/kubelet/pods/c904faa8-338a-4f9c-80fc-bad9d60139a0/volumes" Mar 11 12:01:00 crc kubenswrapper[4816]: I0311 12:01:00.716337 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-6bx5p_546d4851-e1c7-418b-8ba6-5847e5f9efde/kube-multus-additional-cni-plugins/0.log" Mar 11 12:01:00 crc kubenswrapper[4816]: I0311 12:01:00.716908 4816 generic.go:334] "Generic (PLEG): container finished" podID="546d4851-e1c7-418b-8ba6-5847e5f9efde" containerID="94fc872f9120ae3b6c5bc8d7ce09def109b21a972702c2d063763160e11c44c7" exitCode=137 Mar 11 12:01:00 crc 
kubenswrapper[4816]: I0311 12:01:00.716959 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p" event={"ID":"546d4851-e1c7-418b-8ba6-5847e5f9efde","Type":"ContainerDied","Data":"94fc872f9120ae3b6c5bc8d7ce09def109b21a972702c2d063763160e11c44c7"} Mar 11 12:01:01 crc kubenswrapper[4816]: E0311 12:01:01.370920 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-9fv28" podUID="8d6e662d-8633-4e55-baf3-50a2c4d179a1" Mar 11 12:01:01 crc kubenswrapper[4816]: E0311 12:01:01.371013 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-jwq6f" podUID="fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e" Mar 11 12:01:01 crc kubenswrapper[4816]: E0311 12:01:01.373303 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-s2dh2" podUID="756dd25b-5375-48bc-8578-a9585ef49e6c" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.423159 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.455102 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl"] Mar 11 12:01:01 crc kubenswrapper[4816]: E0311 12:01:01.457549 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d5c9149-6a85-4e50-9569-6cc828e55a11" containerName="route-controller-manager" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.457572 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d5c9149-6a85-4e50-9569-6cc828e55a11" containerName="route-controller-manager" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.457663 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d5c9149-6a85-4e50-9569-6cc828e55a11" containerName="route-controller-manager" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.458078 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.460054 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl"] Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.588597 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d5c9149-6a85-4e50-9569-6cc828e55a11-client-ca\") pod \"1d5c9149-6a85-4e50-9569-6cc828e55a11\" (UID: \"1d5c9149-6a85-4e50-9569-6cc828e55a11\") " Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.588708 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d5c9149-6a85-4e50-9569-6cc828e55a11-serving-cert\") pod \"1d5c9149-6a85-4e50-9569-6cc828e55a11\" (UID: \"1d5c9149-6a85-4e50-9569-6cc828e55a11\") " Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.588790 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gmfx\" (UniqueName: \"kubernetes.io/projected/1d5c9149-6a85-4e50-9569-6cc828e55a11-kube-api-access-2gmfx\") pod \"1d5c9149-6a85-4e50-9569-6cc828e55a11\" (UID: \"1d5c9149-6a85-4e50-9569-6cc828e55a11\") " Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.588813 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d5c9149-6a85-4e50-9569-6cc828e55a11-config\") pod \"1d5c9149-6a85-4e50-9569-6cc828e55a11\" (UID: \"1d5c9149-6a85-4e50-9569-6cc828e55a11\") " Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.588953 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1-serving-cert\") pod 
\"route-controller-manager-54bddb5f44-mbxnl\" (UID: \"157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1\") " pod="openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.588991 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5695\" (UniqueName: \"kubernetes.io/projected/157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1-kube-api-access-s5695\") pod \"route-controller-manager-54bddb5f44-mbxnl\" (UID: \"157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1\") " pod="openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.589024 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1-config\") pod \"route-controller-manager-54bddb5f44-mbxnl\" (UID: \"157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1\") " pod="openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.589042 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1-client-ca\") pod \"route-controller-manager-54bddb5f44-mbxnl\" (UID: \"157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1\") " pod="openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.590202 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d5c9149-6a85-4e50-9569-6cc828e55a11-config" (OuterVolumeSpecName: "config") pod "1d5c9149-6a85-4e50-9569-6cc828e55a11" (UID: "1d5c9149-6a85-4e50-9569-6cc828e55a11"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.590966 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d5c9149-6a85-4e50-9569-6cc828e55a11-client-ca" (OuterVolumeSpecName: "client-ca") pod "1d5c9149-6a85-4e50-9569-6cc828e55a11" (UID: "1d5c9149-6a85-4e50-9569-6cc828e55a11"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.594311 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d5c9149-6a85-4e50-9569-6cc828e55a11-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1d5c9149-6a85-4e50-9569-6cc828e55a11" (UID: "1d5c9149-6a85-4e50-9569-6cc828e55a11"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.594628 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d5c9149-6a85-4e50-9569-6cc828e55a11-kube-api-access-2gmfx" (OuterVolumeSpecName: "kube-api-access-2gmfx") pod "1d5c9149-6a85-4e50-9569-6cc828e55a11" (UID: "1d5c9149-6a85-4e50-9569-6cc828e55a11"). InnerVolumeSpecName "kube-api-access-2gmfx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.690565 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1-serving-cert\") pod \"route-controller-manager-54bddb5f44-mbxnl\" (UID: \"157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1\") " pod="openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.690636 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5695\" (UniqueName: \"kubernetes.io/projected/157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1-kube-api-access-s5695\") pod \"route-controller-manager-54bddb5f44-mbxnl\" (UID: \"157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1\") " pod="openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.690687 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1-config\") pod \"route-controller-manager-54bddb5f44-mbxnl\" (UID: \"157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1\") " pod="openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.690710 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1-client-ca\") pod \"route-controller-manager-54bddb5f44-mbxnl\" (UID: \"157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1\") " pod="openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.690774 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/1d5c9149-6a85-4e50-9569-6cc828e55a11-client-ca\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.690789 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d5c9149-6a85-4e50-9569-6cc828e55a11-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.690801 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gmfx\" (UniqueName: \"kubernetes.io/projected/1d5c9149-6a85-4e50-9569-6cc828e55a11-kube-api-access-2gmfx\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.690814 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d5c9149-6a85-4e50-9569-6cc828e55a11-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.691935 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1-client-ca\") pod \"route-controller-manager-54bddb5f44-mbxnl\" (UID: \"157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1\") " pod="openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.692818 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1-config\") pod \"route-controller-manager-54bddb5f44-mbxnl\" (UID: \"157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1\") " pod="openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.694314 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1-serving-cert\") pod 
\"route-controller-manager-54bddb5f44-mbxnl\" (UID: \"157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1\") " pod="openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.706011 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5695\" (UniqueName: \"kubernetes.io/projected/157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1-kube-api-access-s5695\") pod \"route-controller-manager-54bddb5f44-mbxnl\" (UID: \"157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1\") " pod="openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.723751 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx" event={"ID":"1d5c9149-6a85-4e50-9569-6cc828e55a11","Type":"ContainerDied","Data":"93986dc36dc0ff7dd563d707eb2a02368f044d4c0b1cecd71fb46a82e85f624f"} Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.723795 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.754012 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx"] Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.754073 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx"] Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.776666 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl" Mar 11 12:01:02 crc kubenswrapper[4816]: I0311 12:01:02.138894 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d5c9149-6a85-4e50-9569-6cc828e55a11" path="/var/lib/kubelet/pods/1d5c9149-6a85-4e50-9569-6cc828e55a11/volumes" Mar 11 12:01:02 crc kubenswrapper[4816]: I0311 12:01:02.386130 4816 patch_prober.go:28] interesting pod/route-controller-manager-6fb4858c9f-v88nx container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.47:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 11 12:01:02 crc kubenswrapper[4816]: I0311 12:01:02.386203 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx" podUID="1d5c9149-6a85-4e50-9569-6cc828e55a11" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.47:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 11 12:01:02 crc kubenswrapper[4816]: I0311 12:01:02.859580 4816 scope.go:117] "RemoveContainer" containerID="25f0be79390049105752d95c1c8523ffd3475271c1e00a0aa23883ae8aa13fa1" Mar 11 12:01:02 crc kubenswrapper[4816]: E0311 12:01:02.904704 4816 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 11 12:01:02 crc kubenswrapper[4816]: E0311 12:01:02.905276 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5lvwq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-rlvrz_openshift-marketplace(e94af1b5-09ef-433f-91e6-7b352836273d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 11 12:01:02 crc kubenswrapper[4816]: I0311 12:01:02.906119 4816 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-6bx5p_546d4851-e1c7-418b-8ba6-5847e5f9efde/kube-multus-additional-cni-plugins/0.log" Mar 11 12:01:02 crc kubenswrapper[4816]: I0311 12:01:02.906187 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p" Mar 11 12:01:02 crc kubenswrapper[4816]: E0311 12:01:02.906466 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-rlvrz" podUID="e94af1b5-09ef-433f-91e6-7b352836273d" Mar 11 12:01:02 crc kubenswrapper[4816]: I0311 12:01:02.909124 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxw9g\" (UniqueName: \"kubernetes.io/projected/546d4851-e1c7-418b-8ba6-5847e5f9efde-kube-api-access-sxw9g\") pod \"546d4851-e1c7-418b-8ba6-5847e5f9efde\" (UID: \"546d4851-e1c7-418b-8ba6-5847e5f9efde\") " Mar 11 12:01:02 crc kubenswrapper[4816]: I0311 12:01:02.909177 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/546d4851-e1c7-418b-8ba6-5847e5f9efde-cni-sysctl-allowlist\") pod \"546d4851-e1c7-418b-8ba6-5847e5f9efde\" (UID: \"546d4851-e1c7-418b-8ba6-5847e5f9efde\") " Mar 11 12:01:02 crc kubenswrapper[4816]: I0311 12:01:02.909240 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/546d4851-e1c7-418b-8ba6-5847e5f9efde-tuning-conf-dir\") pod \"546d4851-e1c7-418b-8ba6-5847e5f9efde\" (UID: \"546d4851-e1c7-418b-8ba6-5847e5f9efde\") " Mar 11 12:01:02 crc kubenswrapper[4816]: I0311 12:01:02.909297 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" 
(UniqueName: \"kubernetes.io/empty-dir/546d4851-e1c7-418b-8ba6-5847e5f9efde-ready\") pod \"546d4851-e1c7-418b-8ba6-5847e5f9efde\" (UID: \"546d4851-e1c7-418b-8ba6-5847e5f9efde\") " Mar 11 12:01:02 crc kubenswrapper[4816]: I0311 12:01:02.909821 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/546d4851-e1c7-418b-8ba6-5847e5f9efde-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "546d4851-e1c7-418b-8ba6-5847e5f9efde" (UID: "546d4851-e1c7-418b-8ba6-5847e5f9efde"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:01:02 crc kubenswrapper[4816]: I0311 12:01:02.910147 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/546d4851-e1c7-418b-8ba6-5847e5f9efde-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "546d4851-e1c7-418b-8ba6-5847e5f9efde" (UID: "546d4851-e1c7-418b-8ba6-5847e5f9efde"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:01:02 crc kubenswrapper[4816]: I0311 12:01:02.910816 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/546d4851-e1c7-418b-8ba6-5847e5f9efde-ready" (OuterVolumeSpecName: "ready") pod "546d4851-e1c7-418b-8ba6-5847e5f9efde" (UID: "546d4851-e1c7-418b-8ba6-5847e5f9efde"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:01:02 crc kubenswrapper[4816]: I0311 12:01:02.914371 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/546d4851-e1c7-418b-8ba6-5847e5f9efde-kube-api-access-sxw9g" (OuterVolumeSpecName: "kube-api-access-sxw9g") pod "546d4851-e1c7-418b-8ba6-5847e5f9efde" (UID: "546d4851-e1c7-418b-8ba6-5847e5f9efde"). InnerVolumeSpecName "kube-api-access-sxw9g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:01:03 crc kubenswrapper[4816]: I0311 12:01:03.010847 4816 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/546d4851-e1c7-418b-8ba6-5847e5f9efde-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:03 crc kubenswrapper[4816]: I0311 12:01:03.010878 4816 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/546d4851-e1c7-418b-8ba6-5847e5f9efde-ready\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:03 crc kubenswrapper[4816]: I0311 12:01:03.010888 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxw9g\" (UniqueName: \"kubernetes.io/projected/546d4851-e1c7-418b-8ba6-5847e5f9efde-kube-api-access-sxw9g\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:03 crc kubenswrapper[4816]: I0311 12:01:03.010898 4816 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/546d4851-e1c7-418b-8ba6-5847e5f9efde-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:03 crc kubenswrapper[4816]: I0311 12:01:03.735989 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-6bx5p_546d4851-e1c7-418b-8ba6-5847e5f9efde/kube-multus-additional-cni-plugins/0.log" Mar 11 12:01:03 crc kubenswrapper[4816]: I0311 12:01:03.736108 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p" Mar 11 12:01:03 crc kubenswrapper[4816]: I0311 12:01:03.736095 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p" event={"ID":"546d4851-e1c7-418b-8ba6-5847e5f9efde","Type":"ContainerDied","Data":"bb301579c908efd9a833ba2c76294edf97abc1c238aa669d3b8696cb61fa9a56"} Mar 11 12:01:03 crc kubenswrapper[4816]: I0311 12:01:03.775707 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-6bx5p"] Mar 11 12:01:03 crc kubenswrapper[4816]: I0311 12:01:03.776353 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-6bx5p"] Mar 11 12:01:04 crc kubenswrapper[4816]: I0311 12:01:04.136582 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="546d4851-e1c7-418b-8ba6-5847e5f9efde" path="/var/lib/kubelet/pods/546d4851-e1c7-418b-8ba6-5847e5f9efde/volumes" Mar 11 12:01:04 crc kubenswrapper[4816]: I0311 12:01:04.213816 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6t4jp" Mar 11 12:01:06 crc kubenswrapper[4816]: I0311 12:01:06.025106 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 11 12:01:06 crc kubenswrapper[4816]: E0311 12:01:06.025694 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="546d4851-e1c7-418b-8ba6-5847e5f9efde" containerName="kube-multus-additional-cni-plugins" Mar 11 12:01:06 crc kubenswrapper[4816]: I0311 12:01:06.025708 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="546d4851-e1c7-418b-8ba6-5847e5f9efde" containerName="kube-multus-additional-cni-plugins" Mar 11 12:01:06 crc kubenswrapper[4816]: I0311 12:01:06.025814 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="546d4851-e1c7-418b-8ba6-5847e5f9efde" 
containerName="kube-multus-additional-cni-plugins" Mar 11 12:01:06 crc kubenswrapper[4816]: I0311 12:01:06.026298 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 11 12:01:06 crc kubenswrapper[4816]: I0311 12:01:06.028440 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 11 12:01:06 crc kubenswrapper[4816]: I0311 12:01:06.028684 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 11 12:01:06 crc kubenswrapper[4816]: I0311 12:01:06.044391 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 11 12:01:06 crc kubenswrapper[4816]: I0311 12:01:06.048914 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1a8048f1-34ce-48b3-a273-bc4905efd9a0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"1a8048f1-34ce-48b3-a273-bc4905efd9a0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 11 12:01:06 crc kubenswrapper[4816]: I0311 12:01:06.048967 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1a8048f1-34ce-48b3-a273-bc4905efd9a0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"1a8048f1-34ce-48b3-a273-bc4905efd9a0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 11 12:01:06 crc kubenswrapper[4816]: I0311 12:01:06.149698 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1a8048f1-34ce-48b3-a273-bc4905efd9a0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"1a8048f1-34ce-48b3-a273-bc4905efd9a0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 11 12:01:06 crc 
kubenswrapper[4816]: I0311 12:01:06.149743 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1a8048f1-34ce-48b3-a273-bc4905efd9a0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"1a8048f1-34ce-48b3-a273-bc4905efd9a0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 11 12:01:06 crc kubenswrapper[4816]: I0311 12:01:06.150127 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1a8048f1-34ce-48b3-a273-bc4905efd9a0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"1a8048f1-34ce-48b3-a273-bc4905efd9a0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 11 12:01:06 crc kubenswrapper[4816]: I0311 12:01:06.176937 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1a8048f1-34ce-48b3-a273-bc4905efd9a0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"1a8048f1-34ce-48b3-a273-bc4905efd9a0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 11 12:01:06 crc kubenswrapper[4816]: I0311 12:01:06.352325 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 11 12:01:07 crc kubenswrapper[4816]: E0311 12:01:07.070835 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-rlvrz" podUID="e94af1b5-09ef-433f-91e6-7b352836273d" Mar 11 12:01:07 crc kubenswrapper[4816]: I0311 12:01:07.445290 4816 scope.go:117] "RemoveContainer" containerID="94fc872f9120ae3b6c5bc8d7ce09def109b21a972702c2d063763160e11c44c7" Mar 11 12:01:07 crc kubenswrapper[4816]: I0311 12:01:07.666404 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr"] Mar 11 12:01:07 crc kubenswrapper[4816]: I0311 12:01:07.696356 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl"] Mar 11 12:01:07 crc kubenswrapper[4816]: I0311 12:01:07.764471 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl" event={"ID":"157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1","Type":"ContainerStarted","Data":"4d23328b31768c096f02f39298c1b22ed736efa862708623ddbcd093bc7ea791"} Mar 11 12:01:07 crc kubenswrapper[4816]: I0311 12:01:07.765442 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 11 12:01:07 crc kubenswrapper[4816]: I0311 12:01:07.766083 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" event={"ID":"27a893ee-c824-4b3a-a1a7-270040291753","Type":"ContainerStarted","Data":"6d8a80e724695a853b72862fcfc4d7a6e02121bd48950e2a771b5ef4a04ee4b8"} Mar 11 12:01:07 crc kubenswrapper[4816]: W0311 12:01:07.777529 4816 manager.go:1169] Failed 
to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod1a8048f1_34ce_48b3_a273_bc4905efd9a0.slice/crio-a59dfcea4654114bb3d368003f90d6affcd94be583c75e559ea9e65d751021dd WatchSource:0}: Error finding container a59dfcea4654114bb3d368003f90d6affcd94be583c75e559ea9e65d751021dd: Status 404 returned error can't find the container with id a59dfcea4654114bb3d368003f90d6affcd94be583c75e559ea9e65d751021dd Mar 11 12:01:08 crc kubenswrapper[4816]: I0311 12:01:08.612294 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr"] Mar 11 12:01:08 crc kubenswrapper[4816]: I0311 12:01:08.707675 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl"] Mar 11 12:01:08 crc kubenswrapper[4816]: I0311 12:01:08.787994 4816 generic.go:334] "Generic (PLEG): container finished" podID="34f226df-3352-4423-822c-67891ad3a398" containerID="796e9f8f75ca6804fd156b8767c81403ef730e702d67b109de63a9348c222382" exitCode=0 Mar 11 12:01:08 crc kubenswrapper[4816]: I0311 12:01:08.788095 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cg4jl" event={"ID":"34f226df-3352-4423-822c-67891ad3a398","Type":"ContainerDied","Data":"796e9f8f75ca6804fd156b8767c81403ef730e702d67b109de63a9348c222382"} Mar 11 12:01:08 crc kubenswrapper[4816]: I0311 12:01:08.791358 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" event={"ID":"27a893ee-c824-4b3a-a1a7-270040291753","Type":"ContainerStarted","Data":"e34c7fd51c419ebcbf2509f97778647782961f1976493ea35a4a759ea50660ec"} Mar 11 12:01:08 crc kubenswrapper[4816]: I0311 12:01:08.792445 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" Mar 11 12:01:08 crc kubenswrapper[4816]: I0311 12:01:08.794607 4816 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8r7jt" event={"ID":"d06617bd-ff11-42b8-9b84-e856c8c3c9eb","Type":"ContainerStarted","Data":"278bfa92b7f4cffdd11e1b735e41bf4ff1bd011a4b249bf92a3fc0359707218a"} Mar 11 12:01:08 crc kubenswrapper[4816]: I0311 12:01:08.799455 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"1a8048f1-34ce-48b3-a273-bc4905efd9a0","Type":"ContainerStarted","Data":"6f08b7a0dd010a2b212934117f9352279e1a3b6e57801752f9f48c9d6215f346"} Mar 11 12:01:08 crc kubenswrapper[4816]: I0311 12:01:08.799497 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"1a8048f1-34ce-48b3-a273-bc4905efd9a0","Type":"ContainerStarted","Data":"a59dfcea4654114bb3d368003f90d6affcd94be583c75e559ea9e65d751021dd"} Mar 11 12:01:08 crc kubenswrapper[4816]: I0311 12:01:08.801771 4816 generic.go:334] "Generic (PLEG): container finished" podID="ffe46307-0d92-4864-9aa4-b0ca2fc641d0" containerID="ad1b4e0d64cb2dee4b17f813f254e7a567bae3986d0eb4252bf8e91b220edf23" exitCode=0 Mar 11 12:01:08 crc kubenswrapper[4816]: I0311 12:01:08.801843 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ndrbx" event={"ID":"ffe46307-0d92-4864-9aa4-b0ca2fc641d0","Type":"ContainerDied","Data":"ad1b4e0d64cb2dee4b17f813f254e7a567bae3986d0eb4252bf8e91b220edf23"} Mar 11 12:01:08 crc kubenswrapper[4816]: I0311 12:01:08.805394 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtm2c" event={"ID":"ce281163-d6c0-444b-ba55-b488dd77b853","Type":"ContainerStarted","Data":"a7c62a5bd8897be83a617ee46b4e99b960de1d3c06824d144ebcc6c092953124"} Mar 11 12:01:08 crc kubenswrapper[4816]: I0311 12:01:08.806064 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" Mar 11 12:01:08 crc kubenswrapper[4816]: I0311 12:01:08.815983 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl" event={"ID":"157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1","Type":"ContainerStarted","Data":"fd8060a81740d2d82d00ee7b672322ac337e0a0886f149a3bf4e8ecff6b410c9"} Mar 11 12:01:08 crc kubenswrapper[4816]: I0311 12:01:08.816499 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl" Mar 11 12:01:08 crc kubenswrapper[4816]: I0311 12:01:08.825388 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl" Mar 11 12:01:08 crc kubenswrapper[4816]: I0311 12:01:08.861319 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" podStartSLOduration=20.861303283 podStartE2EDuration="20.861303283s" podCreationTimestamp="2026-03-11 12:00:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:01:08.85980324 +0000 UTC m=+155.451067207" watchObservedRunningTime="2026-03-11 12:01:08.861303283 +0000 UTC m=+155.452567250" Mar 11 12:01:08 crc kubenswrapper[4816]: I0311 12:01:08.924193 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.924173552 podStartE2EDuration="2.924173552s" podCreationTimestamp="2026-03-11 12:01:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:01:08.921732092 +0000 UTC m=+155.512996059" watchObservedRunningTime="2026-03-11 
12:01:08.924173552 +0000 UTC m=+155.515437519" Mar 11 12:01:08 crc kubenswrapper[4816]: I0311 12:01:08.944657 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl" podStartSLOduration=20.944639338 podStartE2EDuration="20.944639338s" podCreationTimestamp="2026-03-11 12:00:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:01:08.944185245 +0000 UTC m=+155.535449232" watchObservedRunningTime="2026-03-11 12:01:08.944639338 +0000 UTC m=+155.535903305" Mar 11 12:01:09 crc kubenswrapper[4816]: I0311 12:01:09.149090 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 12:01:09 crc kubenswrapper[4816]: I0311 12:01:09.824160 4816 generic.go:334] "Generic (PLEG): container finished" podID="1a8048f1-34ce-48b3-a273-bc4905efd9a0" containerID="6f08b7a0dd010a2b212934117f9352279e1a3b6e57801752f9f48c9d6215f346" exitCode=0 Mar 11 12:01:09 crc kubenswrapper[4816]: I0311 12:01:09.824278 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"1a8048f1-34ce-48b3-a273-bc4905efd9a0","Type":"ContainerDied","Data":"6f08b7a0dd010a2b212934117f9352279e1a3b6e57801752f9f48c9d6215f346"} Mar 11 12:01:09 crc kubenswrapper[4816]: I0311 12:01:09.824433 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl" podUID="157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1" containerName="route-controller-manager" containerID="cri-o://fd8060a81740d2d82d00ee7b672322ac337e0a0886f149a3bf4e8ecff6b410c9" gracePeriod=30 Mar 11 12:01:09 crc kubenswrapper[4816]: I0311 12:01:09.824983 4816 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" podUID="27a893ee-c824-4b3a-a1a7-270040291753" containerName="controller-manager" containerID="cri-o://e34c7fd51c419ebcbf2509f97778647782961f1976493ea35a4a759ea50660ec" gracePeriod=30 Mar 11 12:01:10 crc kubenswrapper[4816]: I0311 12:01:10.838081 4816 generic.go:334] "Generic (PLEG): container finished" podID="ce281163-d6c0-444b-ba55-b488dd77b853" containerID="a7c62a5bd8897be83a617ee46b4e99b960de1d3c06824d144ebcc6c092953124" exitCode=0 Mar 11 12:01:10 crc kubenswrapper[4816]: I0311 12:01:10.838143 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtm2c" event={"ID":"ce281163-d6c0-444b-ba55-b488dd77b853","Type":"ContainerDied","Data":"a7c62a5bd8897be83a617ee46b4e99b960de1d3c06824d144ebcc6c092953124"} Mar 11 12:01:10 crc kubenswrapper[4816]: I0311 12:01:10.840152 4816 generic.go:334] "Generic (PLEG): container finished" podID="157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1" containerID="fd8060a81740d2d82d00ee7b672322ac337e0a0886f149a3bf4e8ecff6b410c9" exitCode=0 Mar 11 12:01:10 crc kubenswrapper[4816]: I0311 12:01:10.840286 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl" event={"ID":"157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1","Type":"ContainerDied","Data":"fd8060a81740d2d82d00ee7b672322ac337e0a0886f149a3bf4e8ecff6b410c9"} Mar 11 12:01:10 crc kubenswrapper[4816]: I0311 12:01:10.842436 4816 generic.go:334] "Generic (PLEG): container finished" podID="27a893ee-c824-4b3a-a1a7-270040291753" containerID="e34c7fd51c419ebcbf2509f97778647782961f1976493ea35a4a759ea50660ec" exitCode=0 Mar 11 12:01:10 crc kubenswrapper[4816]: I0311 12:01:10.842485 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" 
event={"ID":"27a893ee-c824-4b3a-a1a7-270040291753","Type":"ContainerDied","Data":"e34c7fd51c419ebcbf2509f97778647782961f1976493ea35a4a759ea50660ec"} Mar 11 12:01:10 crc kubenswrapper[4816]: I0311 12:01:10.844809 4816 generic.go:334] "Generic (PLEG): container finished" podID="d06617bd-ff11-42b8-9b84-e856c8c3c9eb" containerID="278bfa92b7f4cffdd11e1b735e41bf4ff1bd011a4b249bf92a3fc0359707218a" exitCode=0 Mar 11 12:01:10 crc kubenswrapper[4816]: I0311 12:01:10.844970 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8r7jt" event={"ID":"d06617bd-ff11-42b8-9b84-e856c8c3c9eb","Type":"ContainerDied","Data":"278bfa92b7f4cffdd11e1b735e41bf4ff1bd011a4b249bf92a3fc0359707218a"} Mar 11 12:01:10 crc kubenswrapper[4816]: I0311 12:01:10.965484 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl" Mar 11 12:01:10 crc kubenswrapper[4816]: I0311 12:01:10.971974 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" Mar 11 12:01:10 crc kubenswrapper[4816]: I0311 12:01:10.992793 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk"] Mar 11 12:01:10 crc kubenswrapper[4816]: E0311 12:01:10.993000 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27a893ee-c824-4b3a-a1a7-270040291753" containerName="controller-manager" Mar 11 12:01:10 crc kubenswrapper[4816]: I0311 12:01:10.993011 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="27a893ee-c824-4b3a-a1a7-270040291753" containerName="controller-manager" Mar 11 12:01:10 crc kubenswrapper[4816]: E0311 12:01:10.993022 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1" containerName="route-controller-manager" Mar 11 12:01:10 crc kubenswrapper[4816]: I0311 12:01:10.993028 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1" containerName="route-controller-manager" Mar 11 12:01:10 crc kubenswrapper[4816]: I0311 12:01:10.993122 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="27a893ee-c824-4b3a-a1a7-270040291753" containerName="controller-manager" Mar 11 12:01:10 crc kubenswrapper[4816]: I0311 12:01:10.993132 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1" containerName="route-controller-manager" Mar 11 12:01:10 crc kubenswrapper[4816]: I0311 12:01:10.993559 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.013069 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk"] Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.041303 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/27a893ee-c824-4b3a-a1a7-270040291753-client-ca\") pod \"27a893ee-c824-4b3a-a1a7-270040291753\" (UID: \"27a893ee-c824-4b3a-a1a7-270040291753\") " Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.041371 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27a893ee-c824-4b3a-a1a7-270040291753-config\") pod \"27a893ee-c824-4b3a-a1a7-270040291753\" (UID: \"27a893ee-c824-4b3a-a1a7-270040291753\") " Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.041411 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgx8p\" (UniqueName: \"kubernetes.io/projected/27a893ee-c824-4b3a-a1a7-270040291753-kube-api-access-fgx8p\") pod \"27a893ee-c824-4b3a-a1a7-270040291753\" (UID: \"27a893ee-c824-4b3a-a1a7-270040291753\") " Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.041469 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/27a893ee-c824-4b3a-a1a7-270040291753-serving-cert\") pod \"27a893ee-c824-4b3a-a1a7-270040291753\" (UID: \"27a893ee-c824-4b3a-a1a7-270040291753\") " Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.041509 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1-serving-cert\") pod 
\"157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1\" (UID: \"157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1\") " Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.041540 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1-config\") pod \"157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1\" (UID: \"157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1\") " Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.041578 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5695\" (UniqueName: \"kubernetes.io/projected/157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1-kube-api-access-s5695\") pod \"157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1\" (UID: \"157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1\") " Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.041629 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1-client-ca\") pod \"157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1\" (UID: \"157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1\") " Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.041686 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/27a893ee-c824-4b3a-a1a7-270040291753-proxy-ca-bundles\") pod \"27a893ee-c824-4b3a-a1a7-270040291753\" (UID: \"27a893ee-c824-4b3a-a1a7-270040291753\") " Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.041880 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e00e505e-4736-4aee-b340-ef223d36cf41-serving-cert\") pod \"route-controller-manager-5d755bd99c-vmbvk\" (UID: \"e00e505e-4736-4aee-b340-ef223d36cf41\") " pod="openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 
12:01:11.041946 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e00e505e-4736-4aee-b340-ef223d36cf41-client-ca\") pod \"route-controller-manager-5d755bd99c-vmbvk\" (UID: \"e00e505e-4736-4aee-b340-ef223d36cf41\") " pod="openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.042018 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqzfs\" (UniqueName: \"kubernetes.io/projected/e00e505e-4736-4aee-b340-ef223d36cf41-kube-api-access-lqzfs\") pod \"route-controller-manager-5d755bd99c-vmbvk\" (UID: \"e00e505e-4736-4aee-b340-ef223d36cf41\") " pod="openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.042071 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e00e505e-4736-4aee-b340-ef223d36cf41-config\") pod \"route-controller-manager-5d755bd99c-vmbvk\" (UID: \"e00e505e-4736-4aee-b340-ef223d36cf41\") " pod="openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.042313 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27a893ee-c824-4b3a-a1a7-270040291753-config" (OuterVolumeSpecName: "config") pod "27a893ee-c824-4b3a-a1a7-270040291753" (UID: "27a893ee-c824-4b3a-a1a7-270040291753"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.043320 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27a893ee-c824-4b3a-a1a7-270040291753-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "27a893ee-c824-4b3a-a1a7-270040291753" (UID: "27a893ee-c824-4b3a-a1a7-270040291753"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.043396 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27a893ee-c824-4b3a-a1a7-270040291753-client-ca" (OuterVolumeSpecName: "client-ca") pod "27a893ee-c824-4b3a-a1a7-270040291753" (UID: "27a893ee-c824-4b3a-a1a7-270040291753"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.043564 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1-client-ca" (OuterVolumeSpecName: "client-ca") pod "157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1" (UID: "157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.043666 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1-config" (OuterVolumeSpecName: "config") pod "157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1" (UID: "157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.049760 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1" (UID: "157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.049800 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27a893ee-c824-4b3a-a1a7-270040291753-kube-api-access-fgx8p" (OuterVolumeSpecName: "kube-api-access-fgx8p") pod "27a893ee-c824-4b3a-a1a7-270040291753" (UID: "27a893ee-c824-4b3a-a1a7-270040291753"). InnerVolumeSpecName "kube-api-access-fgx8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.053873 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1-kube-api-access-s5695" (OuterVolumeSpecName: "kube-api-access-s5695") pod "157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1" (UID: "157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1"). InnerVolumeSpecName "kube-api-access-s5695". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.054291 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27a893ee-c824-4b3a-a1a7-270040291753-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "27a893ee-c824-4b3a-a1a7-270040291753" (UID: "27a893ee-c824-4b3a-a1a7-270040291753"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.080727 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.143552 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1a8048f1-34ce-48b3-a273-bc4905efd9a0-kube-api-access\") pod \"1a8048f1-34ce-48b3-a273-bc4905efd9a0\" (UID: \"1a8048f1-34ce-48b3-a273-bc4905efd9a0\") " Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.143915 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1a8048f1-34ce-48b3-a273-bc4905efd9a0-kubelet-dir\") pod \"1a8048f1-34ce-48b3-a273-bc4905efd9a0\" (UID: \"1a8048f1-34ce-48b3-a273-bc4905efd9a0\") " Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.144117 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqzfs\" (UniqueName: \"kubernetes.io/projected/e00e505e-4736-4aee-b340-ef223d36cf41-kube-api-access-lqzfs\") pod \"route-controller-manager-5d755bd99c-vmbvk\" (UID: \"e00e505e-4736-4aee-b340-ef223d36cf41\") " pod="openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.144144 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e00e505e-4736-4aee-b340-ef223d36cf41-config\") pod \"route-controller-manager-5d755bd99c-vmbvk\" (UID: \"e00e505e-4736-4aee-b340-ef223d36cf41\") " pod="openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.144207 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e00e505e-4736-4aee-b340-ef223d36cf41-serving-cert\") pod \"route-controller-manager-5d755bd99c-vmbvk\" (UID: 
\"e00e505e-4736-4aee-b340-ef223d36cf41\") " pod="openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.144228 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e00e505e-4736-4aee-b340-ef223d36cf41-client-ca\") pod \"route-controller-manager-5d755bd99c-vmbvk\" (UID: \"e00e505e-4736-4aee-b340-ef223d36cf41\") " pod="openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.144284 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1-client-ca\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.144297 4816 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/27a893ee-c824-4b3a-a1a7-270040291753-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.144308 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/27a893ee-c824-4b3a-a1a7-270040291753-client-ca\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.144315 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27a893ee-c824-4b3a-a1a7-270040291753-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.144326 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgx8p\" (UniqueName: \"kubernetes.io/projected/27a893ee-c824-4b3a-a1a7-270040291753-kube-api-access-fgx8p\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.144336 4816 reconciler_common.go:293] "Volume detached for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/27a893ee-c824-4b3a-a1a7-270040291753-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.144344 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.144353 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.144362 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5695\" (UniqueName: \"kubernetes.io/projected/157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1-kube-api-access-s5695\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.145140 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e00e505e-4736-4aee-b340-ef223d36cf41-client-ca\") pod \"route-controller-manager-5d755bd99c-vmbvk\" (UID: \"e00e505e-4736-4aee-b340-ef223d36cf41\") " pod="openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.145764 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1a8048f1-34ce-48b3-a273-bc4905efd9a0-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "1a8048f1-34ce-48b3-a273-bc4905efd9a0" (UID: "1a8048f1-34ce-48b3-a273-bc4905efd9a0"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.146680 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e00e505e-4736-4aee-b340-ef223d36cf41-config\") pod \"route-controller-manager-5d755bd99c-vmbvk\" (UID: \"e00e505e-4736-4aee-b340-ef223d36cf41\") " pod="openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.149555 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a8048f1-34ce-48b3-a273-bc4905efd9a0-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1a8048f1-34ce-48b3-a273-bc4905efd9a0" (UID: "1a8048f1-34ce-48b3-a273-bc4905efd9a0"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.149901 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e00e505e-4736-4aee-b340-ef223d36cf41-serving-cert\") pod \"route-controller-manager-5d755bd99c-vmbvk\" (UID: \"e00e505e-4736-4aee-b340-ef223d36cf41\") " pod="openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.160282 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqzfs\" (UniqueName: \"kubernetes.io/projected/e00e505e-4736-4aee-b340-ef223d36cf41-kube-api-access-lqzfs\") pod \"route-controller-manager-5d755bd99c-vmbvk\" (UID: \"e00e505e-4736-4aee-b340-ef223d36cf41\") " pod="openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.245007 4816 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/1a8048f1-34ce-48b3-a273-bc4905efd9a0-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.245041 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1a8048f1-34ce-48b3-a273-bc4905efd9a0-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.319421 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.528471 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk"] Mar 11 12:01:11 crc kubenswrapper[4816]: W0311 12:01:11.547005 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode00e505e_4736_4aee_b340_ef223d36cf41.slice/crio-867551228a25bbc0bc0f926b4429da94d98dd14f3c9ae4a640117cf20bb2787e WatchSource:0}: Error finding container 867551228a25bbc0bc0f926b4429da94d98dd14f3c9ae4a640117cf20bb2787e: Status 404 returned error can't find the container with id 867551228a25bbc0bc0f926b4429da94d98dd14f3c9ae4a640117cf20bb2787e Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.854587 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl" event={"ID":"157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1","Type":"ContainerDied","Data":"4d23328b31768c096f02f39298c1b22ed736efa862708623ddbcd093bc7ea791"} Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.854647 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.854656 4816 scope.go:117] "RemoveContainer" containerID="fd8060a81740d2d82d00ee7b672322ac337e0a0886f149a3bf4e8ecff6b410c9" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.857791 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.857807 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" event={"ID":"27a893ee-c824-4b3a-a1a7-270040291753","Type":"ContainerDied","Data":"6d8a80e724695a853b72862fcfc4d7a6e02121bd48950e2a771b5ef4a04ee4b8"} Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.864679 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cg4jl" event={"ID":"34f226df-3352-4423-822c-67891ad3a398","Type":"ContainerStarted","Data":"db4ddcf6a4270a18cdecb4d0ae5c1351dad3710879661ddc36a379be1474f792"} Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.867485 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk" event={"ID":"e00e505e-4736-4aee-b340-ef223d36cf41","Type":"ContainerStarted","Data":"867551228a25bbc0bc0f926b4429da94d98dd14f3c9ae4a640117cf20bb2787e"} Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.869867 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"1a8048f1-34ce-48b3-a273-bc4905efd9a0","Type":"ContainerDied","Data":"a59dfcea4654114bb3d368003f90d6affcd94be583c75e559ea9e65d751021dd"} Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.869894 4816 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="a59dfcea4654114bb3d368003f90d6affcd94be583c75e559ea9e65d751021dd" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.869913 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.883320 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cg4jl" podStartSLOduration=2.044149483 podStartE2EDuration="38.883288173s" podCreationTimestamp="2026-03-11 12:00:33 +0000 UTC" firstStartedPulling="2026-03-11 12:00:34.350552266 +0000 UTC m=+120.941816233" lastFinishedPulling="2026-03-11 12:01:11.189690946 +0000 UTC m=+157.780954923" observedRunningTime="2026-03-11 12:01:11.880633867 +0000 UTC m=+158.471897844" watchObservedRunningTime="2026-03-11 12:01:11.883288173 +0000 UTC m=+158.474552140" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.901717 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr"] Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.909098 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr"] Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.912150 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl"] Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.914600 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl"] Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.923784 4816 scope.go:117] "RemoveContainer" containerID="e34c7fd51c419ebcbf2509f97778647782961f1976493ea35a4a759ea50660ec" Mar 11 12:01:12 crc kubenswrapper[4816]: I0311 12:01:12.137280 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1" path="/var/lib/kubelet/pods/157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1/volumes" Mar 11 12:01:12 crc kubenswrapper[4816]: I0311 12:01:12.137978 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27a893ee-c824-4b3a-a1a7-270040291753" path="/var/lib/kubelet/pods/27a893ee-c824-4b3a-a1a7-270040291753/volumes" Mar 11 12:01:12 crc kubenswrapper[4816]: I0311 12:01:12.822933 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 11 12:01:12 crc kubenswrapper[4816]: E0311 12:01:12.823445 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a8048f1-34ce-48b3-a273-bc4905efd9a0" containerName="pruner" Mar 11 12:01:12 crc kubenswrapper[4816]: I0311 12:01:12.823461 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a8048f1-34ce-48b3-a273-bc4905efd9a0" containerName="pruner" Mar 11 12:01:12 crc kubenswrapper[4816]: I0311 12:01:12.823565 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a8048f1-34ce-48b3-a273-bc4905efd9a0" containerName="pruner" Mar 11 12:01:12 crc kubenswrapper[4816]: I0311 12:01:12.823956 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 11 12:01:12 crc kubenswrapper[4816]: I0311 12:01:12.825685 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 11 12:01:12 crc kubenswrapper[4816]: I0311 12:01:12.828541 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 11 12:01:12 crc kubenswrapper[4816]: I0311 12:01:12.835129 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 11 12:01:12 crc kubenswrapper[4816]: I0311 12:01:12.869064 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/106a80c4-7132-43b4-930f-bd886787437f-var-lock\") pod \"installer-9-crc\" (UID: \"106a80c4-7132-43b4-930f-bd886787437f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 11 12:01:12 crc kubenswrapper[4816]: I0311 12:01:12.869130 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/106a80c4-7132-43b4-930f-bd886787437f-kube-api-access\") pod \"installer-9-crc\" (UID: \"106a80c4-7132-43b4-930f-bd886787437f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 11 12:01:12 crc kubenswrapper[4816]: I0311 12:01:12.869163 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/106a80c4-7132-43b4-930f-bd886787437f-kubelet-dir\") pod \"installer-9-crc\" (UID: \"106a80c4-7132-43b4-930f-bd886787437f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 11 12:01:12 crc kubenswrapper[4816]: I0311 12:01:12.875040 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ndrbx" 
event={"ID":"ffe46307-0d92-4864-9aa4-b0ca2fc641d0","Type":"ContainerStarted","Data":"53861abbf276133c162d226cfa525d384ef03f62fc102777bee0f663f20bdb56"} Mar 11 12:01:12 crc kubenswrapper[4816]: I0311 12:01:12.880361 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk" event={"ID":"e00e505e-4736-4aee-b340-ef223d36cf41","Type":"ContainerStarted","Data":"c88e59606b391072deb5c308ca0b530083322e3151190cbbfebe8fb7af0870a2"} Mar 11 12:01:12 crc kubenswrapper[4816]: I0311 12:01:12.909496 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ndrbx" podStartSLOduration=3.234815295 podStartE2EDuration="41.909478917s" podCreationTimestamp="2026-03-11 12:00:31 +0000 UTC" firstStartedPulling="2026-03-11 12:00:33.249060988 +0000 UTC m=+119.840324955" lastFinishedPulling="2026-03-11 12:01:11.92372461 +0000 UTC m=+158.514988577" observedRunningTime="2026-03-11 12:01:12.905240246 +0000 UTC m=+159.496504213" watchObservedRunningTime="2026-03-11 12:01:12.909478917 +0000 UTC m=+159.500742884" Mar 11 12:01:12 crc kubenswrapper[4816]: I0311 12:01:12.919430 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk" podStartSLOduration=4.919413621 podStartE2EDuration="4.919413621s" podCreationTimestamp="2026-03-11 12:01:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:01:12.917199058 +0000 UTC m=+159.508463035" watchObservedRunningTime="2026-03-11 12:01:12.919413621 +0000 UTC m=+159.510677588" Mar 11 12:01:12 crc kubenswrapper[4816]: I0311 12:01:12.970463 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/106a80c4-7132-43b4-930f-bd886787437f-kube-api-access\") pod \"installer-9-crc\" (UID: \"106a80c4-7132-43b4-930f-bd886787437f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 11 12:01:12 crc kubenswrapper[4816]: I0311 12:01:12.970557 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/106a80c4-7132-43b4-930f-bd886787437f-kubelet-dir\") pod \"installer-9-crc\" (UID: \"106a80c4-7132-43b4-930f-bd886787437f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 11 12:01:12 crc kubenswrapper[4816]: I0311 12:01:12.970665 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/106a80c4-7132-43b4-930f-bd886787437f-var-lock\") pod \"installer-9-crc\" (UID: \"106a80c4-7132-43b4-930f-bd886787437f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 11 12:01:12 crc kubenswrapper[4816]: I0311 12:01:12.971458 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/106a80c4-7132-43b4-930f-bd886787437f-kubelet-dir\") pod \"installer-9-crc\" (UID: \"106a80c4-7132-43b4-930f-bd886787437f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 11 12:01:12 crc kubenswrapper[4816]: I0311 12:01:12.971523 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/106a80c4-7132-43b4-930f-bd886787437f-var-lock\") pod \"installer-9-crc\" (UID: \"106a80c4-7132-43b4-930f-bd886787437f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.003223 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/106a80c4-7132-43b4-930f-bd886787437f-kube-api-access\") pod \"installer-9-crc\" (UID: \"106a80c4-7132-43b4-930f-bd886787437f\") " 
pod="openshift-kube-apiserver/installer-9-crc" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.083800 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-568c67664d-76gf4"] Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.084422 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-568c67664d-76gf4" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.089318 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.089464 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.089556 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.089645 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.089709 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.089910 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.094736 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-568c67664d-76gf4"] Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.096786 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.173020 4816 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0eabc434-3f96-4124-9afc-ecb2466f2104-serving-cert\") pod \"controller-manager-568c67664d-76gf4\" (UID: \"0eabc434-3f96-4124-9afc-ecb2466f2104\") " pod="openshift-controller-manager/controller-manager-568c67664d-76gf4" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.173076 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pklmp\" (UniqueName: \"kubernetes.io/projected/0eabc434-3f96-4124-9afc-ecb2466f2104-kube-api-access-pklmp\") pod \"controller-manager-568c67664d-76gf4\" (UID: \"0eabc434-3f96-4124-9afc-ecb2466f2104\") " pod="openshift-controller-manager/controller-manager-568c67664d-76gf4" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.173101 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0eabc434-3f96-4124-9afc-ecb2466f2104-client-ca\") pod \"controller-manager-568c67664d-76gf4\" (UID: \"0eabc434-3f96-4124-9afc-ecb2466f2104\") " pod="openshift-controller-manager/controller-manager-568c67664d-76gf4" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.173359 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0eabc434-3f96-4124-9afc-ecb2466f2104-config\") pod \"controller-manager-568c67664d-76gf4\" (UID: \"0eabc434-3f96-4124-9afc-ecb2466f2104\") " pod="openshift-controller-manager/controller-manager-568c67664d-76gf4" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.173601 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0eabc434-3f96-4124-9afc-ecb2466f2104-proxy-ca-bundles\") pod 
\"controller-manager-568c67664d-76gf4\" (UID: \"0eabc434-3f96-4124-9afc-ecb2466f2104\") " pod="openshift-controller-manager/controller-manager-568c67664d-76gf4" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.190444 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.275810 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0eabc434-3f96-4124-9afc-ecb2466f2104-proxy-ca-bundles\") pod \"controller-manager-568c67664d-76gf4\" (UID: \"0eabc434-3f96-4124-9afc-ecb2466f2104\") " pod="openshift-controller-manager/controller-manager-568c67664d-76gf4" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.276028 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0eabc434-3f96-4124-9afc-ecb2466f2104-proxy-ca-bundles\") pod \"controller-manager-568c67664d-76gf4\" (UID: \"0eabc434-3f96-4124-9afc-ecb2466f2104\") " pod="openshift-controller-manager/controller-manager-568c67664d-76gf4" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.276278 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0eabc434-3f96-4124-9afc-ecb2466f2104-serving-cert\") pod \"controller-manager-568c67664d-76gf4\" (UID: \"0eabc434-3f96-4124-9afc-ecb2466f2104\") " pod="openshift-controller-manager/controller-manager-568c67664d-76gf4" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.276319 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pklmp\" (UniqueName: \"kubernetes.io/projected/0eabc434-3f96-4124-9afc-ecb2466f2104-kube-api-access-pklmp\") pod \"controller-manager-568c67664d-76gf4\" (UID: \"0eabc434-3f96-4124-9afc-ecb2466f2104\") " 
pod="openshift-controller-manager/controller-manager-568c67664d-76gf4" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.276346 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0eabc434-3f96-4124-9afc-ecb2466f2104-client-ca\") pod \"controller-manager-568c67664d-76gf4\" (UID: \"0eabc434-3f96-4124-9afc-ecb2466f2104\") " pod="openshift-controller-manager/controller-manager-568c67664d-76gf4" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.276393 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0eabc434-3f96-4124-9afc-ecb2466f2104-config\") pod \"controller-manager-568c67664d-76gf4\" (UID: \"0eabc434-3f96-4124-9afc-ecb2466f2104\") " pod="openshift-controller-manager/controller-manager-568c67664d-76gf4" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.277524 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0eabc434-3f96-4124-9afc-ecb2466f2104-client-ca\") pod \"controller-manager-568c67664d-76gf4\" (UID: \"0eabc434-3f96-4124-9afc-ecb2466f2104\") " pod="openshift-controller-manager/controller-manager-568c67664d-76gf4" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.277747 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0eabc434-3f96-4124-9afc-ecb2466f2104-config\") pod \"controller-manager-568c67664d-76gf4\" (UID: \"0eabc434-3f96-4124-9afc-ecb2466f2104\") " pod="openshift-controller-manager/controller-manager-568c67664d-76gf4" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.290133 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0eabc434-3f96-4124-9afc-ecb2466f2104-serving-cert\") pod \"controller-manager-568c67664d-76gf4\" (UID: 
\"0eabc434-3f96-4124-9afc-ecb2466f2104\") " pod="openshift-controller-manager/controller-manager-568c67664d-76gf4" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.296632 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pklmp\" (UniqueName: \"kubernetes.io/projected/0eabc434-3f96-4124-9afc-ecb2466f2104-kube-api-access-pklmp\") pod \"controller-manager-568c67664d-76gf4\" (UID: \"0eabc434-3f96-4124-9afc-ecb2466f2104\") " pod="openshift-controller-manager/controller-manager-568c67664d-76gf4" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.368644 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cg4jl" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.368706 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cg4jl" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.424073 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-568c67664d-76gf4" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.690504 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-568c67664d-76gf4"] Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.753890 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.886824 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"106a80c4-7132-43b4-930f-bd886787437f","Type":"ContainerStarted","Data":"b87f445ca27d573faee92ddd515c624b2b710e714f620c36718ab43fc1a2134f"} Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.888982 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-568c67664d-76gf4" event={"ID":"0eabc434-3f96-4124-9afc-ecb2466f2104","Type":"ContainerStarted","Data":"47dc6f78b21758a900162faa2d749022ed0a0a88a3ab047d3b81d585e1f50879"} Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.891182 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8r7jt" event={"ID":"d06617bd-ff11-42b8-9b84-e856c8c3c9eb","Type":"ContainerStarted","Data":"3c59f225be65141f4253e97dee4dc069f053c3b693ce3684d2ab193b505de29a"} Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.891906 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.896763 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.910569 4816 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-marketplace/redhat-operators-8r7jt" podStartSLOduration=9.346380039 podStartE2EDuration="39.910555191s" podCreationTimestamp="2026-03-11 12:00:34 +0000 UTC" firstStartedPulling="2026-03-11 12:00:42.620613312 +0000 UTC m=+129.211877279" lastFinishedPulling="2026-03-11 12:01:13.184788464 +0000 UTC m=+159.776052431" observedRunningTime="2026-03-11 12:01:13.90738175 +0000 UTC m=+160.498645717" watchObservedRunningTime="2026-03-11 12:01:13.910555191 +0000 UTC m=+160.501819158" Mar 11 12:01:14 crc kubenswrapper[4816]: I0311 12:01:14.570679 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8r7jt" Mar 11 12:01:14 crc kubenswrapper[4816]: I0311 12:01:14.570984 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8r7jt" Mar 11 12:01:14 crc kubenswrapper[4816]: I0311 12:01:14.961120 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-cg4jl" podUID="34f226df-3352-4423-822c-67891ad3a398" containerName="registry-server" probeResult="failure" output=< Mar 11 12:01:14 crc kubenswrapper[4816]: timeout: failed to connect service ":50051" within 1s Mar 11 12:01:14 crc kubenswrapper[4816]: > Mar 11 12:01:15 crc kubenswrapper[4816]: I0311 12:01:15.636204 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8r7jt" podUID="d06617bd-ff11-42b8-9b84-e856c8c3c9eb" containerName="registry-server" probeResult="failure" output=< Mar 11 12:01:15 crc kubenswrapper[4816]: timeout: failed to connect service ":50051" within 1s Mar 11 12:01:15 crc kubenswrapper[4816]: > Mar 11 12:01:15 crc kubenswrapper[4816]: I0311 12:01:15.902475 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtm2c" 
event={"ID":"ce281163-d6c0-444b-ba55-b488dd77b853","Type":"ContainerStarted","Data":"b4ab0057fec3813a8eba57d93db34dca15d692fb5d18b567388c379b9637e53f"} Mar 11 12:01:15 crc kubenswrapper[4816]: I0311 12:01:15.903505 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"106a80c4-7132-43b4-930f-bd886787437f","Type":"ContainerStarted","Data":"13cc1621a3a1352dc36083505ef9245a833ca0fab13f1b74079c751c4ed90659"} Mar 11 12:01:15 crc kubenswrapper[4816]: I0311 12:01:15.904778 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-568c67664d-76gf4" event={"ID":"0eabc434-3f96-4124-9afc-ecb2466f2104","Type":"ContainerStarted","Data":"422a80d9bd7c5ac865791576cd3ec16acf27ba26a84e00b3479a440a68ffcaf7"} Mar 11 12:01:15 crc kubenswrapper[4816]: I0311 12:01:15.919926 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jtm2c" podStartSLOduration=11.119883892 podStartE2EDuration="42.919907186s" podCreationTimestamp="2026-03-11 12:00:33 +0000 UTC" firstStartedPulling="2026-03-11 12:00:42.612537541 +0000 UTC m=+129.203801508" lastFinishedPulling="2026-03-11 12:01:14.412560835 +0000 UTC m=+161.003824802" observedRunningTime="2026-03-11 12:01:15.916746125 +0000 UTC m=+162.508010092" watchObservedRunningTime="2026-03-11 12:01:15.919907186 +0000 UTC m=+162.511171163" Mar 11 12:01:15 crc kubenswrapper[4816]: I0311 12:01:15.941751 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bz2pp"] Mar 11 12:01:15 crc kubenswrapper[4816]: I0311 12:01:15.950155 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-568c67664d-76gf4" podStartSLOduration=7.950139691 podStartE2EDuration="7.950139691s" podCreationTimestamp="2026-03-11 12:01:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:01:15.948594026 +0000 UTC m=+162.539857993" watchObservedRunningTime="2026-03-11 12:01:15.950139691 +0000 UTC m=+162.541403658" Mar 11 12:01:15 crc kubenswrapper[4816]: I0311 12:01:15.982419 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=3.982403904 podStartE2EDuration="3.982403904s" podCreationTimestamp="2026-03-11 12:01:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:01:15.978102601 +0000 UTC m=+162.569366588" watchObservedRunningTime="2026-03-11 12:01:15.982403904 +0000 UTC m=+162.573667871" Mar 11 12:01:16 crc kubenswrapper[4816]: I0311 12:01:16.910157 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-568c67664d-76gf4" Mar 11 12:01:16 crc kubenswrapper[4816]: I0311 12:01:16.915869 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-568c67664d-76gf4" Mar 11 12:01:17 crc kubenswrapper[4816]: I0311 12:01:17.915659 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2dh2" event={"ID":"756dd25b-5375-48bc-8578-a9585ef49e6c","Type":"ContainerStarted","Data":"3da638aaf6eed9072cff54e60fb89177e6f65f1797105a333a9b3c78228673ca"} Mar 11 12:01:17 crc kubenswrapper[4816]: I0311 12:01:17.917037 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jwq6f" event={"ID":"fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e","Type":"ContainerStarted","Data":"af230f77d632f2ef7272588c7105b3e41503277008887f3bbb0bdc946fb02247"} Mar 11 12:01:17 crc kubenswrapper[4816]: I0311 12:01:17.919233 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-9fv28" event={"ID":"8d6e662d-8633-4e55-baf3-50a2c4d179a1","Type":"ContainerStarted","Data":"4105f5db45375627892a708c76ac931ccf2827e2e9b64e788c33b62ca6ee5c17"} Mar 11 12:01:19 crc kubenswrapper[4816]: I0311 12:01:19.931357 4816 generic.go:334] "Generic (PLEG): container finished" podID="fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e" containerID="af230f77d632f2ef7272588c7105b3e41503277008887f3bbb0bdc946fb02247" exitCode=0 Mar 11 12:01:19 crc kubenswrapper[4816]: I0311 12:01:19.931441 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jwq6f" event={"ID":"fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e","Type":"ContainerDied","Data":"af230f77d632f2ef7272588c7105b3e41503277008887f3bbb0bdc946fb02247"} Mar 11 12:01:19 crc kubenswrapper[4816]: I0311 12:01:19.934785 4816 generic.go:334] "Generic (PLEG): container finished" podID="8d6e662d-8633-4e55-baf3-50a2c4d179a1" containerID="4105f5db45375627892a708c76ac931ccf2827e2e9b64e788c33b62ca6ee5c17" exitCode=0 Mar 11 12:01:19 crc kubenswrapper[4816]: I0311 12:01:19.934858 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9fv28" event={"ID":"8d6e662d-8633-4e55-baf3-50a2c4d179a1","Type":"ContainerDied","Data":"4105f5db45375627892a708c76ac931ccf2827e2e9b64e788c33b62ca6ee5c17"} Mar 11 12:01:19 crc kubenswrapper[4816]: I0311 12:01:19.936782 4816 generic.go:334] "Generic (PLEG): container finished" podID="756dd25b-5375-48bc-8578-a9585ef49e6c" containerID="3da638aaf6eed9072cff54e60fb89177e6f65f1797105a333a9b3c78228673ca" exitCode=0 Mar 11 12:01:19 crc kubenswrapper[4816]: I0311 12:01:19.936808 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2dh2" event={"ID":"756dd25b-5375-48bc-8578-a9585ef49e6c","Type":"ContainerDied","Data":"3da638aaf6eed9072cff54e60fb89177e6f65f1797105a333a9b3c78228673ca"} Mar 11 12:01:20 crc kubenswrapper[4816]: I0311 12:01:20.944807 
4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2dh2" event={"ID":"756dd25b-5375-48bc-8578-a9585ef49e6c","Type":"ContainerStarted","Data":"5466ae20f25fa3a1f58397452a631aeecdb16aff724bdf573d03a510178e71fc"} Mar 11 12:01:20 crc kubenswrapper[4816]: I0311 12:01:20.949290 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jwq6f" event={"ID":"fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e","Type":"ContainerStarted","Data":"33fa1abaf83df4647d38f4486b6eeacba9e46e0cce2fe298d46d9eed8b730783"} Mar 11 12:01:20 crc kubenswrapper[4816]: I0311 12:01:20.951433 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9fv28" event={"ID":"8d6e662d-8633-4e55-baf3-50a2c4d179a1","Type":"ContainerStarted","Data":"362163a58c3530fa9e11ca63e8340195a0a89db0eb885f3d4d89779e7907bf98"} Mar 11 12:01:20 crc kubenswrapper[4816]: I0311 12:01:20.962401 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9fv28" Mar 11 12:01:20 crc kubenswrapper[4816]: I0311 12:01:20.962467 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9fv28" Mar 11 12:01:20 crc kubenswrapper[4816]: I0311 12:01:20.980806 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-s2dh2" podStartSLOduration=2.78523218 podStartE2EDuration="49.980783516s" podCreationTimestamp="2026-03-11 12:00:31 +0000 UTC" firstStartedPulling="2026-03-11 12:00:33.235817969 +0000 UTC m=+119.827081946" lastFinishedPulling="2026-03-11 12:01:20.431369315 +0000 UTC m=+167.022633282" observedRunningTime="2026-03-11 12:01:20.978492871 +0000 UTC m=+167.569756858" watchObservedRunningTime="2026-03-11 12:01:20.980783516 +0000 UTC m=+167.572047483" Mar 11 12:01:21 crc kubenswrapper[4816]: I0311 12:01:21.005754 4816 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jwq6f" podStartSLOduration=2.726452054 podStartE2EDuration="51.0057348s" podCreationTimestamp="2026-03-11 12:00:30 +0000 UTC" firstStartedPulling="2026-03-11 12:00:32.110440418 +0000 UTC m=+118.701704385" lastFinishedPulling="2026-03-11 12:01:20.389723164 +0000 UTC m=+166.980987131" observedRunningTime="2026-03-11 12:01:21.003331831 +0000 UTC m=+167.594595808" watchObservedRunningTime="2026-03-11 12:01:21.0057348 +0000 UTC m=+167.596998767" Mar 11 12:01:21 crc kubenswrapper[4816]: I0311 12:01:21.023947 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9fv28" podStartSLOduration=2.80319518 podStartE2EDuration="51.023929961s" podCreationTimestamp="2026-03-11 12:00:30 +0000 UTC" firstStartedPulling="2026-03-11 12:00:32.115173573 +0000 UTC m=+118.706437540" lastFinishedPulling="2026-03-11 12:01:20.335908354 +0000 UTC m=+166.927172321" observedRunningTime="2026-03-11 12:01:21.02251217 +0000 UTC m=+167.613776137" watchObservedRunningTime="2026-03-11 12:01:21.023929961 +0000 UTC m=+167.615193928" Mar 11 12:01:21 crc kubenswrapper[4816]: I0311 12:01:21.183876 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jwq6f" Mar 11 12:01:21 crc kubenswrapper[4816]: I0311 12:01:21.184038 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jwq6f" Mar 11 12:01:21 crc kubenswrapper[4816]: I0311 12:01:21.363599 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-s2dh2" Mar 11 12:01:21 crc kubenswrapper[4816]: I0311 12:01:21.363649 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-s2dh2" Mar 11 12:01:21 crc kubenswrapper[4816]: I0311 
12:01:21.648523 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ndrbx" Mar 11 12:01:21 crc kubenswrapper[4816]: I0311 12:01:21.648571 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ndrbx" Mar 11 12:01:21 crc kubenswrapper[4816]: I0311 12:01:21.718864 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ndrbx" Mar 11 12:01:21 crc kubenswrapper[4816]: I0311 12:01:21.959092 4816 generic.go:334] "Generic (PLEG): container finished" podID="e94af1b5-09ef-433f-91e6-7b352836273d" containerID="e4f153012f5a35062f99ba998ca453204f279e596452667abbcfbcf6da4a9941" exitCode=0 Mar 11 12:01:21 crc kubenswrapper[4816]: I0311 12:01:21.959188 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rlvrz" event={"ID":"e94af1b5-09ef-433f-91e6-7b352836273d","Type":"ContainerDied","Data":"e4f153012f5a35062f99ba998ca453204f279e596452667abbcfbcf6da4a9941"} Mar 11 12:01:22 crc kubenswrapper[4816]: I0311 12:01:22.011067 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ndrbx" Mar 11 12:01:22 crc kubenswrapper[4816]: I0311 12:01:22.013948 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-9fv28" podUID="8d6e662d-8633-4e55-baf3-50a2c4d179a1" containerName="registry-server" probeResult="failure" output=< Mar 11 12:01:22 crc kubenswrapper[4816]: timeout: failed to connect service ":50051" within 1s Mar 11 12:01:22 crc kubenswrapper[4816]: > Mar 11 12:01:22 crc kubenswrapper[4816]: I0311 12:01:22.269534 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-jwq6f" podUID="fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e" containerName="registry-server" probeResult="failure" output=< Mar 11 
12:01:22 crc kubenswrapper[4816]: timeout: failed to connect service ":50051" within 1s Mar 11 12:01:22 crc kubenswrapper[4816]: > Mar 11 12:01:22 crc kubenswrapper[4816]: I0311 12:01:22.402484 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-s2dh2" podUID="756dd25b-5375-48bc-8578-a9585ef49e6c" containerName="registry-server" probeResult="failure" output=< Mar 11 12:01:22 crc kubenswrapper[4816]: timeout: failed to connect service ":50051" within 1s Mar 11 12:01:22 crc kubenswrapper[4816]: > Mar 11 12:01:23 crc kubenswrapper[4816]: I0311 12:01:23.417825 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cg4jl" Mar 11 12:01:23 crc kubenswrapper[4816]: I0311 12:01:23.460431 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cg4jl" Mar 11 12:01:23 crc kubenswrapper[4816]: I0311 12:01:23.975221 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rlvrz" event={"ID":"e94af1b5-09ef-433f-91e6-7b352836273d","Type":"ContainerStarted","Data":"37976cadeeaaabfe64d7c991892ed58c96dbf0539d9df222d8bb3ac68e91cc17"} Mar 11 12:01:23 crc kubenswrapper[4816]: I0311 12:01:23.995873 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rlvrz" podStartSLOduration=3.542966326 podStartE2EDuration="51.995852678s" podCreationTimestamp="2026-03-11 12:00:32 +0000 UTC" firstStartedPulling="2026-03-11 12:00:34.293901945 +0000 UTC m=+120.885165912" lastFinishedPulling="2026-03-11 12:01:22.746788297 +0000 UTC m=+169.338052264" observedRunningTime="2026-03-11 12:01:23.994582752 +0000 UTC m=+170.585846759" watchObservedRunningTime="2026-03-11 12:01:23.995852678 +0000 UTC m=+170.587116645" Mar 11 12:01:24 crc kubenswrapper[4816]: I0311 12:01:24.032051 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-cg4jl"] Mar 11 12:01:24 crc kubenswrapper[4816]: I0311 12:01:24.202616 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jtm2c" Mar 11 12:01:24 crc kubenswrapper[4816]: I0311 12:01:24.203088 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jtm2c" Mar 11 12:01:24 crc kubenswrapper[4816]: I0311 12:01:24.230868 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ndrbx"] Mar 11 12:01:24 crc kubenswrapper[4816]: I0311 12:01:24.231228 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ndrbx" podUID="ffe46307-0d92-4864-9aa4-b0ca2fc641d0" containerName="registry-server" containerID="cri-o://53861abbf276133c162d226cfa525d384ef03f62fc102777bee0f663f20bdb56" gracePeriod=2 Mar 11 12:01:24 crc kubenswrapper[4816]: I0311 12:01:24.247763 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jtm2c" Mar 11 12:01:24 crc kubenswrapper[4816]: I0311 12:01:24.623734 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8r7jt" Mar 11 12:01:24 crc kubenswrapper[4816]: I0311 12:01:24.673336 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8r7jt" Mar 11 12:01:24 crc kubenswrapper[4816]: I0311 12:01:24.704308 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ndrbx" Mar 11 12:01:24 crc kubenswrapper[4816]: I0311 12:01:24.737140 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffe46307-0d92-4864-9aa4-b0ca2fc641d0-catalog-content\") pod \"ffe46307-0d92-4864-9aa4-b0ca2fc641d0\" (UID: \"ffe46307-0d92-4864-9aa4-b0ca2fc641d0\") " Mar 11 12:01:24 crc kubenswrapper[4816]: I0311 12:01:24.737548 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgdtg\" (UniqueName: \"kubernetes.io/projected/ffe46307-0d92-4864-9aa4-b0ca2fc641d0-kube-api-access-tgdtg\") pod \"ffe46307-0d92-4864-9aa4-b0ca2fc641d0\" (UID: \"ffe46307-0d92-4864-9aa4-b0ca2fc641d0\") " Mar 11 12:01:24 crc kubenswrapper[4816]: I0311 12:01:24.737596 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffe46307-0d92-4864-9aa4-b0ca2fc641d0-utilities\") pod \"ffe46307-0d92-4864-9aa4-b0ca2fc641d0\" (UID: \"ffe46307-0d92-4864-9aa4-b0ca2fc641d0\") " Mar 11 12:01:24 crc kubenswrapper[4816]: I0311 12:01:24.738651 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffe46307-0d92-4864-9aa4-b0ca2fc641d0-utilities" (OuterVolumeSpecName: "utilities") pod "ffe46307-0d92-4864-9aa4-b0ca2fc641d0" (UID: "ffe46307-0d92-4864-9aa4-b0ca2fc641d0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:01:24 crc kubenswrapper[4816]: I0311 12:01:24.743173 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffe46307-0d92-4864-9aa4-b0ca2fc641d0-kube-api-access-tgdtg" (OuterVolumeSpecName: "kube-api-access-tgdtg") pod "ffe46307-0d92-4864-9aa4-b0ca2fc641d0" (UID: "ffe46307-0d92-4864-9aa4-b0ca2fc641d0"). InnerVolumeSpecName "kube-api-access-tgdtg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:01:24 crc kubenswrapper[4816]: I0311 12:01:24.792630 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffe46307-0d92-4864-9aa4-b0ca2fc641d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ffe46307-0d92-4864-9aa4-b0ca2fc641d0" (UID: "ffe46307-0d92-4864-9aa4-b0ca2fc641d0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:01:24 crc kubenswrapper[4816]: I0311 12:01:24.839138 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffe46307-0d92-4864-9aa4-b0ca2fc641d0-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:24 crc kubenswrapper[4816]: I0311 12:01:24.839185 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffe46307-0d92-4864-9aa4-b0ca2fc641d0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:24 crc kubenswrapper[4816]: I0311 12:01:24.839202 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgdtg\" (UniqueName: \"kubernetes.io/projected/ffe46307-0d92-4864-9aa4-b0ca2fc641d0-kube-api-access-tgdtg\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:24 crc kubenswrapper[4816]: I0311 12:01:24.980950 4816 generic.go:334] "Generic (PLEG): container finished" podID="ffe46307-0d92-4864-9aa4-b0ca2fc641d0" containerID="53861abbf276133c162d226cfa525d384ef03f62fc102777bee0f663f20bdb56" exitCode=0 Mar 11 12:01:24 crc kubenswrapper[4816]: I0311 12:01:24.980992 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ndrbx" event={"ID":"ffe46307-0d92-4864-9aa4-b0ca2fc641d0","Type":"ContainerDied","Data":"53861abbf276133c162d226cfa525d384ef03f62fc102777bee0f663f20bdb56"} Mar 11 12:01:24 crc kubenswrapper[4816]: I0311 12:01:24.981035 4816 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-ndrbx" event={"ID":"ffe46307-0d92-4864-9aa4-b0ca2fc641d0","Type":"ContainerDied","Data":"496964d22446ecfb6c504cae509de586a0cc99c038e5375b83e1db6c09ad3706"} Mar 11 12:01:24 crc kubenswrapper[4816]: I0311 12:01:24.981056 4816 scope.go:117] "RemoveContainer" containerID="53861abbf276133c162d226cfa525d384ef03f62fc102777bee0f663f20bdb56" Mar 11 12:01:24 crc kubenswrapper[4816]: I0311 12:01:24.981088 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ndrbx" Mar 11 12:01:24 crc kubenswrapper[4816]: I0311 12:01:24.981500 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cg4jl" podUID="34f226df-3352-4423-822c-67891ad3a398" containerName="registry-server" containerID="cri-o://db4ddcf6a4270a18cdecb4d0ae5c1351dad3710879661ddc36a379be1474f792" gracePeriod=2 Mar 11 12:01:24 crc kubenswrapper[4816]: I0311 12:01:24.998122 4816 scope.go:117] "RemoveContainer" containerID="ad1b4e0d64cb2dee4b17f813f254e7a567bae3986d0eb4252bf8e91b220edf23" Mar 11 12:01:25 crc kubenswrapper[4816]: I0311 12:01:25.013336 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ndrbx"] Mar 11 12:01:25 crc kubenswrapper[4816]: I0311 12:01:25.015996 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ndrbx"] Mar 11 12:01:25 crc kubenswrapper[4816]: I0311 12:01:25.029838 4816 scope.go:117] "RemoveContainer" containerID="db5556901c7abc5e94f121924c25248aaeceba682acde63ea94811a5a4dd7b80" Mar 11 12:01:25 crc kubenswrapper[4816]: I0311 12:01:25.029930 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jtm2c" Mar 11 12:01:25 crc kubenswrapper[4816]: I0311 12:01:25.088910 4816 scope.go:117] "RemoveContainer" 
containerID="53861abbf276133c162d226cfa525d384ef03f62fc102777bee0f663f20bdb56" Mar 11 12:01:25 crc kubenswrapper[4816]: E0311 12:01:25.089273 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53861abbf276133c162d226cfa525d384ef03f62fc102777bee0f663f20bdb56\": container with ID starting with 53861abbf276133c162d226cfa525d384ef03f62fc102777bee0f663f20bdb56 not found: ID does not exist" containerID="53861abbf276133c162d226cfa525d384ef03f62fc102777bee0f663f20bdb56" Mar 11 12:01:25 crc kubenswrapper[4816]: I0311 12:01:25.089312 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53861abbf276133c162d226cfa525d384ef03f62fc102777bee0f663f20bdb56"} err="failed to get container status \"53861abbf276133c162d226cfa525d384ef03f62fc102777bee0f663f20bdb56\": rpc error: code = NotFound desc = could not find container \"53861abbf276133c162d226cfa525d384ef03f62fc102777bee0f663f20bdb56\": container with ID starting with 53861abbf276133c162d226cfa525d384ef03f62fc102777bee0f663f20bdb56 not found: ID does not exist" Mar 11 12:01:25 crc kubenswrapper[4816]: I0311 12:01:25.089336 4816 scope.go:117] "RemoveContainer" containerID="ad1b4e0d64cb2dee4b17f813f254e7a567bae3986d0eb4252bf8e91b220edf23" Mar 11 12:01:25 crc kubenswrapper[4816]: E0311 12:01:25.089682 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad1b4e0d64cb2dee4b17f813f254e7a567bae3986d0eb4252bf8e91b220edf23\": container with ID starting with ad1b4e0d64cb2dee4b17f813f254e7a567bae3986d0eb4252bf8e91b220edf23 not found: ID does not exist" containerID="ad1b4e0d64cb2dee4b17f813f254e7a567bae3986d0eb4252bf8e91b220edf23" Mar 11 12:01:25 crc kubenswrapper[4816]: I0311 12:01:25.089709 4816 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ad1b4e0d64cb2dee4b17f813f254e7a567bae3986d0eb4252bf8e91b220edf23"} err="failed to get container status \"ad1b4e0d64cb2dee4b17f813f254e7a567bae3986d0eb4252bf8e91b220edf23\": rpc error: code = NotFound desc = could not find container \"ad1b4e0d64cb2dee4b17f813f254e7a567bae3986d0eb4252bf8e91b220edf23\": container with ID starting with ad1b4e0d64cb2dee4b17f813f254e7a567bae3986d0eb4252bf8e91b220edf23 not found: ID does not exist" Mar 11 12:01:25 crc kubenswrapper[4816]: I0311 12:01:25.089723 4816 scope.go:117] "RemoveContainer" containerID="db5556901c7abc5e94f121924c25248aaeceba682acde63ea94811a5a4dd7b80" Mar 11 12:01:25 crc kubenswrapper[4816]: E0311 12:01:25.089933 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db5556901c7abc5e94f121924c25248aaeceba682acde63ea94811a5a4dd7b80\": container with ID starting with db5556901c7abc5e94f121924c25248aaeceba682acde63ea94811a5a4dd7b80 not found: ID does not exist" containerID="db5556901c7abc5e94f121924c25248aaeceba682acde63ea94811a5a4dd7b80" Mar 11 12:01:25 crc kubenswrapper[4816]: I0311 12:01:25.089953 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db5556901c7abc5e94f121924c25248aaeceba682acde63ea94811a5a4dd7b80"} err="failed to get container status \"db5556901c7abc5e94f121924c25248aaeceba682acde63ea94811a5a4dd7b80\": rpc error: code = NotFound desc = could not find container \"db5556901c7abc5e94f121924c25248aaeceba682acde63ea94811a5a4dd7b80\": container with ID starting with db5556901c7abc5e94f121924c25248aaeceba682acde63ea94811a5a4dd7b80 not found: ID does not exist" Mar 11 12:01:25 crc kubenswrapper[4816]: I0311 12:01:25.439094 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cg4jl" Mar 11 12:01:25 crc kubenswrapper[4816]: I0311 12:01:25.548087 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34f226df-3352-4423-822c-67891ad3a398-catalog-content\") pod \"34f226df-3352-4423-822c-67891ad3a398\" (UID: \"34f226df-3352-4423-822c-67891ad3a398\") " Mar 11 12:01:25 crc kubenswrapper[4816]: I0311 12:01:25.548141 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34f226df-3352-4423-822c-67891ad3a398-utilities\") pod \"34f226df-3352-4423-822c-67891ad3a398\" (UID: \"34f226df-3352-4423-822c-67891ad3a398\") " Mar 11 12:01:25 crc kubenswrapper[4816]: I0311 12:01:25.548186 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zp48p\" (UniqueName: \"kubernetes.io/projected/34f226df-3352-4423-822c-67891ad3a398-kube-api-access-zp48p\") pod \"34f226df-3352-4423-822c-67891ad3a398\" (UID: \"34f226df-3352-4423-822c-67891ad3a398\") " Mar 11 12:01:25 crc kubenswrapper[4816]: I0311 12:01:25.548984 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34f226df-3352-4423-822c-67891ad3a398-utilities" (OuterVolumeSpecName: "utilities") pod "34f226df-3352-4423-822c-67891ad3a398" (UID: "34f226df-3352-4423-822c-67891ad3a398"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:01:25 crc kubenswrapper[4816]: I0311 12:01:25.558565 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34f226df-3352-4423-822c-67891ad3a398-kube-api-access-zp48p" (OuterVolumeSpecName: "kube-api-access-zp48p") pod "34f226df-3352-4423-822c-67891ad3a398" (UID: "34f226df-3352-4423-822c-67891ad3a398"). InnerVolumeSpecName "kube-api-access-zp48p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:01:25 crc kubenswrapper[4816]: I0311 12:01:25.576484 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34f226df-3352-4423-822c-67891ad3a398-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "34f226df-3352-4423-822c-67891ad3a398" (UID: "34f226df-3352-4423-822c-67891ad3a398"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:01:25 crc kubenswrapper[4816]: I0311 12:01:25.649849 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34f226df-3352-4423-822c-67891ad3a398-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:25 crc kubenswrapper[4816]: I0311 12:01:25.649882 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34f226df-3352-4423-822c-67891ad3a398-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:25 crc kubenswrapper[4816]: I0311 12:01:25.649894 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zp48p\" (UniqueName: \"kubernetes.io/projected/34f226df-3352-4423-822c-67891ad3a398-kube-api-access-zp48p\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:25 crc kubenswrapper[4816]: I0311 12:01:25.856282 4816 ???:1] "http: TLS handshake error from 192.168.126.11:48634: no serving certificate available for the kubelet" Mar 11 12:01:25 crc kubenswrapper[4816]: I0311 12:01:25.992693 4816 generic.go:334] "Generic (PLEG): container finished" podID="34f226df-3352-4423-822c-67891ad3a398" containerID="db4ddcf6a4270a18cdecb4d0ae5c1351dad3710879661ddc36a379be1474f792" exitCode=0 Mar 11 12:01:25 crc kubenswrapper[4816]: I0311 12:01:25.992758 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cg4jl" Mar 11 12:01:25 crc kubenswrapper[4816]: I0311 12:01:25.992749 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cg4jl" event={"ID":"34f226df-3352-4423-822c-67891ad3a398","Type":"ContainerDied","Data":"db4ddcf6a4270a18cdecb4d0ae5c1351dad3710879661ddc36a379be1474f792"} Mar 11 12:01:25 crc kubenswrapper[4816]: I0311 12:01:25.992835 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cg4jl" event={"ID":"34f226df-3352-4423-822c-67891ad3a398","Type":"ContainerDied","Data":"d0c2bd9596db386896db7ff2b9f9f3f47d22ce171a97edaa6f0b88c2da2cae3b"} Mar 11 12:01:25 crc kubenswrapper[4816]: I0311 12:01:25.992870 4816 scope.go:117] "RemoveContainer" containerID="db4ddcf6a4270a18cdecb4d0ae5c1351dad3710879661ddc36a379be1474f792" Mar 11 12:01:26 crc kubenswrapper[4816]: I0311 12:01:26.008959 4816 scope.go:117] "RemoveContainer" containerID="796e9f8f75ca6804fd156b8767c81403ef730e702d67b109de63a9348c222382" Mar 11 12:01:26 crc kubenswrapper[4816]: I0311 12:01:26.019229 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cg4jl"] Mar 11 12:01:26 crc kubenswrapper[4816]: I0311 12:01:26.021854 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cg4jl"] Mar 11 12:01:26 crc kubenswrapper[4816]: I0311 12:01:26.047271 4816 scope.go:117] "RemoveContainer" containerID="8d741bef2acf64d21213a84551d7a6097e01d2bf4cbc277a3b9b65411d993384" Mar 11 12:01:26 crc kubenswrapper[4816]: I0311 12:01:26.059534 4816 scope.go:117] "RemoveContainer" containerID="db4ddcf6a4270a18cdecb4d0ae5c1351dad3710879661ddc36a379be1474f792" Mar 11 12:01:26 crc kubenswrapper[4816]: E0311 12:01:26.060009 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"db4ddcf6a4270a18cdecb4d0ae5c1351dad3710879661ddc36a379be1474f792\": container with ID starting with db4ddcf6a4270a18cdecb4d0ae5c1351dad3710879661ddc36a379be1474f792 not found: ID does not exist" containerID="db4ddcf6a4270a18cdecb4d0ae5c1351dad3710879661ddc36a379be1474f792" Mar 11 12:01:26 crc kubenswrapper[4816]: I0311 12:01:26.060057 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db4ddcf6a4270a18cdecb4d0ae5c1351dad3710879661ddc36a379be1474f792"} err="failed to get container status \"db4ddcf6a4270a18cdecb4d0ae5c1351dad3710879661ddc36a379be1474f792\": rpc error: code = NotFound desc = could not find container \"db4ddcf6a4270a18cdecb4d0ae5c1351dad3710879661ddc36a379be1474f792\": container with ID starting with db4ddcf6a4270a18cdecb4d0ae5c1351dad3710879661ddc36a379be1474f792 not found: ID does not exist" Mar 11 12:01:26 crc kubenswrapper[4816]: I0311 12:01:26.060087 4816 scope.go:117] "RemoveContainer" containerID="796e9f8f75ca6804fd156b8767c81403ef730e702d67b109de63a9348c222382" Mar 11 12:01:26 crc kubenswrapper[4816]: E0311 12:01:26.060524 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"796e9f8f75ca6804fd156b8767c81403ef730e702d67b109de63a9348c222382\": container with ID starting with 796e9f8f75ca6804fd156b8767c81403ef730e702d67b109de63a9348c222382 not found: ID does not exist" containerID="796e9f8f75ca6804fd156b8767c81403ef730e702d67b109de63a9348c222382" Mar 11 12:01:26 crc kubenswrapper[4816]: I0311 12:01:26.060552 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"796e9f8f75ca6804fd156b8767c81403ef730e702d67b109de63a9348c222382"} err="failed to get container status \"796e9f8f75ca6804fd156b8767c81403ef730e702d67b109de63a9348c222382\": rpc error: code = NotFound desc = could not find container \"796e9f8f75ca6804fd156b8767c81403ef730e702d67b109de63a9348c222382\": container with ID 
starting with 796e9f8f75ca6804fd156b8767c81403ef730e702d67b109de63a9348c222382 not found: ID does not exist" Mar 11 12:01:26 crc kubenswrapper[4816]: I0311 12:01:26.060574 4816 scope.go:117] "RemoveContainer" containerID="8d741bef2acf64d21213a84551d7a6097e01d2bf4cbc277a3b9b65411d993384" Mar 11 12:01:26 crc kubenswrapper[4816]: E0311 12:01:26.060833 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d741bef2acf64d21213a84551d7a6097e01d2bf4cbc277a3b9b65411d993384\": container with ID starting with 8d741bef2acf64d21213a84551d7a6097e01d2bf4cbc277a3b9b65411d993384 not found: ID does not exist" containerID="8d741bef2acf64d21213a84551d7a6097e01d2bf4cbc277a3b9b65411d993384" Mar 11 12:01:26 crc kubenswrapper[4816]: I0311 12:01:26.060853 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d741bef2acf64d21213a84551d7a6097e01d2bf4cbc277a3b9b65411d993384"} err="failed to get container status \"8d741bef2acf64d21213a84551d7a6097e01d2bf4cbc277a3b9b65411d993384\": rpc error: code = NotFound desc = could not find container \"8d741bef2acf64d21213a84551d7a6097e01d2bf4cbc277a3b9b65411d993384\": container with ID starting with 8d741bef2acf64d21213a84551d7a6097e01d2bf4cbc277a3b9b65411d993384 not found: ID does not exist" Mar 11 12:01:26 crc kubenswrapper[4816]: I0311 12:01:26.136979 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34f226df-3352-4423-822c-67891ad3a398" path="/var/lib/kubelet/pods/34f226df-3352-4423-822c-67891ad3a398/volumes" Mar 11 12:01:26 crc kubenswrapper[4816]: I0311 12:01:26.137665 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffe46307-0d92-4864-9aa4-b0ca2fc641d0" path="/var/lib/kubelet/pods/ffe46307-0d92-4864-9aa4-b0ca2fc641d0/volumes" Mar 11 12:01:28 crc kubenswrapper[4816]: I0311 12:01:28.527330 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-568c67664d-76gf4"] Mar 11 12:01:28 crc kubenswrapper[4816]: I0311 12:01:28.527926 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-568c67664d-76gf4" podUID="0eabc434-3f96-4124-9afc-ecb2466f2104" containerName="controller-manager" containerID="cri-o://422a80d9bd7c5ac865791576cd3ec16acf27ba26a84e00b3479a440a68ffcaf7" gracePeriod=30 Mar 11 12:01:28 crc kubenswrapper[4816]: I0311 12:01:28.537652 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk"] Mar 11 12:01:28 crc kubenswrapper[4816]: I0311 12:01:28.537978 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk" podUID="e00e505e-4736-4aee-b340-ef223d36cf41" containerName="route-controller-manager" containerID="cri-o://c88e59606b391072deb5c308ca0b530083322e3151190cbbfebe8fb7af0870a2" gracePeriod=30 Mar 11 12:01:28 crc kubenswrapper[4816]: I0311 12:01:28.626547 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8r7jt"] Mar 11 12:01:28 crc kubenswrapper[4816]: I0311 12:01:28.626759 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8r7jt" podUID="d06617bd-ff11-42b8-9b84-e856c8c3c9eb" containerName="registry-server" containerID="cri-o://3c59f225be65141f4253e97dee4dc069f053c3b693ce3684d2ab193b505de29a" gracePeriod=2 Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.014295 4816 generic.go:334] "Generic (PLEG): container finished" podID="e00e505e-4736-4aee-b340-ef223d36cf41" containerID="c88e59606b391072deb5c308ca0b530083322e3151190cbbfebe8fb7af0870a2" exitCode=0 Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.014378 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk" event={"ID":"e00e505e-4736-4aee-b340-ef223d36cf41","Type":"ContainerDied","Data":"c88e59606b391072deb5c308ca0b530083322e3151190cbbfebe8fb7af0870a2"} Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.843297 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk" Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.849986 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8r7jt" Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.855892 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-568c67664d-76gf4" Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.874513 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw"] Mar 11 12:01:29 crc kubenswrapper[4816]: E0311 12:01:29.876280 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34f226df-3352-4423-822c-67891ad3a398" containerName="registry-server" Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.876305 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="34f226df-3352-4423-822c-67891ad3a398" containerName="registry-server" Mar 11 12:01:29 crc kubenswrapper[4816]: E0311 12:01:29.876325 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffe46307-0d92-4864-9aa4-b0ca2fc641d0" containerName="extract-content" Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.876332 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffe46307-0d92-4864-9aa4-b0ca2fc641d0" containerName="extract-content" Mar 11 12:01:29 crc kubenswrapper[4816]: E0311 12:01:29.876345 4816 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="34f226df-3352-4423-822c-67891ad3a398" containerName="extract-utilities" Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.876354 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="34f226df-3352-4423-822c-67891ad3a398" containerName="extract-utilities" Mar 11 12:01:29 crc kubenswrapper[4816]: E0311 12:01:29.876363 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d06617bd-ff11-42b8-9b84-e856c8c3c9eb" containerName="extract-content" Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.876369 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="d06617bd-ff11-42b8-9b84-e856c8c3c9eb" containerName="extract-content" Mar 11 12:01:29 crc kubenswrapper[4816]: E0311 12:01:29.876377 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d06617bd-ff11-42b8-9b84-e856c8c3c9eb" containerName="registry-server" Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.876383 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="d06617bd-ff11-42b8-9b84-e856c8c3c9eb" containerName="registry-server" Mar 11 12:01:29 crc kubenswrapper[4816]: E0311 12:01:29.876395 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e00e505e-4736-4aee-b340-ef223d36cf41" containerName="route-controller-manager" Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.876402 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="e00e505e-4736-4aee-b340-ef223d36cf41" containerName="route-controller-manager" Mar 11 12:01:29 crc kubenswrapper[4816]: E0311 12:01:29.876412 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d06617bd-ff11-42b8-9b84-e856c8c3c9eb" containerName="extract-utilities" Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.876420 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="d06617bd-ff11-42b8-9b84-e856c8c3c9eb" containerName="extract-utilities" Mar 11 12:01:29 crc kubenswrapper[4816]: E0311 12:01:29.876432 4816 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="ffe46307-0d92-4864-9aa4-b0ca2fc641d0" containerName="extract-utilities" Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.876441 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffe46307-0d92-4864-9aa4-b0ca2fc641d0" containerName="extract-utilities" Mar 11 12:01:29 crc kubenswrapper[4816]: E0311 12:01:29.876453 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffe46307-0d92-4864-9aa4-b0ca2fc641d0" containerName="registry-server" Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.876463 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffe46307-0d92-4864-9aa4-b0ca2fc641d0" containerName="registry-server" Mar 11 12:01:29 crc kubenswrapper[4816]: E0311 12:01:29.876477 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34f226df-3352-4423-822c-67891ad3a398" containerName="extract-content" Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.876486 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="34f226df-3352-4423-822c-67891ad3a398" containerName="extract-content" Mar 11 12:01:29 crc kubenswrapper[4816]: E0311 12:01:29.876496 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0eabc434-3f96-4124-9afc-ecb2466f2104" containerName="controller-manager" Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.876506 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="0eabc434-3f96-4124-9afc-ecb2466f2104" containerName="controller-manager" Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.876734 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffe46307-0d92-4864-9aa4-b0ca2fc641d0" containerName="registry-server" Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.876751 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="d06617bd-ff11-42b8-9b84-e856c8c3c9eb" containerName="registry-server" Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.876760 4816 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="0eabc434-3f96-4124-9afc-ecb2466f2104" containerName="controller-manager" Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.876770 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="e00e505e-4736-4aee-b340-ef223d36cf41" containerName="route-controller-manager" Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.876783 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="34f226df-3352-4423-822c-67891ad3a398" containerName="registry-server" Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.877404 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw" Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.895462 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw"] Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.901874 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pklmp\" (UniqueName: \"kubernetes.io/projected/0eabc434-3f96-4124-9afc-ecb2466f2104-kube-api-access-pklmp\") pod \"0eabc434-3f96-4124-9afc-ecb2466f2104\" (UID: \"0eabc434-3f96-4124-9afc-ecb2466f2104\") " Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.901936 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e00e505e-4736-4aee-b340-ef223d36cf41-client-ca\") pod \"e00e505e-4736-4aee-b340-ef223d36cf41\" (UID: \"e00e505e-4736-4aee-b340-ef223d36cf41\") " Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.901967 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0eabc434-3f96-4124-9afc-ecb2466f2104-serving-cert\") pod \"0eabc434-3f96-4124-9afc-ecb2466f2104\" (UID: \"0eabc434-3f96-4124-9afc-ecb2466f2104\") " Mar 
11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.902011 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e00e505e-4736-4aee-b340-ef223d36cf41-config\") pod \"e00e505e-4736-4aee-b340-ef223d36cf41\" (UID: \"e00e505e-4736-4aee-b340-ef223d36cf41\") " Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.902045 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0eabc434-3f96-4124-9afc-ecb2466f2104-config\") pod \"0eabc434-3f96-4124-9afc-ecb2466f2104\" (UID: \"0eabc434-3f96-4124-9afc-ecb2466f2104\") " Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.902085 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d06617bd-ff11-42b8-9b84-e856c8c3c9eb-utilities\") pod \"d06617bd-ff11-42b8-9b84-e856c8c3c9eb\" (UID: \"d06617bd-ff11-42b8-9b84-e856c8c3c9eb\") " Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.902122 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0eabc434-3f96-4124-9afc-ecb2466f2104-proxy-ca-bundles\") pod \"0eabc434-3f96-4124-9afc-ecb2466f2104\" (UID: \"0eabc434-3f96-4124-9afc-ecb2466f2104\") " Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.902165 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqzfs\" (UniqueName: \"kubernetes.io/projected/e00e505e-4736-4aee-b340-ef223d36cf41-kube-api-access-lqzfs\") pod \"e00e505e-4736-4aee-b340-ef223d36cf41\" (UID: \"e00e505e-4736-4aee-b340-ef223d36cf41\") " Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.902271 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0eabc434-3f96-4124-9afc-ecb2466f2104-client-ca\") pod 
\"0eabc434-3f96-4124-9afc-ecb2466f2104\" (UID: \"0eabc434-3f96-4124-9afc-ecb2466f2104\") " Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.902312 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e00e505e-4736-4aee-b340-ef223d36cf41-serving-cert\") pod \"e00e505e-4736-4aee-b340-ef223d36cf41\" (UID: \"e00e505e-4736-4aee-b340-ef223d36cf41\") " Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.902345 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d06617bd-ff11-42b8-9b84-e856c8c3c9eb-catalog-content\") pod \"d06617bd-ff11-42b8-9b84-e856c8c3c9eb\" (UID: \"d06617bd-ff11-42b8-9b84-e856c8c3c9eb\") " Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.902376 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgms4\" (UniqueName: \"kubernetes.io/projected/d06617bd-ff11-42b8-9b84-e856c8c3c9eb-kube-api-access-dgms4\") pod \"d06617bd-ff11-42b8-9b84-e856c8c3c9eb\" (UID: \"d06617bd-ff11-42b8-9b84-e856c8c3c9eb\") " Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.902690 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f862e1d6-c9a4-432c-b01f-610dac0371d6-client-ca\") pod \"route-controller-manager-74bdd87f68-49lvw\" (UID: \"f862e1d6-c9a4-432c-b01f-610dac0371d6\") " pod="openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw" Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.902727 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f862e1d6-c9a4-432c-b01f-610dac0371d6-serving-cert\") pod \"route-controller-manager-74bdd87f68-49lvw\" (UID: \"f862e1d6-c9a4-432c-b01f-610dac0371d6\") " 
pod="openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw" Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.902780 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rxvx\" (UniqueName: \"kubernetes.io/projected/f862e1d6-c9a4-432c-b01f-610dac0371d6-kube-api-access-9rxvx\") pod \"route-controller-manager-74bdd87f68-49lvw\" (UID: \"f862e1d6-c9a4-432c-b01f-610dac0371d6\") " pod="openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw" Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.902848 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f862e1d6-c9a4-432c-b01f-610dac0371d6-config\") pod \"route-controller-manager-74bdd87f68-49lvw\" (UID: \"f862e1d6-c9a4-432c-b01f-610dac0371d6\") " pod="openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw" Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.903566 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e00e505e-4736-4aee-b340-ef223d36cf41-client-ca" (OuterVolumeSpecName: "client-ca") pod "e00e505e-4736-4aee-b340-ef223d36cf41" (UID: "e00e505e-4736-4aee-b340-ef223d36cf41"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.903605 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0eabc434-3f96-4124-9afc-ecb2466f2104-config" (OuterVolumeSpecName: "config") pod "0eabc434-3f96-4124-9afc-ecb2466f2104" (UID: "0eabc434-3f96-4124-9afc-ecb2466f2104"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.904133 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d06617bd-ff11-42b8-9b84-e856c8c3c9eb-utilities" (OuterVolumeSpecName: "utilities") pod "d06617bd-ff11-42b8-9b84-e856c8c3c9eb" (UID: "d06617bd-ff11-42b8-9b84-e856c8c3c9eb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.904992 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0eabc434-3f96-4124-9afc-ecb2466f2104-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "0eabc434-3f96-4124-9afc-ecb2466f2104" (UID: "0eabc434-3f96-4124-9afc-ecb2466f2104"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.905156 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e00e505e-4736-4aee-b340-ef223d36cf41-config" (OuterVolumeSpecName: "config") pod "e00e505e-4736-4aee-b340-ef223d36cf41" (UID: "e00e505e-4736-4aee-b340-ef223d36cf41"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.906873 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0eabc434-3f96-4124-9afc-ecb2466f2104-client-ca" (OuterVolumeSpecName: "client-ca") pod "0eabc434-3f96-4124-9afc-ecb2466f2104" (UID: "0eabc434-3f96-4124-9afc-ecb2466f2104"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.908610 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0eabc434-3f96-4124-9afc-ecb2466f2104-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0eabc434-3f96-4124-9afc-ecb2466f2104" (UID: "0eabc434-3f96-4124-9afc-ecb2466f2104"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.913866 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e00e505e-4736-4aee-b340-ef223d36cf41-kube-api-access-lqzfs" (OuterVolumeSpecName: "kube-api-access-lqzfs") pod "e00e505e-4736-4aee-b340-ef223d36cf41" (UID: "e00e505e-4736-4aee-b340-ef223d36cf41"). InnerVolumeSpecName "kube-api-access-lqzfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.914052 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0eabc434-3f96-4124-9afc-ecb2466f2104-kube-api-access-pklmp" (OuterVolumeSpecName: "kube-api-access-pklmp") pod "0eabc434-3f96-4124-9afc-ecb2466f2104" (UID: "0eabc434-3f96-4124-9afc-ecb2466f2104"). InnerVolumeSpecName "kube-api-access-pklmp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.924450 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d06617bd-ff11-42b8-9b84-e856c8c3c9eb-kube-api-access-dgms4" (OuterVolumeSpecName: "kube-api-access-dgms4") pod "d06617bd-ff11-42b8-9b84-e856c8c3c9eb" (UID: "d06617bd-ff11-42b8-9b84-e856c8c3c9eb"). InnerVolumeSpecName "kube-api-access-dgms4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.925426 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e00e505e-4736-4aee-b340-ef223d36cf41-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e00e505e-4736-4aee-b340-ef223d36cf41" (UID: "e00e505e-4736-4aee-b340-ef223d36cf41"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.003878 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f862e1d6-c9a4-432c-b01f-610dac0371d6-config\") pod \"route-controller-manager-74bdd87f68-49lvw\" (UID: \"f862e1d6-c9a4-432c-b01f-610dac0371d6\") " pod="openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw" Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.003955 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f862e1d6-c9a4-432c-b01f-610dac0371d6-client-ca\") pod \"route-controller-manager-74bdd87f68-49lvw\" (UID: \"f862e1d6-c9a4-432c-b01f-610dac0371d6\") " pod="openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw" Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.003981 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f862e1d6-c9a4-432c-b01f-610dac0371d6-serving-cert\") pod \"route-controller-manager-74bdd87f68-49lvw\" (UID: \"f862e1d6-c9a4-432c-b01f-610dac0371d6\") " pod="openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw" Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.004018 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rxvx\" (UniqueName: 
\"kubernetes.io/projected/f862e1d6-c9a4-432c-b01f-610dac0371d6-kube-api-access-9rxvx\") pod \"route-controller-manager-74bdd87f68-49lvw\" (UID: \"f862e1d6-c9a4-432c-b01f-610dac0371d6\") " pod="openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw" Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.004062 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0eabc434-3f96-4124-9afc-ecb2466f2104-client-ca\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.004074 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e00e505e-4736-4aee-b340-ef223d36cf41-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.004083 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgms4\" (UniqueName: \"kubernetes.io/projected/d06617bd-ff11-42b8-9b84-e856c8c3c9eb-kube-api-access-dgms4\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.004094 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pklmp\" (UniqueName: \"kubernetes.io/projected/0eabc434-3f96-4124-9afc-ecb2466f2104-kube-api-access-pklmp\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.004102 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e00e505e-4736-4aee-b340-ef223d36cf41-client-ca\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.004112 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0eabc434-3f96-4124-9afc-ecb2466f2104-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.004122 4816 reconciler_common.go:293] "Volume detached for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/e00e505e-4736-4aee-b340-ef223d36cf41-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.004134 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0eabc434-3f96-4124-9afc-ecb2466f2104-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.004261 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d06617bd-ff11-42b8-9b84-e856c8c3c9eb-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.004405 4816 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0eabc434-3f96-4124-9afc-ecb2466f2104-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.004449 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqzfs\" (UniqueName: \"kubernetes.io/projected/e00e505e-4736-4aee-b340-ef223d36cf41-kube-api-access-lqzfs\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.005833 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f862e1d6-c9a4-432c-b01f-610dac0371d6-client-ca\") pod \"route-controller-manager-74bdd87f68-49lvw\" (UID: \"f862e1d6-c9a4-432c-b01f-610dac0371d6\") " pod="openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw" Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.006004 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f862e1d6-c9a4-432c-b01f-610dac0371d6-config\") pod \"route-controller-manager-74bdd87f68-49lvw\" (UID: \"f862e1d6-c9a4-432c-b01f-610dac0371d6\") " 
pod="openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw" Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.007413 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f862e1d6-c9a4-432c-b01f-610dac0371d6-serving-cert\") pod \"route-controller-manager-74bdd87f68-49lvw\" (UID: \"f862e1d6-c9a4-432c-b01f-610dac0371d6\") " pod="openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw" Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.021225 4816 generic.go:334] "Generic (PLEG): container finished" podID="0eabc434-3f96-4124-9afc-ecb2466f2104" containerID="422a80d9bd7c5ac865791576cd3ec16acf27ba26a84e00b3479a440a68ffcaf7" exitCode=0 Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.021318 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-568c67664d-76gf4" event={"ID":"0eabc434-3f96-4124-9afc-ecb2466f2104","Type":"ContainerDied","Data":"422a80d9bd7c5ac865791576cd3ec16acf27ba26a84e00b3479a440a68ffcaf7"} Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.021351 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-568c67664d-76gf4" event={"ID":"0eabc434-3f96-4124-9afc-ecb2466f2104","Type":"ContainerDied","Data":"47dc6f78b21758a900162faa2d749022ed0a0a88a3ab047d3b81d585e1f50879"} Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.021368 4816 scope.go:117] "RemoveContainer" containerID="422a80d9bd7c5ac865791576cd3ec16acf27ba26a84e00b3479a440a68ffcaf7" Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.021489 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-568c67664d-76gf4" Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.033798 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rxvx\" (UniqueName: \"kubernetes.io/projected/f862e1d6-c9a4-432c-b01f-610dac0371d6-kube-api-access-9rxvx\") pod \"route-controller-manager-74bdd87f68-49lvw\" (UID: \"f862e1d6-c9a4-432c-b01f-610dac0371d6\") " pod="openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw" Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.036282 4816 generic.go:334] "Generic (PLEG): container finished" podID="d06617bd-ff11-42b8-9b84-e856c8c3c9eb" containerID="3c59f225be65141f4253e97dee4dc069f053c3b693ce3684d2ab193b505de29a" exitCode=0 Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.036430 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8r7jt" Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.036393 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8r7jt" event={"ID":"d06617bd-ff11-42b8-9b84-e856c8c3c9eb","Type":"ContainerDied","Data":"3c59f225be65141f4253e97dee4dc069f053c3b693ce3684d2ab193b505de29a"} Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.036481 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8r7jt" event={"ID":"d06617bd-ff11-42b8-9b84-e856c8c3c9eb","Type":"ContainerDied","Data":"955eae43fa10fa99fdb4e5d4b56b2dccf5fef672fa575257d946cf0938c80e99"} Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.038087 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk" event={"ID":"e00e505e-4736-4aee-b340-ef223d36cf41","Type":"ContainerDied","Data":"867551228a25bbc0bc0f926b4429da94d98dd14f3c9ae4a640117cf20bb2787e"} Mar 11 12:01:30 crc 
kubenswrapper[4816]: I0311 12:01:30.038170 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk" Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.051291 4816 scope.go:117] "RemoveContainer" containerID="422a80d9bd7c5ac865791576cd3ec16acf27ba26a84e00b3479a440a68ffcaf7" Mar 11 12:01:30 crc kubenswrapper[4816]: E0311 12:01:30.055545 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"422a80d9bd7c5ac865791576cd3ec16acf27ba26a84e00b3479a440a68ffcaf7\": container with ID starting with 422a80d9bd7c5ac865791576cd3ec16acf27ba26a84e00b3479a440a68ffcaf7 not found: ID does not exist" containerID="422a80d9bd7c5ac865791576cd3ec16acf27ba26a84e00b3479a440a68ffcaf7" Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.055613 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"422a80d9bd7c5ac865791576cd3ec16acf27ba26a84e00b3479a440a68ffcaf7"} err="failed to get container status \"422a80d9bd7c5ac865791576cd3ec16acf27ba26a84e00b3479a440a68ffcaf7\": rpc error: code = NotFound desc = could not find container \"422a80d9bd7c5ac865791576cd3ec16acf27ba26a84e00b3479a440a68ffcaf7\": container with ID starting with 422a80d9bd7c5ac865791576cd3ec16acf27ba26a84e00b3479a440a68ffcaf7 not found: ID does not exist" Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.055660 4816 scope.go:117] "RemoveContainer" containerID="3c59f225be65141f4253e97dee4dc069f053c3b693ce3684d2ab193b505de29a" Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.056768 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-568c67664d-76gf4"] Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.060348 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-controller-manager/controller-manager-568c67664d-76gf4"] Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.070667 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d06617bd-ff11-42b8-9b84-e856c8c3c9eb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d06617bd-ff11-42b8-9b84-e856c8c3c9eb" (UID: "d06617bd-ff11-42b8-9b84-e856c8c3c9eb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.077927 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk"] Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.082592 4816 scope.go:117] "RemoveContainer" containerID="278bfa92b7f4cffdd11e1b735e41bf4ff1bd011a4b249bf92a3fc0359707218a" Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.092672 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk"] Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.104714 4816 scope.go:117] "RemoveContainer" containerID="380c937635389798d03c93252e524044723f38b74947be3bf57ebddd7b48d757" Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.106022 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d06617bd-ff11-42b8-9b84-e856c8c3c9eb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.122001 4816 scope.go:117] "RemoveContainer" containerID="3c59f225be65141f4253e97dee4dc069f053c3b693ce3684d2ab193b505de29a" Mar 11 12:01:30 crc kubenswrapper[4816]: E0311 12:01:30.122768 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c59f225be65141f4253e97dee4dc069f053c3b693ce3684d2ab193b505de29a\": container 
with ID starting with 3c59f225be65141f4253e97dee4dc069f053c3b693ce3684d2ab193b505de29a not found: ID does not exist" containerID="3c59f225be65141f4253e97dee4dc069f053c3b693ce3684d2ab193b505de29a" Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.122819 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c59f225be65141f4253e97dee4dc069f053c3b693ce3684d2ab193b505de29a"} err="failed to get container status \"3c59f225be65141f4253e97dee4dc069f053c3b693ce3684d2ab193b505de29a\": rpc error: code = NotFound desc = could not find container \"3c59f225be65141f4253e97dee4dc069f053c3b693ce3684d2ab193b505de29a\": container with ID starting with 3c59f225be65141f4253e97dee4dc069f053c3b693ce3684d2ab193b505de29a not found: ID does not exist" Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.122853 4816 scope.go:117] "RemoveContainer" containerID="278bfa92b7f4cffdd11e1b735e41bf4ff1bd011a4b249bf92a3fc0359707218a" Mar 11 12:01:30 crc kubenswrapper[4816]: E0311 12:01:30.123235 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"278bfa92b7f4cffdd11e1b735e41bf4ff1bd011a4b249bf92a3fc0359707218a\": container with ID starting with 278bfa92b7f4cffdd11e1b735e41bf4ff1bd011a4b249bf92a3fc0359707218a not found: ID does not exist" containerID="278bfa92b7f4cffdd11e1b735e41bf4ff1bd011a4b249bf92a3fc0359707218a" Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.123500 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"278bfa92b7f4cffdd11e1b735e41bf4ff1bd011a4b249bf92a3fc0359707218a"} err="failed to get container status \"278bfa92b7f4cffdd11e1b735e41bf4ff1bd011a4b249bf92a3fc0359707218a\": rpc error: code = NotFound desc = could not find container \"278bfa92b7f4cffdd11e1b735e41bf4ff1bd011a4b249bf92a3fc0359707218a\": container with ID starting with 278bfa92b7f4cffdd11e1b735e41bf4ff1bd011a4b249bf92a3fc0359707218a not 
found: ID does not exist" Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.123545 4816 scope.go:117] "RemoveContainer" containerID="380c937635389798d03c93252e524044723f38b74947be3bf57ebddd7b48d757" Mar 11 12:01:30 crc kubenswrapper[4816]: E0311 12:01:30.123990 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"380c937635389798d03c93252e524044723f38b74947be3bf57ebddd7b48d757\": container with ID starting with 380c937635389798d03c93252e524044723f38b74947be3bf57ebddd7b48d757 not found: ID does not exist" containerID="380c937635389798d03c93252e524044723f38b74947be3bf57ebddd7b48d757" Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.124025 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"380c937635389798d03c93252e524044723f38b74947be3bf57ebddd7b48d757"} err="failed to get container status \"380c937635389798d03c93252e524044723f38b74947be3bf57ebddd7b48d757\": rpc error: code = NotFound desc = could not find container \"380c937635389798d03c93252e524044723f38b74947be3bf57ebddd7b48d757\": container with ID starting with 380c937635389798d03c93252e524044723f38b74947be3bf57ebddd7b48d757 not found: ID does not exist" Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.124047 4816 scope.go:117] "RemoveContainer" containerID="c88e59606b391072deb5c308ca0b530083322e3151190cbbfebe8fb7af0870a2" Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.140987 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0eabc434-3f96-4124-9afc-ecb2466f2104" path="/var/lib/kubelet/pods/0eabc434-3f96-4124-9afc-ecb2466f2104/volumes" Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.141727 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e00e505e-4736-4aee-b340-ef223d36cf41" path="/var/lib/kubelet/pods/e00e505e-4736-4aee-b340-ef223d36cf41/volumes" Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 
12:01:30.258147 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw" Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.375534 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8r7jt"] Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.378481 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8r7jt"] Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.483437 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw"] Mar 11 12:01:30 crc kubenswrapper[4816]: W0311 12:01:30.493940 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf862e1d6_c9a4_432c_b01f_610dac0371d6.slice/crio-b5aa380dfae5e5fb091865b30d4c19f63f255e625fdd2414ee2e29d633336c4c WatchSource:0}: Error finding container b5aa380dfae5e5fb091865b30d4c19f63f255e625fdd2414ee2e29d633336c4c: Status 404 returned error can't find the container with id b5aa380dfae5e5fb091865b30d4c19f63f255e625fdd2414ee2e29d633336c4c Mar 11 12:01:31 crc kubenswrapper[4816]: I0311 12:01:31.000042 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9fv28" Mar 11 12:01:31 crc kubenswrapper[4816]: I0311 12:01:31.044902 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw" event={"ID":"f862e1d6-c9a4-432c-b01f-610dac0371d6","Type":"ContainerStarted","Data":"1113e474910e2bfea68438fabb0045c35bed39208a4635e0633ad3f3178ea6a9"} Mar 11 12:01:31 crc kubenswrapper[4816]: I0311 12:01:31.044949 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw" event={"ID":"f862e1d6-c9a4-432c-b01f-610dac0371d6","Type":"ContainerStarted","Data":"b5aa380dfae5e5fb091865b30d4c19f63f255e625fdd2414ee2e29d633336c4c"} Mar 11 12:01:31 crc kubenswrapper[4816]: I0311 12:01:31.045169 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw" Mar 11 12:01:31 crc kubenswrapper[4816]: I0311 12:01:31.047761 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9fv28" Mar 11 12:01:31 crc kubenswrapper[4816]: I0311 12:01:31.059462 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw" podStartSLOduration=3.059448494 podStartE2EDuration="3.059448494s" podCreationTimestamp="2026-03-11 12:01:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:01:31.058561198 +0000 UTC m=+177.649825165" watchObservedRunningTime="2026-03-11 12:01:31.059448494 +0000 UTC m=+177.650712461" Mar 11 12:01:31 crc kubenswrapper[4816]: I0311 12:01:31.121061 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw" Mar 11 12:01:31 crc kubenswrapper[4816]: I0311 12:01:31.222385 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jwq6f" Mar 11 12:01:31 crc kubenswrapper[4816]: I0311 12:01:31.267866 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jwq6f" Mar 11 12:01:31 crc kubenswrapper[4816]: I0311 12:01:31.398055 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-s2dh2" Mar 11 12:01:31 crc kubenswrapper[4816]: I0311 12:01:31.433733 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-s2dh2" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.101260 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-67c5474778-rwg6j"] Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.102182 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.104795 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.105116 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.105115 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.105206 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.109870 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.112354 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.113089 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-67c5474778-rwg6j"] Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 
12:01:32.116967 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.130805 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67d6304f-5acd-48ff-9d06-b221c14f80fc-serving-cert\") pod \"controller-manager-67c5474778-rwg6j\" (UID: \"67d6304f-5acd-48ff-9d06-b221c14f80fc\") " pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.130836 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/67d6304f-5acd-48ff-9d06-b221c14f80fc-proxy-ca-bundles\") pod \"controller-manager-67c5474778-rwg6j\" (UID: \"67d6304f-5acd-48ff-9d06-b221c14f80fc\") " pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.130861 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67d6304f-5acd-48ff-9d06-b221c14f80fc-client-ca\") pod \"controller-manager-67c5474778-rwg6j\" (UID: \"67d6304f-5acd-48ff-9d06-b221c14f80fc\") " pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.130880 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67d6304f-5acd-48ff-9d06-b221c14f80fc-config\") pod \"controller-manager-67c5474778-rwg6j\" (UID: \"67d6304f-5acd-48ff-9d06-b221c14f80fc\") " pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.130904 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxdtt\" (UniqueName: \"kubernetes.io/projected/67d6304f-5acd-48ff-9d06-b221c14f80fc-kube-api-access-cxdtt\") pod \"controller-manager-67c5474778-rwg6j\" (UID: \"67d6304f-5acd-48ff-9d06-b221c14f80fc\") " pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.137397 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d06617bd-ff11-42b8-9b84-e856c8c3c9eb" path="/var/lib/kubelet/pods/d06617bd-ff11-42b8-9b84-e856c8c3c9eb/volumes" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.232095 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67d6304f-5acd-48ff-9d06-b221c14f80fc-serving-cert\") pod \"controller-manager-67c5474778-rwg6j\" (UID: \"67d6304f-5acd-48ff-9d06-b221c14f80fc\") " pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.232133 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/67d6304f-5acd-48ff-9d06-b221c14f80fc-proxy-ca-bundles\") pod \"controller-manager-67c5474778-rwg6j\" (UID: \"67d6304f-5acd-48ff-9d06-b221c14f80fc\") " pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.232159 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67d6304f-5acd-48ff-9d06-b221c14f80fc-client-ca\") pod \"controller-manager-67c5474778-rwg6j\" (UID: \"67d6304f-5acd-48ff-9d06-b221c14f80fc\") " pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.232175 4816 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67d6304f-5acd-48ff-9d06-b221c14f80fc-config\") pod \"controller-manager-67c5474778-rwg6j\" (UID: \"67d6304f-5acd-48ff-9d06-b221c14f80fc\") " pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.232201 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxdtt\" (UniqueName: \"kubernetes.io/projected/67d6304f-5acd-48ff-9d06-b221c14f80fc-kube-api-access-cxdtt\") pod \"controller-manager-67c5474778-rwg6j\" (UID: \"67d6304f-5acd-48ff-9d06-b221c14f80fc\") " pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.233788 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67d6304f-5acd-48ff-9d06-b221c14f80fc-client-ca\") pod \"controller-manager-67c5474778-rwg6j\" (UID: \"67d6304f-5acd-48ff-9d06-b221c14f80fc\") " pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.234266 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/67d6304f-5acd-48ff-9d06-b221c14f80fc-proxy-ca-bundles\") pod \"controller-manager-67c5474778-rwg6j\" (UID: \"67d6304f-5acd-48ff-9d06-b221c14f80fc\") " pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.234349 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67d6304f-5acd-48ff-9d06-b221c14f80fc-config\") pod \"controller-manager-67c5474778-rwg6j\" (UID: \"67d6304f-5acd-48ff-9d06-b221c14f80fc\") " pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.243010 4816 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67d6304f-5acd-48ff-9d06-b221c14f80fc-serving-cert\") pod \"controller-manager-67c5474778-rwg6j\" (UID: \"67d6304f-5acd-48ff-9d06-b221c14f80fc\") " pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.251846 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxdtt\" (UniqueName: \"kubernetes.io/projected/67d6304f-5acd-48ff-9d06-b221c14f80fc-kube-api-access-cxdtt\") pod \"controller-manager-67c5474778-rwg6j\" (UID: \"67d6304f-5acd-48ff-9d06-b221c14f80fc\") " pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.437119 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.629901 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-67c5474778-rwg6j"] Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.972098 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rlvrz" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.972169 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rlvrz" Mar 11 12:01:33 crc kubenswrapper[4816]: I0311 12:01:33.015754 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rlvrz" Mar 11 12:01:33 crc kubenswrapper[4816]: I0311 12:01:33.063786 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" 
event={"ID":"67d6304f-5acd-48ff-9d06-b221c14f80fc","Type":"ContainerStarted","Data":"64618ab3206e35f87bc9a73a3adb11b24a0d0a08e8a4aa35de128b5b11632d56"} Mar 11 12:01:33 crc kubenswrapper[4816]: I0311 12:01:33.063857 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" event={"ID":"67d6304f-5acd-48ff-9d06-b221c14f80fc","Type":"ContainerStarted","Data":"e62b23734f4605f0a4a63279799bd424ddf3142a70edb89c159c912c5c2f76f1"} Mar 11 12:01:33 crc kubenswrapper[4816]: I0311 12:01:33.082742 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" podStartSLOduration=5.082722316 podStartE2EDuration="5.082722316s" podCreationTimestamp="2026-03-11 12:01:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:01:33.080180354 +0000 UTC m=+179.671444331" watchObservedRunningTime="2026-03-11 12:01:33.082722316 +0000 UTC m=+179.673986283" Mar 11 12:01:33 crc kubenswrapper[4816]: I0311 12:01:33.111206 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rlvrz" Mar 11 12:01:33 crc kubenswrapper[4816]: I0311 12:01:33.225996 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s2dh2"] Mar 11 12:01:33 crc kubenswrapper[4816]: I0311 12:01:33.226193 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-s2dh2" podUID="756dd25b-5375-48bc-8578-a9585ef49e6c" containerName="registry-server" containerID="cri-o://5466ae20f25fa3a1f58397452a631aeecdb16aff724bdf573d03a510178e71fc" gracePeriod=2 Mar 11 12:01:33 crc kubenswrapper[4816]: I0311 12:01:33.547362 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s2dh2" Mar 11 12:01:33 crc kubenswrapper[4816]: I0311 12:01:33.649356 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/756dd25b-5375-48bc-8578-a9585ef49e6c-utilities\") pod \"756dd25b-5375-48bc-8578-a9585ef49e6c\" (UID: \"756dd25b-5375-48bc-8578-a9585ef49e6c\") " Mar 11 12:01:33 crc kubenswrapper[4816]: I0311 12:01:33.649445 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/756dd25b-5375-48bc-8578-a9585ef49e6c-catalog-content\") pod \"756dd25b-5375-48bc-8578-a9585ef49e6c\" (UID: \"756dd25b-5375-48bc-8578-a9585ef49e6c\") " Mar 11 12:01:33 crc kubenswrapper[4816]: I0311 12:01:33.649508 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsxwz\" (UniqueName: \"kubernetes.io/projected/756dd25b-5375-48bc-8578-a9585ef49e6c-kube-api-access-vsxwz\") pod \"756dd25b-5375-48bc-8578-a9585ef49e6c\" (UID: \"756dd25b-5375-48bc-8578-a9585ef49e6c\") " Mar 11 12:01:33 crc kubenswrapper[4816]: I0311 12:01:33.650480 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/756dd25b-5375-48bc-8578-a9585ef49e6c-utilities" (OuterVolumeSpecName: "utilities") pod "756dd25b-5375-48bc-8578-a9585ef49e6c" (UID: "756dd25b-5375-48bc-8578-a9585ef49e6c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:01:33 crc kubenswrapper[4816]: I0311 12:01:33.666416 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/756dd25b-5375-48bc-8578-a9585ef49e6c-kube-api-access-vsxwz" (OuterVolumeSpecName: "kube-api-access-vsxwz") pod "756dd25b-5375-48bc-8578-a9585ef49e6c" (UID: "756dd25b-5375-48bc-8578-a9585ef49e6c"). InnerVolumeSpecName "kube-api-access-vsxwz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:01:33 crc kubenswrapper[4816]: I0311 12:01:33.716069 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/756dd25b-5375-48bc-8578-a9585ef49e6c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "756dd25b-5375-48bc-8578-a9585ef49e6c" (UID: "756dd25b-5375-48bc-8578-a9585ef49e6c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:01:33 crc kubenswrapper[4816]: I0311 12:01:33.760406 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/756dd25b-5375-48bc-8578-a9585ef49e6c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:33 crc kubenswrapper[4816]: I0311 12:01:33.760445 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsxwz\" (UniqueName: \"kubernetes.io/projected/756dd25b-5375-48bc-8578-a9585ef49e6c-kube-api-access-vsxwz\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:33 crc kubenswrapper[4816]: I0311 12:01:33.760456 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/756dd25b-5375-48bc-8578-a9585ef49e6c-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:34 crc kubenswrapper[4816]: I0311 12:01:34.070836 4816 generic.go:334] "Generic (PLEG): container finished" podID="756dd25b-5375-48bc-8578-a9585ef49e6c" containerID="5466ae20f25fa3a1f58397452a631aeecdb16aff724bdf573d03a510178e71fc" exitCode=0 Mar 11 12:01:34 crc kubenswrapper[4816]: I0311 12:01:34.072150 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s2dh2" Mar 11 12:01:34 crc kubenswrapper[4816]: I0311 12:01:34.074261 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2dh2" event={"ID":"756dd25b-5375-48bc-8578-a9585ef49e6c","Type":"ContainerDied","Data":"5466ae20f25fa3a1f58397452a631aeecdb16aff724bdf573d03a510178e71fc"} Mar 11 12:01:34 crc kubenswrapper[4816]: I0311 12:01:34.074329 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" Mar 11 12:01:34 crc kubenswrapper[4816]: I0311 12:01:34.074346 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2dh2" event={"ID":"756dd25b-5375-48bc-8578-a9585ef49e6c","Type":"ContainerDied","Data":"6c14169c95d372913c35e21942a8624231ab375807c0764494279b189e642cec"} Mar 11 12:01:34 crc kubenswrapper[4816]: I0311 12:01:34.074372 4816 scope.go:117] "RemoveContainer" containerID="5466ae20f25fa3a1f58397452a631aeecdb16aff724bdf573d03a510178e71fc" Mar 11 12:01:34 crc kubenswrapper[4816]: I0311 12:01:34.077660 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" Mar 11 12:01:34 crc kubenswrapper[4816]: I0311 12:01:34.100040 4816 scope.go:117] "RemoveContainer" containerID="3da638aaf6eed9072cff54e60fb89177e6f65f1797105a333a9b3c78228673ca" Mar 11 12:01:34 crc kubenswrapper[4816]: I0311 12:01:34.134727 4816 scope.go:117] "RemoveContainer" containerID="c124465b3e74b2d23eadfb691622e8fedf745d36a173b247bc81030f4e6053ac" Mar 11 12:01:34 crc kubenswrapper[4816]: I0311 12:01:34.140626 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s2dh2"] Mar 11 12:01:34 crc kubenswrapper[4816]: I0311 12:01:34.140787 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/certified-operators-s2dh2"] Mar 11 12:01:34 crc kubenswrapper[4816]: I0311 12:01:34.152605 4816 scope.go:117] "RemoveContainer" containerID="5466ae20f25fa3a1f58397452a631aeecdb16aff724bdf573d03a510178e71fc" Mar 11 12:01:34 crc kubenswrapper[4816]: E0311 12:01:34.153039 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5466ae20f25fa3a1f58397452a631aeecdb16aff724bdf573d03a510178e71fc\": container with ID starting with 5466ae20f25fa3a1f58397452a631aeecdb16aff724bdf573d03a510178e71fc not found: ID does not exist" containerID="5466ae20f25fa3a1f58397452a631aeecdb16aff724bdf573d03a510178e71fc" Mar 11 12:01:34 crc kubenswrapper[4816]: I0311 12:01:34.153198 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5466ae20f25fa3a1f58397452a631aeecdb16aff724bdf573d03a510178e71fc"} err="failed to get container status \"5466ae20f25fa3a1f58397452a631aeecdb16aff724bdf573d03a510178e71fc\": rpc error: code = NotFound desc = could not find container \"5466ae20f25fa3a1f58397452a631aeecdb16aff724bdf573d03a510178e71fc\": container with ID starting with 5466ae20f25fa3a1f58397452a631aeecdb16aff724bdf573d03a510178e71fc not found: ID does not exist" Mar 11 12:01:34 crc kubenswrapper[4816]: I0311 12:01:34.153380 4816 scope.go:117] "RemoveContainer" containerID="3da638aaf6eed9072cff54e60fb89177e6f65f1797105a333a9b3c78228673ca" Mar 11 12:01:34 crc kubenswrapper[4816]: E0311 12:01:34.153839 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3da638aaf6eed9072cff54e60fb89177e6f65f1797105a333a9b3c78228673ca\": container with ID starting with 3da638aaf6eed9072cff54e60fb89177e6f65f1797105a333a9b3c78228673ca not found: ID does not exist" containerID="3da638aaf6eed9072cff54e60fb89177e6f65f1797105a333a9b3c78228673ca" Mar 11 12:01:34 crc kubenswrapper[4816]: I0311 
12:01:34.153892 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3da638aaf6eed9072cff54e60fb89177e6f65f1797105a333a9b3c78228673ca"} err="failed to get container status \"3da638aaf6eed9072cff54e60fb89177e6f65f1797105a333a9b3c78228673ca\": rpc error: code = NotFound desc = could not find container \"3da638aaf6eed9072cff54e60fb89177e6f65f1797105a333a9b3c78228673ca\": container with ID starting with 3da638aaf6eed9072cff54e60fb89177e6f65f1797105a333a9b3c78228673ca not found: ID does not exist" Mar 11 12:01:34 crc kubenswrapper[4816]: I0311 12:01:34.153927 4816 scope.go:117] "RemoveContainer" containerID="c124465b3e74b2d23eadfb691622e8fedf745d36a173b247bc81030f4e6053ac" Mar 11 12:01:34 crc kubenswrapper[4816]: E0311 12:01:34.154222 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c124465b3e74b2d23eadfb691622e8fedf745d36a173b247bc81030f4e6053ac\": container with ID starting with c124465b3e74b2d23eadfb691622e8fedf745d36a173b247bc81030f4e6053ac not found: ID does not exist" containerID="c124465b3e74b2d23eadfb691622e8fedf745d36a173b247bc81030f4e6053ac" Mar 11 12:01:34 crc kubenswrapper[4816]: I0311 12:01:34.154253 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c124465b3e74b2d23eadfb691622e8fedf745d36a173b247bc81030f4e6053ac"} err="failed to get container status \"c124465b3e74b2d23eadfb691622e8fedf745d36a173b247bc81030f4e6053ac\": rpc error: code = NotFound desc = could not find container \"c124465b3e74b2d23eadfb691622e8fedf745d36a173b247bc81030f4e6053ac\": container with ID starting with c124465b3e74b2d23eadfb691622e8fedf745d36a173b247bc81030f4e6053ac not found: ID does not exist" Mar 11 12:01:36 crc kubenswrapper[4816]: I0311 12:01:36.137082 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="756dd25b-5375-48bc-8578-a9585ef49e6c" 
path="/var/lib/kubelet/pods/756dd25b-5375-48bc-8578-a9585ef49e6c/volumes" Mar 11 12:01:40 crc kubenswrapper[4816]: I0311 12:01:40.975531 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" podUID="f0f288b8-4b39-42ac-9835-4fb118a86218" containerName="oauth-openshift" containerID="cri-o://e6147c8cc58ce3bae6f999bb1c2d0007faaa3cf350373703380a98dc3aa752bc" gracePeriod=15 Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.112649 4816 generic.go:334] "Generic (PLEG): container finished" podID="f0f288b8-4b39-42ac-9835-4fb118a86218" containerID="e6147c8cc58ce3bae6f999bb1c2d0007faaa3cf350373703380a98dc3aa752bc" exitCode=0 Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.112696 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" event={"ID":"f0f288b8-4b39-42ac-9835-4fb118a86218","Type":"ContainerDied","Data":"e6147c8cc58ce3bae6f999bb1c2d0007faaa3cf350373703380a98dc3aa752bc"} Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.404270 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.459829 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f0f288b8-4b39-42ac-9835-4fb118a86218-audit-dir\") pod \"f0f288b8-4b39-42ac-9835-4fb118a86218\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.460197 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-user-template-provider-selection\") pod \"f0f288b8-4b39-42ac-9835-4fb118a86218\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.460236 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-user-template-login\") pod \"f0f288b8-4b39-42ac-9835-4fb118a86218\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.460309 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-trusted-ca-bundle\") pod \"f0f288b8-4b39-42ac-9835-4fb118a86218\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.460342 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-cliconfig\") pod \"f0f288b8-4b39-42ac-9835-4fb118a86218\" (UID: 
\"f0f288b8-4b39-42ac-9835-4fb118a86218\") " Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.460370 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-user-template-error\") pod \"f0f288b8-4b39-42ac-9835-4fb118a86218\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.460416 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-ocp-branding-template\") pod \"f0f288b8-4b39-42ac-9835-4fb118a86218\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.460449 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-session\") pod \"f0f288b8-4b39-42ac-9835-4fb118a86218\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.460481 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-serving-cert\") pod \"f0f288b8-4b39-42ac-9835-4fb118a86218\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.460507 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-user-idp-0-file-data\") pod \"f0f288b8-4b39-42ac-9835-4fb118a86218\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " Mar 11 12:01:41 crc 
kubenswrapper[4816]: I0311 12:01:41.460533 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-service-ca\") pod \"f0f288b8-4b39-42ac-9835-4fb118a86218\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.460575 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-router-certs\") pod \"f0f288b8-4b39-42ac-9835-4fb118a86218\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.460606 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csr9c\" (UniqueName: \"kubernetes.io/projected/f0f288b8-4b39-42ac-9835-4fb118a86218-kube-api-access-csr9c\") pod \"f0f288b8-4b39-42ac-9835-4fb118a86218\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.460644 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f0f288b8-4b39-42ac-9835-4fb118a86218-audit-policies\") pod \"f0f288b8-4b39-42ac-9835-4fb118a86218\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.461591 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f0f288b8-4b39-42ac-9835-4fb118a86218-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f0f288b8-4b39-42ac-9835-4fb118a86218" (UID: "f0f288b8-4b39-42ac-9835-4fb118a86218"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.462437 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "f0f288b8-4b39-42ac-9835-4fb118a86218" (UID: "f0f288b8-4b39-42ac-9835-4fb118a86218"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.462500 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "f0f288b8-4b39-42ac-9835-4fb118a86218" (UID: "f0f288b8-4b39-42ac-9835-4fb118a86218"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.465466 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "f0f288b8-4b39-42ac-9835-4fb118a86218" (UID: "f0f288b8-4b39-42ac-9835-4fb118a86218"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.465478 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0f288b8-4b39-42ac-9835-4fb118a86218-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "f0f288b8-4b39-42ac-9835-4fb118a86218" (UID: "f0f288b8-4b39-42ac-9835-4fb118a86218"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.467070 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "f0f288b8-4b39-42ac-9835-4fb118a86218" (UID: "f0f288b8-4b39-42ac-9835-4fb118a86218"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.467559 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "f0f288b8-4b39-42ac-9835-4fb118a86218" (UID: "f0f288b8-4b39-42ac-9835-4fb118a86218"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.467921 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "f0f288b8-4b39-42ac-9835-4fb118a86218" (UID: "f0f288b8-4b39-42ac-9835-4fb118a86218"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.468161 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "f0f288b8-4b39-42ac-9835-4fb118a86218" (UID: "f0f288b8-4b39-42ac-9835-4fb118a86218"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.468711 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "f0f288b8-4b39-42ac-9835-4fb118a86218" (UID: "f0f288b8-4b39-42ac-9835-4fb118a86218"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.468805 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "f0f288b8-4b39-42ac-9835-4fb118a86218" (UID: "f0f288b8-4b39-42ac-9835-4fb118a86218"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.469399 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "f0f288b8-4b39-42ac-9835-4fb118a86218" (UID: "f0f288b8-4b39-42ac-9835-4fb118a86218"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.470400 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0f288b8-4b39-42ac-9835-4fb118a86218-kube-api-access-csr9c" (OuterVolumeSpecName: "kube-api-access-csr9c") pod "f0f288b8-4b39-42ac-9835-4fb118a86218" (UID: "f0f288b8-4b39-42ac-9835-4fb118a86218"). InnerVolumeSpecName "kube-api-access-csr9c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.473215 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "f0f288b8-4b39-42ac-9835-4fb118a86218" (UID: "f0f288b8-4b39-42ac-9835-4fb118a86218"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.562605 4816 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f0f288b8-4b39-42ac-9835-4fb118a86218-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.562981 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.563076 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.563194 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.563356 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-cliconfig\") on node 
\"crc\" DevicePath \"\"" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.563455 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.563545 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.563635 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.563724 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.563811 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.563896 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.563974 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.564061 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csr9c\" (UniqueName: \"kubernetes.io/projected/f0f288b8-4b39-42ac-9835-4fb118a86218-kube-api-access-csr9c\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.564141 4816 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f0f288b8-4b39-42ac-9835-4fb118a86218-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:42 crc kubenswrapper[4816]: I0311 12:01:42.117959 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" event={"ID":"f0f288b8-4b39-42ac-9835-4fb118a86218","Type":"ContainerDied","Data":"b0c7d7e2fb960d418e680c2e934ecd5f41d42c08329bcf33576579240438a243"} Mar 11 12:01:42 crc kubenswrapper[4816]: I0311 12:01:42.118015 4816 scope.go:117] "RemoveContainer" containerID="e6147c8cc58ce3bae6f999bb1c2d0007faaa3cf350373703380a98dc3aa752bc" Mar 11 12:01:42 crc kubenswrapper[4816]: I0311 12:01:42.118013 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:01:42 crc kubenswrapper[4816]: I0311 12:01:42.149219 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bz2pp"] Mar 11 12:01:42 crc kubenswrapper[4816]: I0311 12:01:42.153078 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bz2pp"] Mar 11 12:01:44 crc kubenswrapper[4816]: I0311 12:01:44.136533 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0f288b8-4b39-42ac-9835-4fb118a86218" path="/var/lib/kubelet/pods/f0f288b8-4b39-42ac-9835-4fb118a86218/volumes" Mar 11 12:01:48 crc kubenswrapper[4816]: I0311 12:01:48.502807 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-67c5474778-rwg6j"] Mar 11 12:01:48 crc kubenswrapper[4816]: I0311 12:01:48.504059 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" podUID="67d6304f-5acd-48ff-9d06-b221c14f80fc" containerName="controller-manager" containerID="cri-o://64618ab3206e35f87bc9a73a3adb11b24a0d0a08e8a4aa35de128b5b11632d56" gracePeriod=30 Mar 11 12:01:48 crc kubenswrapper[4816]: I0311 12:01:48.599947 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw"] Mar 11 12:01:48 crc kubenswrapper[4816]: I0311 12:01:48.600453 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw" podUID="f862e1d6-c9a4-432c-b01f-610dac0371d6" containerName="route-controller-manager" containerID="cri-o://1113e474910e2bfea68438fabb0045c35bed39208a4635e0633ad3f3178ea6a9" gracePeriod=30 Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.024511 4816 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.048679 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f862e1d6-c9a4-432c-b01f-610dac0371d6-serving-cert\") pod \"f862e1d6-c9a4-432c-b01f-610dac0371d6\" (UID: \"f862e1d6-c9a4-432c-b01f-610dac0371d6\") " Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.048759 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f862e1d6-c9a4-432c-b01f-610dac0371d6-client-ca\") pod \"f862e1d6-c9a4-432c-b01f-610dac0371d6\" (UID: \"f862e1d6-c9a4-432c-b01f-610dac0371d6\") " Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.048786 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f862e1d6-c9a4-432c-b01f-610dac0371d6-config\") pod \"f862e1d6-c9a4-432c-b01f-610dac0371d6\" (UID: \"f862e1d6-c9a4-432c-b01f-610dac0371d6\") " Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.048818 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rxvx\" (UniqueName: \"kubernetes.io/projected/f862e1d6-c9a4-432c-b01f-610dac0371d6-kube-api-access-9rxvx\") pod \"f862e1d6-c9a4-432c-b01f-610dac0371d6\" (UID: \"f862e1d6-c9a4-432c-b01f-610dac0371d6\") " Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.050465 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f862e1d6-c9a4-432c-b01f-610dac0371d6-client-ca" (OuterVolumeSpecName: "client-ca") pod "f862e1d6-c9a4-432c-b01f-610dac0371d6" (UID: "f862e1d6-c9a4-432c-b01f-610dac0371d6"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.052433 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f862e1d6-c9a4-432c-b01f-610dac0371d6-config" (OuterVolumeSpecName: "config") pod "f862e1d6-c9a4-432c-b01f-610dac0371d6" (UID: "f862e1d6-c9a4-432c-b01f-610dac0371d6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.054281 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f862e1d6-c9a4-432c-b01f-610dac0371d6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f862e1d6-c9a4-432c-b01f-610dac0371d6" (UID: "f862e1d6-c9a4-432c-b01f-610dac0371d6"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.054567 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f862e1d6-c9a4-432c-b01f-610dac0371d6-kube-api-access-9rxvx" (OuterVolumeSpecName: "kube-api-access-9rxvx") pod "f862e1d6-c9a4-432c-b01f-610dac0371d6" (UID: "f862e1d6-c9a4-432c-b01f-610dac0371d6"). InnerVolumeSpecName "kube-api-access-9rxvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.087972 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.114315 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl"] Mar 11 12:01:49 crc kubenswrapper[4816]: E0311 12:01:49.114715 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0f288b8-4b39-42ac-9835-4fb118a86218" containerName="oauth-openshift" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.114780 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0f288b8-4b39-42ac-9835-4fb118a86218" containerName="oauth-openshift" Mar 11 12:01:49 crc kubenswrapper[4816]: E0311 12:01:49.114869 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f862e1d6-c9a4-432c-b01f-610dac0371d6" containerName="route-controller-manager" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.114932 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f862e1d6-c9a4-432c-b01f-610dac0371d6" containerName="route-controller-manager" Mar 11 12:01:49 crc kubenswrapper[4816]: E0311 12:01:49.115003 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="756dd25b-5375-48bc-8578-a9585ef49e6c" containerName="registry-server" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.115071 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="756dd25b-5375-48bc-8578-a9585ef49e6c" containerName="registry-server" Mar 11 12:01:49 crc kubenswrapper[4816]: E0311 12:01:49.115130 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67d6304f-5acd-48ff-9d06-b221c14f80fc" containerName="controller-manager" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.115213 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="67d6304f-5acd-48ff-9d06-b221c14f80fc" containerName="controller-manager" Mar 11 12:01:49 crc kubenswrapper[4816]: E0311 12:01:49.115291 4816 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="756dd25b-5375-48bc-8578-a9585ef49e6c" containerName="extract-content" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.116007 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="756dd25b-5375-48bc-8578-a9585ef49e6c" containerName="extract-content" Mar 11 12:01:49 crc kubenswrapper[4816]: E0311 12:01:49.116074 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="756dd25b-5375-48bc-8578-a9585ef49e6c" containerName="extract-utilities" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.116126 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="756dd25b-5375-48bc-8578-a9585ef49e6c" containerName="extract-utilities" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.116331 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="67d6304f-5acd-48ff-9d06-b221c14f80fc" containerName="controller-manager" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.116451 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0f288b8-4b39-42ac-9835-4fb118a86218" containerName="oauth-openshift" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.116524 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f862e1d6-c9a4-432c-b01f-610dac0371d6" containerName="route-controller-manager" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.116587 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="756dd25b-5375-48bc-8578-a9585ef49e6c" containerName="registry-server" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.117200 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.120838 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.121049 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.121163 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.121370 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.121490 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.121588 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.123589 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl"] Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.126808 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.127138 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.127440 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 11 12:01:49 crc 
kubenswrapper[4816]: I0311 12:01:49.127594 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.128533 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.129887 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.132099 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.134859 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.136932 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.150091 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67d6304f-5acd-48ff-9d06-b221c14f80fc-serving-cert\") pod \"67d6304f-5acd-48ff-9d06-b221c14f80fc\" (UID: \"67d6304f-5acd-48ff-9d06-b221c14f80fc\") " Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.150171 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxdtt\" (UniqueName: \"kubernetes.io/projected/67d6304f-5acd-48ff-9d06-b221c14f80fc-kube-api-access-cxdtt\") pod \"67d6304f-5acd-48ff-9d06-b221c14f80fc\" (UID: \"67d6304f-5acd-48ff-9d06-b221c14f80fc\") " Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.150233 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/67d6304f-5acd-48ff-9d06-b221c14f80fc-client-ca\") pod \"67d6304f-5acd-48ff-9d06-b221c14f80fc\" (UID: \"67d6304f-5acd-48ff-9d06-b221c14f80fc\") " Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.150306 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67d6304f-5acd-48ff-9d06-b221c14f80fc-config\") pod \"67d6304f-5acd-48ff-9d06-b221c14f80fc\" (UID: \"67d6304f-5acd-48ff-9d06-b221c14f80fc\") " Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.150370 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/67d6304f-5acd-48ff-9d06-b221c14f80fc-proxy-ca-bundles\") pod \"67d6304f-5acd-48ff-9d06-b221c14f80fc\" (UID: \"67d6304f-5acd-48ff-9d06-b221c14f80fc\") " Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.150489 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.150518 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-system-serving-cert\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.150536 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-user-template-login\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.150568 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-system-session\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.150596 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-user-template-error\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.150615 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2z59\" (UniqueName: \"kubernetes.io/projected/ea112c1f-2bbd-48bb-979e-980a6486f185-kube-api-access-c2z59\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.150659 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-system-router-certs\") pod 
\"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.150675 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ea112c1f-2bbd-48bb-979e-980a6486f185-audit-dir\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.150692 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.150715 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-system-service-ca\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.150739 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " 
pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.150759 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.150779 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-system-cliconfig\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.150797 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ea112c1f-2bbd-48bb-979e-980a6486f185-audit-policies\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.150832 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f862e1d6-c9a4-432c-b01f-610dac0371d6-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.150843 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f862e1d6-c9a4-432c-b01f-610dac0371d6-client-ca\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.150852 
4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f862e1d6-c9a4-432c-b01f-610dac0371d6-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.150913 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rxvx\" (UniqueName: \"kubernetes.io/projected/f862e1d6-c9a4-432c-b01f-610dac0371d6-kube-api-access-9rxvx\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.151546 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67d6304f-5acd-48ff-9d06-b221c14f80fc-client-ca" (OuterVolumeSpecName: "client-ca") pod "67d6304f-5acd-48ff-9d06-b221c14f80fc" (UID: "67d6304f-5acd-48ff-9d06-b221c14f80fc"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.151568 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67d6304f-5acd-48ff-9d06-b221c14f80fc-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "67d6304f-5acd-48ff-9d06-b221c14f80fc" (UID: "67d6304f-5acd-48ff-9d06-b221c14f80fc"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.151666 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67d6304f-5acd-48ff-9d06-b221c14f80fc-config" (OuterVolumeSpecName: "config") pod "67d6304f-5acd-48ff-9d06-b221c14f80fc" (UID: "67d6304f-5acd-48ff-9d06-b221c14f80fc"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.153318 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67d6304f-5acd-48ff-9d06-b221c14f80fc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "67d6304f-5acd-48ff-9d06-b221c14f80fc" (UID: "67d6304f-5acd-48ff-9d06-b221c14f80fc"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.153366 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67d6304f-5acd-48ff-9d06-b221c14f80fc-kube-api-access-cxdtt" (OuterVolumeSpecName: "kube-api-access-cxdtt") pod "67d6304f-5acd-48ff-9d06-b221c14f80fc" (UID: "67d6304f-5acd-48ff-9d06-b221c14f80fc"). InnerVolumeSpecName "kube-api-access-cxdtt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.160127 4816 generic.go:334] "Generic (PLEG): container finished" podID="67d6304f-5acd-48ff-9d06-b221c14f80fc" containerID="64618ab3206e35f87bc9a73a3adb11b24a0d0a08e8a4aa35de128b5b11632d56" exitCode=0 Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.160188 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" event={"ID":"67d6304f-5acd-48ff-9d06-b221c14f80fc","Type":"ContainerDied","Data":"64618ab3206e35f87bc9a73a3adb11b24a0d0a08e8a4aa35de128b5b11632d56"} Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.160216 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" event={"ID":"67d6304f-5acd-48ff-9d06-b221c14f80fc","Type":"ContainerDied","Data":"e62b23734f4605f0a4a63279799bd424ddf3142a70edb89c159c912c5c2f76f1"} Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.160235 4816 scope.go:117] "RemoveContainer" 
containerID="64618ab3206e35f87bc9a73a3adb11b24a0d0a08e8a4aa35de128b5b11632d56" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.160361 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.164805 4816 generic.go:334] "Generic (PLEG): container finished" podID="f862e1d6-c9a4-432c-b01f-610dac0371d6" containerID="1113e474910e2bfea68438fabb0045c35bed39208a4635e0633ad3f3178ea6a9" exitCode=0 Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.164842 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw" event={"ID":"f862e1d6-c9a4-432c-b01f-610dac0371d6","Type":"ContainerDied","Data":"1113e474910e2bfea68438fabb0045c35bed39208a4635e0633ad3f3178ea6a9"} Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.164864 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw" event={"ID":"f862e1d6-c9a4-432c-b01f-610dac0371d6","Type":"ContainerDied","Data":"b5aa380dfae5e5fb091865b30d4c19f63f255e625fdd2414ee2e29d633336c4c"} Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.164907 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.187275 4816 scope.go:117] "RemoveContainer" containerID="64618ab3206e35f87bc9a73a3adb11b24a0d0a08e8a4aa35de128b5b11632d56" Mar 11 12:01:49 crc kubenswrapper[4816]: E0311 12:01:49.187724 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64618ab3206e35f87bc9a73a3adb11b24a0d0a08e8a4aa35de128b5b11632d56\": container with ID starting with 64618ab3206e35f87bc9a73a3adb11b24a0d0a08e8a4aa35de128b5b11632d56 not found: ID does not exist" containerID="64618ab3206e35f87bc9a73a3adb11b24a0d0a08e8a4aa35de128b5b11632d56" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.187771 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64618ab3206e35f87bc9a73a3adb11b24a0d0a08e8a4aa35de128b5b11632d56"} err="failed to get container status \"64618ab3206e35f87bc9a73a3adb11b24a0d0a08e8a4aa35de128b5b11632d56\": rpc error: code = NotFound desc = could not find container \"64618ab3206e35f87bc9a73a3adb11b24a0d0a08e8a4aa35de128b5b11632d56\": container with ID starting with 64618ab3206e35f87bc9a73a3adb11b24a0d0a08e8a4aa35de128b5b11632d56 not found: ID does not exist" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.187798 4816 scope.go:117] "RemoveContainer" containerID="1113e474910e2bfea68438fabb0045c35bed39208a4635e0633ad3f3178ea6a9" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.188922 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-67c5474778-rwg6j"] Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.197312 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-67c5474778-rwg6j"] Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.201151 4816 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw"] Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.203807 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw"] Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.206380 4816 scope.go:117] "RemoveContainer" containerID="1113e474910e2bfea68438fabb0045c35bed39208a4635e0633ad3f3178ea6a9" Mar 11 12:01:49 crc kubenswrapper[4816]: E0311 12:01:49.206855 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1113e474910e2bfea68438fabb0045c35bed39208a4635e0633ad3f3178ea6a9\": container with ID starting with 1113e474910e2bfea68438fabb0045c35bed39208a4635e0633ad3f3178ea6a9 not found: ID does not exist" containerID="1113e474910e2bfea68438fabb0045c35bed39208a4635e0633ad3f3178ea6a9" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.206888 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1113e474910e2bfea68438fabb0045c35bed39208a4635e0633ad3f3178ea6a9"} err="failed to get container status \"1113e474910e2bfea68438fabb0045c35bed39208a4635e0633ad3f3178ea6a9\": rpc error: code = NotFound desc = could not find container \"1113e474910e2bfea68438fabb0045c35bed39208a4635e0633ad3f3178ea6a9\": container with ID starting with 1113e474910e2bfea68438fabb0045c35bed39208a4635e0633ad3f3178ea6a9 not found: ID does not exist" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.251565 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-system-router-certs\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 
12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.251611 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ea112c1f-2bbd-48bb-979e-980a6486f185-audit-dir\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.251633 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.251662 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-system-service-ca\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.251693 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.251714 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.251733 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-system-cliconfig\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.251750 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ea112c1f-2bbd-48bb-979e-980a6486f185-audit-policies\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.251766 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.251783 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-system-serving-cert\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " 
pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.251800 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-user-template-login\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.251828 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-system-session\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.251858 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-user-template-error\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.251875 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2z59\" (UniqueName: \"kubernetes.io/projected/ea112c1f-2bbd-48bb-979e-980a6486f185-kube-api-access-c2z59\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.251908 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/67d6304f-5acd-48ff-9d06-b221c14f80fc-client-ca\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.251920 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67d6304f-5acd-48ff-9d06-b221c14f80fc-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.251929 4816 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/67d6304f-5acd-48ff-9d06-b221c14f80fc-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.251938 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67d6304f-5acd-48ff-9d06-b221c14f80fc-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.251948 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxdtt\" (UniqueName: \"kubernetes.io/projected/67d6304f-5acd-48ff-9d06-b221c14f80fc-kube-api-access-cxdtt\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.252931 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ea112c1f-2bbd-48bb-979e-980a6486f185-audit-dir\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.253731 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-system-service-ca\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " 
pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.253875 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-system-cliconfig\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.253976 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ea112c1f-2bbd-48bb-979e-980a6486f185-audit-policies\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.253986 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.257112 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-user-template-error\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.257235 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-system-session\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.257363 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.257369 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.257403 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-user-template-login\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.257617 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-system-router-certs\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " 
pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.257757 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.258104 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-system-serving-cert\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.267794 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2z59\" (UniqueName: \"kubernetes.io/projected/ea112c1f-2bbd-48bb-979e-980a6486f185-kube-api-access-c2z59\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.441054 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.812601 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl"] Mar 11 12:01:49 crc kubenswrapper[4816]: W0311 12:01:49.818019 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea112c1f_2bbd_48bb_979e_980a6486f185.slice/crio-b4e06439659818c228d6a9bff54daaf6207e7a57e6f0d975ac8144ca723669c6 WatchSource:0}: Error finding container b4e06439659818c228d6a9bff54daaf6207e7a57e6f0d975ac8144ca723669c6: Status 404 returned error can't find the container with id b4e06439659818c228d6a9bff54daaf6207e7a57e6f0d975ac8144ca723669c6 Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.126565 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-585b8644c9-vg9hh"] Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.128302 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-585b8644c9-vg9hh" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.130889 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.131380 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.133360 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.133652 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.134373 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.134607 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.145842 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.146764 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67d6304f-5acd-48ff-9d06-b221c14f80fc" path="/var/lib/kubelet/pods/67d6304f-5acd-48ff-9d06-b221c14f80fc/volumes" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.147578 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f862e1d6-c9a4-432c-b01f-610dac0371d6" path="/var/lib/kubelet/pods/f862e1d6-c9a4-432c-b01f-610dac0371d6/volumes" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.148873 4816 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-route-controller-manager/route-controller-manager-777f98b8fc-ds5f7"] Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.149655 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-585b8644c9-vg9hh"] Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.149683 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-777f98b8fc-ds5f7"] Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.149761 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-777f98b8fc-ds5f7" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.152301 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.152322 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.152819 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.153415 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.153516 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.153421 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.163431 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e386f67-a816-4b53-b39a-5db0f6dfbc2a-config\") pod \"controller-manager-585b8644c9-vg9hh\" (UID: \"5e386f67-a816-4b53-b39a-5db0f6dfbc2a\") " pod="openshift-controller-manager/controller-manager-585b8644c9-vg9hh" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.163468 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5e386f67-a816-4b53-b39a-5db0f6dfbc2a-client-ca\") pod \"controller-manager-585b8644c9-vg9hh\" (UID: \"5e386f67-a816-4b53-b39a-5db0f6dfbc2a\") " pod="openshift-controller-manager/controller-manager-585b8644c9-vg9hh" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.163525 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5e386f67-a816-4b53-b39a-5db0f6dfbc2a-proxy-ca-bundles\") pod \"controller-manager-585b8644c9-vg9hh\" (UID: \"5e386f67-a816-4b53-b39a-5db0f6dfbc2a\") " pod="openshift-controller-manager/controller-manager-585b8644c9-vg9hh" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.163553 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhgjq\" (UniqueName: \"kubernetes.io/projected/5e386f67-a816-4b53-b39a-5db0f6dfbc2a-kube-api-access-hhgjq\") pod \"controller-manager-585b8644c9-vg9hh\" (UID: \"5e386f67-a816-4b53-b39a-5db0f6dfbc2a\") " pod="openshift-controller-manager/controller-manager-585b8644c9-vg9hh" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.163572 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e386f67-a816-4b53-b39a-5db0f6dfbc2a-serving-cert\") pod \"controller-manager-585b8644c9-vg9hh\" (UID: \"5e386f67-a816-4b53-b39a-5db0f6dfbc2a\") " 
pod="openshift-controller-manager/controller-manager-585b8644c9-vg9hh" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.170903 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" event={"ID":"ea112c1f-2bbd-48bb-979e-980a6486f185","Type":"ContainerStarted","Data":"47232ae5d93011aab131314d06cce85fb43c74c36f99bff53e4b62955cbb1144"} Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.170943 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" event={"ID":"ea112c1f-2bbd-48bb-979e-980a6486f185","Type":"ContainerStarted","Data":"b4e06439659818c228d6a9bff54daaf6207e7a57e6f0d975ac8144ca723669c6"} Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.171080 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.188802 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" podStartSLOduration=35.188780354 podStartE2EDuration="35.188780354s" podCreationTimestamp="2026-03-11 12:01:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:01:50.187652962 +0000 UTC m=+196.778916929" watchObservedRunningTime="2026-03-11 12:01:50.188780354 +0000 UTC m=+196.780044331" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.264615 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5be5607d-c6a3-4ccd-9e3f-99c57bc38d7b-serving-cert\") pod \"route-controller-manager-777f98b8fc-ds5f7\" (UID: \"5be5607d-c6a3-4ccd-9e3f-99c57bc38d7b\") " pod="openshift-route-controller-manager/route-controller-manager-777f98b8fc-ds5f7" Mar 11 12:01:50 crc 
kubenswrapper[4816]: I0311 12:01:50.264769 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8rg2\" (UniqueName: \"kubernetes.io/projected/5be5607d-c6a3-4ccd-9e3f-99c57bc38d7b-kube-api-access-s8rg2\") pod \"route-controller-manager-777f98b8fc-ds5f7\" (UID: \"5be5607d-c6a3-4ccd-9e3f-99c57bc38d7b\") " pod="openshift-route-controller-manager/route-controller-manager-777f98b8fc-ds5f7" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.264805 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e386f67-a816-4b53-b39a-5db0f6dfbc2a-config\") pod \"controller-manager-585b8644c9-vg9hh\" (UID: \"5e386f67-a816-4b53-b39a-5db0f6dfbc2a\") " pod="openshift-controller-manager/controller-manager-585b8644c9-vg9hh" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.264848 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5e386f67-a816-4b53-b39a-5db0f6dfbc2a-client-ca\") pod \"controller-manager-585b8644c9-vg9hh\" (UID: \"5e386f67-a816-4b53-b39a-5db0f6dfbc2a\") " pod="openshift-controller-manager/controller-manager-585b8644c9-vg9hh" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.264928 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5e386f67-a816-4b53-b39a-5db0f6dfbc2a-proxy-ca-bundles\") pod \"controller-manager-585b8644c9-vg9hh\" (UID: \"5e386f67-a816-4b53-b39a-5db0f6dfbc2a\") " pod="openshift-controller-manager/controller-manager-585b8644c9-vg9hh" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.264988 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhgjq\" (UniqueName: \"kubernetes.io/projected/5e386f67-a816-4b53-b39a-5db0f6dfbc2a-kube-api-access-hhgjq\") pod 
\"controller-manager-585b8644c9-vg9hh\" (UID: \"5e386f67-a816-4b53-b39a-5db0f6dfbc2a\") " pod="openshift-controller-manager/controller-manager-585b8644c9-vg9hh" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.265012 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5be5607d-c6a3-4ccd-9e3f-99c57bc38d7b-config\") pod \"route-controller-manager-777f98b8fc-ds5f7\" (UID: \"5be5607d-c6a3-4ccd-9e3f-99c57bc38d7b\") " pod="openshift-route-controller-manager/route-controller-manager-777f98b8fc-ds5f7" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.265032 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e386f67-a816-4b53-b39a-5db0f6dfbc2a-serving-cert\") pod \"controller-manager-585b8644c9-vg9hh\" (UID: \"5e386f67-a816-4b53-b39a-5db0f6dfbc2a\") " pod="openshift-controller-manager/controller-manager-585b8644c9-vg9hh" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.265053 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5be5607d-c6a3-4ccd-9e3f-99c57bc38d7b-client-ca\") pod \"route-controller-manager-777f98b8fc-ds5f7\" (UID: \"5be5607d-c6a3-4ccd-9e3f-99c57bc38d7b\") " pod="openshift-route-controller-manager/route-controller-manager-777f98b8fc-ds5f7" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.266413 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5e386f67-a816-4b53-b39a-5db0f6dfbc2a-proxy-ca-bundles\") pod \"controller-manager-585b8644c9-vg9hh\" (UID: \"5e386f67-a816-4b53-b39a-5db0f6dfbc2a\") " pod="openshift-controller-manager/controller-manager-585b8644c9-vg9hh" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.266966 4816 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5e386f67-a816-4b53-b39a-5db0f6dfbc2a-client-ca\") pod \"controller-manager-585b8644c9-vg9hh\" (UID: \"5e386f67-a816-4b53-b39a-5db0f6dfbc2a\") " pod="openshift-controller-manager/controller-manager-585b8644c9-vg9hh" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.267168 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e386f67-a816-4b53-b39a-5db0f6dfbc2a-config\") pod \"controller-manager-585b8644c9-vg9hh\" (UID: \"5e386f67-a816-4b53-b39a-5db0f6dfbc2a\") " pod="openshift-controller-manager/controller-manager-585b8644c9-vg9hh" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.276094 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e386f67-a816-4b53-b39a-5db0f6dfbc2a-serving-cert\") pod \"controller-manager-585b8644c9-vg9hh\" (UID: \"5e386f67-a816-4b53-b39a-5db0f6dfbc2a\") " pod="openshift-controller-manager/controller-manager-585b8644c9-vg9hh" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.282720 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhgjq\" (UniqueName: \"kubernetes.io/projected/5e386f67-a816-4b53-b39a-5db0f6dfbc2a-kube-api-access-hhgjq\") pod \"controller-manager-585b8644c9-vg9hh\" (UID: \"5e386f67-a816-4b53-b39a-5db0f6dfbc2a\") " pod="openshift-controller-manager/controller-manager-585b8644c9-vg9hh" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.366614 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5be5607d-c6a3-4ccd-9e3f-99c57bc38d7b-config\") pod \"route-controller-manager-777f98b8fc-ds5f7\" (UID: \"5be5607d-c6a3-4ccd-9e3f-99c57bc38d7b\") " pod="openshift-route-controller-manager/route-controller-manager-777f98b8fc-ds5f7" Mar 11 12:01:50 crc 
kubenswrapper[4816]: I0311 12:01:50.366667 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5be5607d-c6a3-4ccd-9e3f-99c57bc38d7b-client-ca\") pod \"route-controller-manager-777f98b8fc-ds5f7\" (UID: \"5be5607d-c6a3-4ccd-9e3f-99c57bc38d7b\") " pod="openshift-route-controller-manager/route-controller-manager-777f98b8fc-ds5f7" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.366727 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5be5607d-c6a3-4ccd-9e3f-99c57bc38d7b-serving-cert\") pod \"route-controller-manager-777f98b8fc-ds5f7\" (UID: \"5be5607d-c6a3-4ccd-9e3f-99c57bc38d7b\") " pod="openshift-route-controller-manager/route-controller-manager-777f98b8fc-ds5f7" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.366767 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8rg2\" (UniqueName: \"kubernetes.io/projected/5be5607d-c6a3-4ccd-9e3f-99c57bc38d7b-kube-api-access-s8rg2\") pod \"route-controller-manager-777f98b8fc-ds5f7\" (UID: \"5be5607d-c6a3-4ccd-9e3f-99c57bc38d7b\") " pod="openshift-route-controller-manager/route-controller-manager-777f98b8fc-ds5f7" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.367984 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5be5607d-c6a3-4ccd-9e3f-99c57bc38d7b-client-ca\") pod \"route-controller-manager-777f98b8fc-ds5f7\" (UID: \"5be5607d-c6a3-4ccd-9e3f-99c57bc38d7b\") " pod="openshift-route-controller-manager/route-controller-manager-777f98b8fc-ds5f7" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.368066 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5be5607d-c6a3-4ccd-9e3f-99c57bc38d7b-config\") pod 
\"route-controller-manager-777f98b8fc-ds5f7\" (UID: \"5be5607d-c6a3-4ccd-9e3f-99c57bc38d7b\") " pod="openshift-route-controller-manager/route-controller-manager-777f98b8fc-ds5f7" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.369653 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5be5607d-c6a3-4ccd-9e3f-99c57bc38d7b-serving-cert\") pod \"route-controller-manager-777f98b8fc-ds5f7\" (UID: \"5be5607d-c6a3-4ccd-9e3f-99c57bc38d7b\") " pod="openshift-route-controller-manager/route-controller-manager-777f98b8fc-ds5f7" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.381674 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8rg2\" (UniqueName: \"kubernetes.io/projected/5be5607d-c6a3-4ccd-9e3f-99c57bc38d7b-kube-api-access-s8rg2\") pod \"route-controller-manager-777f98b8fc-ds5f7\" (UID: \"5be5607d-c6a3-4ccd-9e3f-99c57bc38d7b\") " pod="openshift-route-controller-manager/route-controller-manager-777f98b8fc-ds5f7" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.441918 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-585b8644c9-vg9hh" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.467867 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-777f98b8fc-ds5f7" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.560018 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.833856 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-585b8644c9-vg9hh"] Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.934673 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-777f98b8fc-ds5f7"] Mar 11 12:01:50 crc kubenswrapper[4816]: W0311 12:01:50.940789 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5be5607d_c6a3_4ccd_9e3f_99c57bc38d7b.slice/crio-17ec4d81d1c2fe5b14d8c9342e15ad31dceb380c8d8f58ebd7ba64bc4c6d70d2 WatchSource:0}: Error finding container 17ec4d81d1c2fe5b14d8c9342e15ad31dceb380c8d8f58ebd7ba64bc4c6d70d2: Status 404 returned error can't find the container with id 17ec4d81d1c2fe5b14d8c9342e15ad31dceb380c8d8f58ebd7ba64bc4c6d70d2 Mar 11 12:01:51 crc kubenswrapper[4816]: I0311 12:01:51.180357 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-777f98b8fc-ds5f7" event={"ID":"5be5607d-c6a3-4ccd-9e3f-99c57bc38d7b","Type":"ContainerStarted","Data":"7ded85d5e7523ff25bd756f7359889fd4e2bbcc50c9aa1df594c8d47c53c49fa"} Mar 11 12:01:51 crc kubenswrapper[4816]: I0311 12:01:51.180651 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-777f98b8fc-ds5f7" Mar 11 12:01:51 crc kubenswrapper[4816]: I0311 12:01:51.180671 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-777f98b8fc-ds5f7" event={"ID":"5be5607d-c6a3-4ccd-9e3f-99c57bc38d7b","Type":"ContainerStarted","Data":"17ec4d81d1c2fe5b14d8c9342e15ad31dceb380c8d8f58ebd7ba64bc4c6d70d2"} Mar 11 12:01:51 crc kubenswrapper[4816]: I0311 12:01:51.181699 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-585b8644c9-vg9hh" event={"ID":"5e386f67-a816-4b53-b39a-5db0f6dfbc2a","Type":"ContainerStarted","Data":"05be18498e3eb84ca9646678a7178cc7af3d42649e2ab55a6755f06ad29010c3"} Mar 11 12:01:51 crc kubenswrapper[4816]: I0311 12:01:51.181736 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-585b8644c9-vg9hh" event={"ID":"5e386f67-a816-4b53-b39a-5db0f6dfbc2a","Type":"ContainerStarted","Data":"963822b7511d0122298da214e4a8ce92cb7fa4e421d6c14b5e157cabb6d5d894"} Mar 11 12:01:51 crc kubenswrapper[4816]: I0311 12:01:51.181999 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-585b8644c9-vg9hh" Mar 11 12:01:51 crc kubenswrapper[4816]: I0311 12:01:51.190009 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-585b8644c9-vg9hh" Mar 11 12:01:51 crc kubenswrapper[4816]: I0311 12:01:51.197903 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-777f98b8fc-ds5f7" podStartSLOduration=3.197888427 podStartE2EDuration="3.197888427s" podCreationTimestamp="2026-03-11 12:01:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:01:51.194403308 +0000 UTC m=+197.785667265" watchObservedRunningTime="2026-03-11 12:01:51.197888427 +0000 UTC m=+197.789152394" Mar 11 12:01:51 crc kubenswrapper[4816]: I0311 12:01:51.220120 
4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-585b8644c9-vg9hh" podStartSLOduration=3.220103933 podStartE2EDuration="3.220103933s" podCreationTimestamp="2026-03-11 12:01:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:01:51.219448244 +0000 UTC m=+197.810712201" watchObservedRunningTime="2026-03-11 12:01:51.220103933 +0000 UTC m=+197.811367900" Mar 11 12:01:51 crc kubenswrapper[4816]: I0311 12:01:51.443394 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-777f98b8fc-ds5f7" Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.936223 4816 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.937222 4816 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.937355 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.937510 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://c2fca9f57b03035a1290e3686e7b98d15f9151ad5f5b811112ad882b47cb9e46" gracePeriod=15 Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.937536 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://f04cdf2254cd3d070567bec1a9b10d6ffff3f5da5056b637b7d006f4ded72e56" gracePeriod=15 Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.937590 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://964c09610d05fa085a4adc7f7d902f67376a9168848e403cd849cfc2290dc26d" gracePeriod=15 Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.937590 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://c5e6ee0da068d98d88f55efae8cb0cb12fe57c85e11f5638daaa5e0f8a1f8594" gracePeriod=15 Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.937604 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://6180f737a5d60df3a71764fb2eaca26d3b25306cd8653d66d0b7fab4ec7debe3" gracePeriod=15 Mar 11 12:01:52 crc 
kubenswrapper[4816]: I0311 12:01:52.938384 4816 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 11 12:01:52 crc kubenswrapper[4816]: E0311 12:01:52.938601 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.938617 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 12:01:52 crc kubenswrapper[4816]: E0311 12:01:52.938625 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.938633 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 12:01:52 crc kubenswrapper[4816]: E0311 12:01:52.938641 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.938648 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 12:01:52 crc kubenswrapper[4816]: E0311 12:01:52.938658 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.938663 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 11 12:01:52 crc kubenswrapper[4816]: E0311 12:01:52.938670 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="setup" Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.938677 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 11 12:01:52 crc kubenswrapper[4816]: E0311 12:01:52.938692 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.938698 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 11 12:01:52 crc kubenswrapper[4816]: E0311 12:01:52.938712 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.938717 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 11 12:01:52 crc kubenswrapper[4816]: E0311 12:01:52.938727 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.938733 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.938835 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.938848 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.938859 4816 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.938869 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.938876 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.938884 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.938891 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 11 12:01:52 crc kubenswrapper[4816]: E0311 12:01:52.938979 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.938986 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 12:01:52 crc kubenswrapper[4816]: E0311 12:01:52.938993 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.938999 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.939124 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 
11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.939135 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 12:01:52 crc kubenswrapper[4816]: E0311 12:01:52.980113 4816 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.94:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.005673 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.005720 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.005752 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.005777 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" 
(UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.005798 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.005816 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.005839 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.005859 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.107526 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.107607 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.107641 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.107653 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.107696 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.107714 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.107790 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.107849 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.107868 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.107895 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.107914 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: 
\"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.107969 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.107986 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.108016 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.108038 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.108056 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 
12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.192727 4816 generic.go:334] "Generic (PLEG): container finished" podID="106a80c4-7132-43b4-930f-bd886787437f" containerID="13cc1621a3a1352dc36083505ef9245a833ca0fab13f1b74079c751c4ed90659" exitCode=0 Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.192789 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"106a80c4-7132-43b4-930f-bd886787437f","Type":"ContainerDied","Data":"13cc1621a3a1352dc36083505ef9245a833ca0fab13f1b74079c751c4ed90659"} Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.193214 4816 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.94:6443: connect: connection refused" Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.193457 4816 status_manager.go:851] "Failed to get status for pod" podUID="106a80c4-7132-43b4-930f-bd886787437f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.94:6443: connect: connection refused" Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.196535 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.198152 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.198791 4816 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="964c09610d05fa085a4adc7f7d902f67376a9168848e403cd849cfc2290dc26d" exitCode=0 Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.198809 4816 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f04cdf2254cd3d070567bec1a9b10d6ffff3f5da5056b637b7d006f4ded72e56" exitCode=0 Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.198820 4816 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c5e6ee0da068d98d88f55efae8cb0cb12fe57c85e11f5638daaa5e0f8a1f8594" exitCode=0 Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.198827 4816 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6180f737a5d60df3a71764fb2eaca26d3b25306cd8653d66d0b7fab4ec7debe3" exitCode=2 Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.199399 4816 scope.go:117] "RemoveContainer" containerID="eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd" Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.280908 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 12:01:53 crc kubenswrapper[4816]: W0311 12:01:53.312752 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-90fc4142aaecf9c3b788b3691ff3cbfa63e2833d66c3cb5a40fe3a191416e240 WatchSource:0}: Error finding container 90fc4142aaecf9c3b788b3691ff3cbfa63e2833d66c3cb5a40fe3a191416e240: Status 404 returned error can't find the container with id 90fc4142aaecf9c3b788b3691ff3cbfa63e2833d66c3cb5a40fe3a191416e240 Mar 11 12:01:53 crc kubenswrapper[4816]: E0311 12:01:53.319007 4816 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.94:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189bc7c1c32a9eef openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 12:01:53.317388015 +0000 UTC m=+199.908651982,LastTimestamp:2026-03-11 12:01:53.317388015 +0000 UTC m=+199.908651982,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.588513 4816 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints 
namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.588594 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Mar 11 12:01:54 crc kubenswrapper[4816]: I0311 12:01:54.132418 4816 status_manager.go:851] "Failed to get status for pod" podUID="106a80c4-7132-43b4-930f-bd886787437f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.94:6443: connect: connection refused" Mar 11 12:01:54 crc kubenswrapper[4816]: I0311 12:01:54.132745 4816 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.94:6443: connect: connection refused" Mar 11 12:01:54 crc kubenswrapper[4816]: I0311 12:01:54.207756 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"010629f9f0bfab0e230041f23562998b4b2ec0dfca10bba643ba9582fca68b1a"} Mar 11 12:01:54 crc kubenswrapper[4816]: I0311 12:01:54.207843 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"90fc4142aaecf9c3b788b3691ff3cbfa63e2833d66c3cb5a40fe3a191416e240"} Mar 11 12:01:54 crc kubenswrapper[4816]: I0311 12:01:54.210107 4816 status_manager.go:851] "Failed to get status for pod" podUID="106a80c4-7132-43b4-930f-bd886787437f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.94:6443: connect: connection refused" Mar 11 12:01:54 crc kubenswrapper[4816]: E0311 12:01:54.210338 4816 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.94:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 12:01:54 crc kubenswrapper[4816]: I0311 12:01:54.212960 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 11 12:01:54 crc kubenswrapper[4816]: I0311 12:01:54.527754 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 11 12:01:54 crc kubenswrapper[4816]: I0311 12:01:54.528636 4816 status_manager.go:851] "Failed to get status for pod" podUID="106a80c4-7132-43b4-930f-bd886787437f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.94:6443: connect: connection refused" Mar 11 12:01:54 crc kubenswrapper[4816]: I0311 12:01:54.632235 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/106a80c4-7132-43b4-930f-bd886787437f-var-lock\") pod \"106a80c4-7132-43b4-930f-bd886787437f\" (UID: \"106a80c4-7132-43b4-930f-bd886787437f\") " Mar 11 12:01:54 crc kubenswrapper[4816]: I0311 12:01:54.632326 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/106a80c4-7132-43b4-930f-bd886787437f-kube-api-access\") pod \"106a80c4-7132-43b4-930f-bd886787437f\" (UID: \"106a80c4-7132-43b4-930f-bd886787437f\") " Mar 11 12:01:54 crc kubenswrapper[4816]: I0311 12:01:54.632352 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/106a80c4-7132-43b4-930f-bd886787437f-kubelet-dir\") pod \"106a80c4-7132-43b4-930f-bd886787437f\" (UID: \"106a80c4-7132-43b4-930f-bd886787437f\") " Mar 11 12:01:54 crc kubenswrapper[4816]: I0311 12:01:54.632351 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/106a80c4-7132-43b4-930f-bd886787437f-var-lock" (OuterVolumeSpecName: "var-lock") pod "106a80c4-7132-43b4-930f-bd886787437f" (UID: "106a80c4-7132-43b4-930f-bd886787437f"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:01:54 crc kubenswrapper[4816]: I0311 12:01:54.632479 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/106a80c4-7132-43b4-930f-bd886787437f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "106a80c4-7132-43b4-930f-bd886787437f" (UID: "106a80c4-7132-43b4-930f-bd886787437f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:01:54 crc kubenswrapper[4816]: I0311 12:01:54.632664 4816 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/106a80c4-7132-43b4-930f-bd886787437f-var-lock\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:54 crc kubenswrapper[4816]: I0311 12:01:54.632684 4816 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/106a80c4-7132-43b4-930f-bd886787437f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:54 crc kubenswrapper[4816]: I0311 12:01:54.636899 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/106a80c4-7132-43b4-930f-bd886787437f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "106a80c4-7132-43b4-930f-bd886787437f" (UID: "106a80c4-7132-43b4-930f-bd886787437f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:01:54 crc kubenswrapper[4816]: I0311 12:01:54.734285 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/106a80c4-7132-43b4-930f-bd886787437f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:55 crc kubenswrapper[4816]: I0311 12:01:55.222359 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 11 12:01:55 crc kubenswrapper[4816]: I0311 12:01:55.222345 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"106a80c4-7132-43b4-930f-bd886787437f","Type":"ContainerDied","Data":"b87f445ca27d573faee92ddd515c624b2b710e714f620c36718ab43fc1a2134f"} Mar 11 12:01:55 crc kubenswrapper[4816]: I0311 12:01:55.222778 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b87f445ca27d573faee92ddd515c624b2b710e714f620c36718ab43fc1a2134f" Mar 11 12:01:55 crc kubenswrapper[4816]: I0311 12:01:55.225293 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 11 12:01:55 crc kubenswrapper[4816]: I0311 12:01:55.225961 4816 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c2fca9f57b03035a1290e3686e7b98d15f9151ad5f5b811112ad882b47cb9e46" exitCode=0 Mar 11 12:01:55 crc kubenswrapper[4816]: I0311 12:01:55.237652 4816 status_manager.go:851] "Failed to get status for pod" podUID="106a80c4-7132-43b4-930f-bd886787437f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.94:6443: connect: connection refused" Mar 11 12:01:55 crc kubenswrapper[4816]: I0311 12:01:55.709649 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 11 12:01:55 crc kubenswrapper[4816]: I0311 12:01:55.710582 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 12:01:55 crc kubenswrapper[4816]: I0311 12:01:55.711431 4816 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.94:6443: connect: connection refused" Mar 11 12:01:55 crc kubenswrapper[4816]: I0311 12:01:55.711929 4816 status_manager.go:851] "Failed to get status for pod" podUID="106a80c4-7132-43b4-930f-bd886787437f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.94:6443: connect: connection refused" Mar 11 12:01:55 crc kubenswrapper[4816]: I0311 12:01:55.748197 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 11 12:01:55 crc kubenswrapper[4816]: I0311 12:01:55.748343 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 11 12:01:55 crc kubenswrapper[4816]: I0311 12:01:55.748577 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 11 12:01:55 crc kubenswrapper[4816]: I0311 12:01:55.748666 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:01:55 crc kubenswrapper[4816]: I0311 12:01:55.748672 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:01:55 crc kubenswrapper[4816]: I0311 12:01:55.749021 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:01:55 crc kubenswrapper[4816]: I0311 12:01:55.749045 4816 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:55 crc kubenswrapper[4816]: I0311 12:01:55.749290 4816 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:55 crc kubenswrapper[4816]: E0311 12:01:55.798422 4816 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.94:6443: connect: connection refused" Mar 11 12:01:55 crc kubenswrapper[4816]: E0311 12:01:55.798777 4816 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.94:6443: connect: connection refused" Mar 11 12:01:55 crc kubenswrapper[4816]: E0311 12:01:55.799133 4816 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.94:6443: connect: connection refused" Mar 11 12:01:55 crc kubenswrapper[4816]: E0311 12:01:55.799497 4816 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.94:6443: connect: connection refused" Mar 11 12:01:55 crc kubenswrapper[4816]: E0311 12:01:55.799867 4816 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 
38.102.83.94:6443: connect: connection refused" Mar 11 12:01:55 crc kubenswrapper[4816]: I0311 12:01:55.800195 4816 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 11 12:01:55 crc kubenswrapper[4816]: E0311 12:01:55.801391 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.94:6443: connect: connection refused" interval="200ms" Mar 11 12:01:55 crc kubenswrapper[4816]: I0311 12:01:55.850601 4816 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:56 crc kubenswrapper[4816]: E0311 12:01:56.001732 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.94:6443: connect: connection refused" interval="400ms" Mar 11 12:01:56 crc kubenswrapper[4816]: I0311 12:01:56.138822 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 11 12:01:56 crc kubenswrapper[4816]: I0311 12:01:56.235800 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 11 12:01:56 crc kubenswrapper[4816]: I0311 12:01:56.236415 4816 scope.go:117] "RemoveContainer" containerID="964c09610d05fa085a4adc7f7d902f67376a9168848e403cd849cfc2290dc26d" Mar 11 12:01:56 crc kubenswrapper[4816]: I0311 12:01:56.236556 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 12:01:56 crc kubenswrapper[4816]: I0311 12:01:56.237311 4816 status_manager.go:851] "Failed to get status for pod" podUID="106a80c4-7132-43b4-930f-bd886787437f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.94:6443: connect: connection refused" Mar 11 12:01:56 crc kubenswrapper[4816]: I0311 12:01:56.237513 4816 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.94:6443: connect: connection refused" Mar 11 12:01:56 crc kubenswrapper[4816]: I0311 12:01:56.240697 4816 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.94:6443: connect: connection refused" Mar 11 12:01:56 crc kubenswrapper[4816]: I0311 12:01:56.241018 4816 status_manager.go:851] "Failed to get status for pod" podUID="106a80c4-7132-43b4-930f-bd886787437f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.94:6443: connect: connection refused" Mar 11 12:01:56 crc kubenswrapper[4816]: I0311 12:01:56.256573 4816 scope.go:117] "RemoveContainer" containerID="f04cdf2254cd3d070567bec1a9b10d6ffff3f5da5056b637b7d006f4ded72e56" Mar 11 12:01:56 crc kubenswrapper[4816]: I0311 12:01:56.268122 4816 scope.go:117] "RemoveContainer" containerID="c5e6ee0da068d98d88f55efae8cb0cb12fe57c85e11f5638daaa5e0f8a1f8594" Mar 11 12:01:56 crc 
kubenswrapper[4816]: I0311 12:01:56.284148 4816 scope.go:117] "RemoveContainer" containerID="6180f737a5d60df3a71764fb2eaca26d3b25306cd8653d66d0b7fab4ec7debe3" Mar 11 12:01:56 crc kubenswrapper[4816]: I0311 12:01:56.297178 4816 scope.go:117] "RemoveContainer" containerID="c2fca9f57b03035a1290e3686e7b98d15f9151ad5f5b811112ad882b47cb9e46" Mar 11 12:01:56 crc kubenswrapper[4816]: I0311 12:01:56.312320 4816 scope.go:117] "RemoveContainer" containerID="789a3fa60b21759f42c2997678010f994718ce5057a3a059491bc930652d3e38" Mar 11 12:01:56 crc kubenswrapper[4816]: E0311 12:01:56.403203 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.94:6443: connect: connection refused" interval="800ms" Mar 11 12:01:57 crc kubenswrapper[4816]: E0311 12:01:57.203951 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.94:6443: connect: connection refused" interval="1.6s" Mar 11 12:01:57 crc kubenswrapper[4816]: E0311 12:01:57.595273 4816 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.94:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189bc7c1c32a9eef openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 12:01:53.317388015 +0000 UTC m=+199.908651982,LastTimestamp:2026-03-11 12:01:53.317388015 +0000 UTC m=+199.908651982,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 12:01:58 crc kubenswrapper[4816]: E0311 12:01:58.805441 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.94:6443: connect: connection refused" interval="3.2s" Mar 11 12:02:02 crc kubenswrapper[4816]: E0311 12:02:02.006236 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.94:6443: connect: connection refused" interval="6.4s" Mar 11 12:02:04 crc kubenswrapper[4816]: I0311 12:02:04.130024 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 12:02:04 crc kubenswrapper[4816]: I0311 12:02:04.136909 4816 status_manager.go:851] "Failed to get status for pod" podUID="106a80c4-7132-43b4-930f-bd886787437f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.94:6443: connect: connection refused" Mar 11 12:02:04 crc kubenswrapper[4816]: I0311 12:02:04.138270 4816 status_manager.go:851] "Failed to get status for pod" podUID="106a80c4-7132-43b4-930f-bd886787437f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.94:6443: connect: connection refused" Mar 11 12:02:04 crc kubenswrapper[4816]: I0311 12:02:04.149920 4816 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4a8570a1-8304-4344-ac73-7346c594a222" Mar 11 12:02:04 crc kubenswrapper[4816]: I0311 12:02:04.149951 4816 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4a8570a1-8304-4344-ac73-7346c594a222" Mar 11 12:02:04 crc kubenswrapper[4816]: E0311 12:02:04.150357 4816 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.94:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 12:02:04 crc kubenswrapper[4816]: I0311 12:02:04.151092 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 12:02:04 crc kubenswrapper[4816]: W0311 12:02:04.172491 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-4f610d8ade4556e966386e3da3d1fe7e8b8c2ec47fc3a13b82fdf9f538c3eda0 WatchSource:0}: Error finding container 4f610d8ade4556e966386e3da3d1fe7e8b8c2ec47fc3a13b82fdf9f538c3eda0: Status 404 returned error can't find the container with id 4f610d8ade4556e966386e3da3d1fe7e8b8c2ec47fc3a13b82fdf9f538c3eda0 Mar 11 12:02:04 crc kubenswrapper[4816]: I0311 12:02:04.289140 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4f610d8ade4556e966386e3da3d1fe7e8b8c2ec47fc3a13b82fdf9f538c3eda0"} Mar 11 12:02:05 crc kubenswrapper[4816]: I0311 12:02:05.296763 4816 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="bf0572f97888022c815137b73179893aed925bed6d8fd477a66e6a0e36c3abd2" exitCode=0 Mar 11 12:02:05 crc kubenswrapper[4816]: I0311 12:02:05.296839 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"bf0572f97888022c815137b73179893aed925bed6d8fd477a66e6a0e36c3abd2"} Mar 11 12:02:05 crc kubenswrapper[4816]: I0311 12:02:05.297199 4816 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4a8570a1-8304-4344-ac73-7346c594a222" Mar 11 12:02:05 crc kubenswrapper[4816]: I0311 12:02:05.297232 4816 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4a8570a1-8304-4344-ac73-7346c594a222" Mar 11 12:02:05 crc kubenswrapper[4816]: E0311 12:02:05.297805 4816 mirror_client.go:138] 
"Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.94:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 12:02:05 crc kubenswrapper[4816]: I0311 12:02:05.297924 4816 status_manager.go:851] "Failed to get status for pod" podUID="106a80c4-7132-43b4-930f-bd886787437f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.94:6443: connect: connection refused" Mar 11 12:02:06 crc kubenswrapper[4816]: I0311 12:02:06.306370 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"457ece82db84f0b7f8d8eeffb7fa8a017dfd31e1afd92d2ceceab272dc7da47f"} Mar 11 12:02:06 crc kubenswrapper[4816]: I0311 12:02:06.306668 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2422d8f7d6c84293d4f1c6c383181086d1b57d6cc94e90b0c097eb1178ab65a4"} Mar 11 12:02:06 crc kubenswrapper[4816]: I0311 12:02:06.306679 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b64874309994a7a7ce05a7c55bd76eb410d35646b17031929be8e9fff6fc32cd"} Mar 11 12:02:06 crc kubenswrapper[4816]: I0311 12:02:06.306687 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8cb5937837be06bc5590391cc325c6ef33b8a084057540c4b60a6891879630c5"} Mar 11 12:02:07 crc kubenswrapper[4816]: I0311 12:02:07.319294 4816 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"27b8cff796af4e02c51a8468aa35452cb9e89bae3e503ba9531778f54c82e163"} Mar 11 12:02:07 crc kubenswrapper[4816]: I0311 12:02:07.319519 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 12:02:07 crc kubenswrapper[4816]: I0311 12:02:07.319731 4816 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4a8570a1-8304-4344-ac73-7346c594a222" Mar 11 12:02:07 crc kubenswrapper[4816]: I0311 12:02:07.319767 4816 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4a8570a1-8304-4344-ac73-7346c594a222" Mar 11 12:02:08 crc kubenswrapper[4816]: I0311 12:02:08.328869 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 11 12:02:08 crc kubenswrapper[4816]: I0311 12:02:08.330020 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 11 12:02:08 crc kubenswrapper[4816]: I0311 12:02:08.330096 4816 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="4cc365d25b754728795b200a155fa9bd64393ac8ec89f832fb06fc0f17e72cb5" exitCode=1 Mar 11 12:02:08 crc kubenswrapper[4816]: I0311 12:02:08.330146 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"4cc365d25b754728795b200a155fa9bd64393ac8ec89f832fb06fc0f17e72cb5"} Mar 11 12:02:08 crc kubenswrapper[4816]: I0311 12:02:08.330851 4816 scope.go:117] 
"RemoveContainer" containerID="4cc365d25b754728795b200a155fa9bd64393ac8ec89f832fb06fc0f17e72cb5" Mar 11 12:02:09 crc kubenswrapper[4816]: I0311 12:02:09.151304 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 12:02:09 crc kubenswrapper[4816]: I0311 12:02:09.151376 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 12:02:09 crc kubenswrapper[4816]: I0311 12:02:09.157319 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 12:02:09 crc kubenswrapper[4816]: I0311 12:02:09.341635 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 11 12:02:09 crc kubenswrapper[4816]: I0311 12:02:09.342375 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 11 12:02:09 crc kubenswrapper[4816]: I0311 12:02:09.342458 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b392c9212fce86344701d69479c22395c3f78b0384e4f9171f781a1c50cb91f4"} Mar 11 12:02:09 crc kubenswrapper[4816]: I0311 12:02:09.515562 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:02:09 crc kubenswrapper[4816]: I0311 12:02:09.516063 4816 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:02:12 crc kubenswrapper[4816]: I0311 12:02:12.330339 4816 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 12:02:12 crc kubenswrapper[4816]: I0311 12:02:12.359739 4816 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4a8570a1-8304-4344-ac73-7346c594a222" Mar 11 12:02:12 crc kubenswrapper[4816]: I0311 12:02:12.359773 4816 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4a8570a1-8304-4344-ac73-7346c594a222" Mar 11 12:02:12 crc kubenswrapper[4816]: I0311 12:02:12.363376 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 12:02:13 crc kubenswrapper[4816]: I0311 12:02:13.366130 4816 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4a8570a1-8304-4344-ac73-7346c594a222" Mar 11 12:02:13 crc kubenswrapper[4816]: I0311 12:02:13.366538 4816 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4a8570a1-8304-4344-ac73-7346c594a222" Mar 11 12:02:14 crc kubenswrapper[4816]: I0311 12:02:14.144066 4816 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="e5be8a02-8a6a-405e-b631-53e3c73dbc0f" Mar 11 12:02:14 crc kubenswrapper[4816]: I0311 12:02:14.462548 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 12:02:14 crc 
kubenswrapper[4816]: I0311 12:02:14.463184 4816 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Mar 11 12:02:14 crc kubenswrapper[4816]: I0311 12:02:14.463229 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Mar 11 12:02:17 crc kubenswrapper[4816]: I0311 12:02:17.142524 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 11 12:02:18 crc kubenswrapper[4816]: I0311 12:02:18.771615 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 11 12:02:19 crc kubenswrapper[4816]: I0311 12:02:19.479478 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 11 12:02:22 crc kubenswrapper[4816]: I0311 12:02:22.129580 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 11 12:02:22 crc kubenswrapper[4816]: I0311 12:02:22.729128 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Mar 11 12:02:22 crc kubenswrapper[4816]: I0311 12:02:22.916524 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 11 12:02:23 crc kubenswrapper[4816]: I0311 12:02:23.179000 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 11 12:02:23 crc kubenswrapper[4816]: I0311 12:02:23.273115 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 11 12:02:23 crc kubenswrapper[4816]: I0311 12:02:23.420669 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 11 12:02:23 crc kubenswrapper[4816]: I0311 12:02:23.428803 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Mar 11 12:02:23 crc kubenswrapper[4816]: I0311 12:02:23.722794 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 11 12:02:23 crc kubenswrapper[4816]: I0311 12:02:23.871498 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 11 12:02:23 crc kubenswrapper[4816]: I0311 12:02:23.940523 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 11 12:02:24 crc kubenswrapper[4816]: I0311 12:02:24.136627 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 11 12:02:24 crc kubenswrapper[4816]: I0311 12:02:24.198938 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 11 12:02:24 crc kubenswrapper[4816]: I0311 12:02:24.252720 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 11 12:02:24 crc kubenswrapper[4816]: I0311 12:02:24.395327 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 11 12:02:24 crc kubenswrapper[4816]: I0311 12:02:24.447710 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 11 12:02:24 crc kubenswrapper[4816]: I0311 12:02:24.465449 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 11 12:02:24 crc kubenswrapper[4816]: I0311 12:02:24.471141 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 11 12:02:24 crc kubenswrapper[4816]: I0311 12:02:24.488944 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 11 12:02:24 crc kubenswrapper[4816]: I0311 12:02:24.799083 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 11 12:02:24 crc kubenswrapper[4816]: I0311 12:02:24.817791 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 11 12:02:24 crc kubenswrapper[4816]: I0311 12:02:24.956140 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 11 12:02:25 crc kubenswrapper[4816]: I0311 12:02:25.033385 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 11 12:02:25 crc kubenswrapper[4816]: I0311 12:02:25.045005 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 11 12:02:25 crc kubenswrapper[4816]: I0311 12:02:25.101545 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 11 12:02:25 crc kubenswrapper[4816]: I0311 12:02:25.127315 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 11 12:02:25 crc kubenswrapper[4816]: I0311 12:02:25.284051 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 11 12:02:25 crc kubenswrapper[4816]: I0311 12:02:25.446685 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 11 12:02:25 crc kubenswrapper[4816]: I0311 12:02:25.477010 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 11 12:02:25 crc kubenswrapper[4816]: I0311 12:02:25.821717 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 11 12:02:25 crc kubenswrapper[4816]: I0311 12:02:25.856471 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 11 12:02:26 crc kubenswrapper[4816]: I0311 12:02:26.162718 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 11 12:02:26 crc kubenswrapper[4816]: I0311 12:02:26.262274 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 11 12:02:26 crc kubenswrapper[4816]: I0311 12:02:26.329695 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 11 12:02:26 crc kubenswrapper[4816]: I0311 12:02:26.385556 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Mar 11 12:02:26 crc kubenswrapper[4816]: I0311 12:02:26.630209 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 11 12:02:26 crc kubenswrapper[4816]: I0311 12:02:26.638825 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 11 12:02:26 crc kubenswrapper[4816]: I0311 12:02:26.695318 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 11 12:02:26 crc kubenswrapper[4816]: I0311 12:02:26.727845 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 11 12:02:26 crc kubenswrapper[4816]: I0311 12:02:26.739988 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 11 12:02:26 crc kubenswrapper[4816]: I0311 12:02:26.873962 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 11 12:02:26 crc kubenswrapper[4816]: I0311 12:02:26.905323 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 11 12:02:26 crc kubenswrapper[4816]: I0311 12:02:26.914170 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 11 12:02:26 crc kubenswrapper[4816]: I0311 12:02:26.985863 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 11 12:02:27 crc kubenswrapper[4816]: I0311 12:02:27.126085 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 11 12:02:27 crc kubenswrapper[4816]: I0311 12:02:27.138200 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 11 12:02:27 crc kubenswrapper[4816]: I0311 12:02:27.210365 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 11 12:02:27 crc kubenswrapper[4816]: I0311 12:02:27.280482 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 11 12:02:27 crc kubenswrapper[4816]: I0311 12:02:27.409211 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 11 12:02:27 crc kubenswrapper[4816]: I0311 12:02:27.573608 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 11 12:02:27 crc kubenswrapper[4816]: I0311 12:02:27.582319 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Mar 11 12:02:27 crc kubenswrapper[4816]: I0311 12:02:27.603221 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 11 12:02:27 crc kubenswrapper[4816]: I0311 12:02:27.912624 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 11 12:02:28 crc kubenswrapper[4816]: I0311 12:02:28.029495 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 11 12:02:28 crc kubenswrapper[4816]: I0311 12:02:28.139200 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 11 12:02:28 crc kubenswrapper[4816]: I0311 12:02:28.155518 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 11 12:02:28 crc kubenswrapper[4816]: I0311 12:02:28.164740 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 11 12:02:28 crc kubenswrapper[4816]: I0311 12:02:28.174830 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 11 12:02:28 crc kubenswrapper[4816]: I0311 12:02:28.250510 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Mar 11 12:02:28 crc kubenswrapper[4816]: I0311 12:02:28.277776 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 11 12:02:28 crc kubenswrapper[4816]: I0311 12:02:28.354703 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 11 12:02:28 crc kubenswrapper[4816]: I0311 12:02:28.506227 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 11 12:02:28 crc kubenswrapper[4816]: I0311 12:02:28.674024 4816 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 11 12:02:28 crc kubenswrapper[4816]: I0311 12:02:28.674933 4816 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Mar 11 12:02:28 crc kubenswrapper[4816]: I0311 12:02:28.708236 4816 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Mar 11 12:02:28 crc kubenswrapper[4816]: I0311 12:02:28.744548 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 11 12:02:28 crc kubenswrapper[4816]: I0311 12:02:28.776982 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 11 12:02:28 crc kubenswrapper[4816]: I0311 12:02:28.845446 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 11 12:02:28 crc kubenswrapper[4816]: I0311 12:02:28.880073 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 11 12:02:28 crc kubenswrapper[4816]: I0311 12:02:28.936542 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 11 12:02:29 crc kubenswrapper[4816]: I0311 12:02:29.013437 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 11 12:02:29 crc kubenswrapper[4816]: I0311 12:02:29.125066 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 11 12:02:29 crc kubenswrapper[4816]: I0311 12:02:29.244061 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 11 12:02:29 crc kubenswrapper[4816]: I0311 12:02:29.273997 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 11 12:02:29 crc kubenswrapper[4816]: I0311 12:02:29.321877 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 11 12:02:29 crc kubenswrapper[4816]: I0311 12:02:29.365420 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 11 12:02:29 crc kubenswrapper[4816]: I0311 12:02:29.373923 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 11 12:02:29 crc kubenswrapper[4816]: I0311 12:02:29.441751 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 11 12:02:29 crc kubenswrapper[4816]: I0311 12:02:29.491715 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 11 12:02:29 crc kubenswrapper[4816]: I0311 12:02:29.581947 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 11 12:02:29 crc kubenswrapper[4816]: I0311 12:02:29.582039 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 11 12:02:29 crc kubenswrapper[4816]: I0311 12:02:29.656124 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 11 12:02:29 crc kubenswrapper[4816]: I0311 12:02:29.668192 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 11 12:02:29 crc kubenswrapper[4816]: I0311 12:02:29.682909 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Mar 11 12:02:29 crc kubenswrapper[4816]: I0311 12:02:29.709894 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 11 12:02:29 crc kubenswrapper[4816]: I0311 12:02:29.762988 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 11 12:02:29 crc kubenswrapper[4816]: I0311 12:02:29.767739 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 11 12:02:29 crc kubenswrapper[4816]: I0311 12:02:29.867969 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 11 12:02:29 crc kubenswrapper[4816]: I0311 12:02:29.899194 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 11 12:02:29 crc kubenswrapper[4816]: I0311 12:02:29.952811 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 11 12:02:29 crc kubenswrapper[4816]: I0311 12:02:29.968076 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 11 12:02:29 crc kubenswrapper[4816]: I0311 12:02:29.976393 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 11 12:02:30 crc kubenswrapper[4816]: I0311 12:02:30.079144 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 11 12:02:30 crc kubenswrapper[4816]: I0311 12:02:30.094933 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 11 12:02:30 crc kubenswrapper[4816]: I0311 12:02:30.100744 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 11 12:02:30 crc kubenswrapper[4816]: I0311 12:02:30.300716 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 11 12:02:30 crc kubenswrapper[4816]: I0311 12:02:30.346822 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 11 12:02:30 crc kubenswrapper[4816]: I0311 12:02:30.383617 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 11 12:02:30 crc kubenswrapper[4816]: I0311 12:02:30.468164 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 11 12:02:30 crc kubenswrapper[4816]: I0311 12:02:30.593459 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 11 12:02:30 crc kubenswrapper[4816]: I0311 12:02:30.593488 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 11 12:02:30 crc kubenswrapper[4816]: I0311 12:02:30.703853 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 11 12:02:30 crc kubenswrapper[4816]: I0311 12:02:30.728714 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 11 12:02:30 crc kubenswrapper[4816]: I0311 12:02:30.766588 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 11 12:02:30 crc kubenswrapper[4816]: I0311 12:02:30.772204 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 11 12:02:30 crc kubenswrapper[4816]: I0311 12:02:30.858044 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 11 12:02:30 crc kubenswrapper[4816]: I0311 12:02:30.901120 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 11 12:02:30 crc kubenswrapper[4816]: I0311 12:02:30.955418 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 11 12:02:30 crc kubenswrapper[4816]: I0311 12:02:30.985383 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 11 12:02:30 crc kubenswrapper[4816]: I0311 12:02:30.999885 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 11 12:02:31 crc kubenswrapper[4816]: I0311 12:02:31.022217 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 11 12:02:31 crc kubenswrapper[4816]: I0311 12:02:31.058493 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 11 12:02:31 crc kubenswrapper[4816]: I0311 12:02:31.082651 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 11 12:02:31 crc kubenswrapper[4816]: I0311 12:02:31.089767 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Mar 11 12:02:31 crc kubenswrapper[4816]: I0311 12:02:31.128215 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 11 12:02:31 crc kubenswrapper[4816]: I0311 12:02:31.259585 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 11 12:02:31 crc kubenswrapper[4816]: I0311 12:02:31.283025 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 11 12:02:31 crc kubenswrapper[4816]: I0311 12:02:31.292281 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Mar 11 12:02:31 crc kubenswrapper[4816]: I0311 12:02:31.293239 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 11 12:02:31 crc kubenswrapper[4816]: I0311 12:02:31.335496 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 11 12:02:31 crc kubenswrapper[4816]: I0311 12:02:31.397268 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 11 12:02:31 crc kubenswrapper[4816]: I0311 12:02:31.415980 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 11 12:02:31 crc kubenswrapper[4816]: I0311 12:02:31.429183 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 11 12:02:31 crc kubenswrapper[4816]: I0311 12:02:31.480139 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 11 12:02:31 crc kubenswrapper[4816]: I0311 12:02:31.646674 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 11 12:02:31 crc kubenswrapper[4816]: I0311 12:02:31.674919 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 11 12:02:31 crc kubenswrapper[4816]: I0311 12:02:31.677052 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 11 12:02:31 crc kubenswrapper[4816]: I0311 12:02:31.699926 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Mar 11 12:02:31 crc kubenswrapper[4816]: I0311 12:02:31.771911 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 11 12:02:31 crc kubenswrapper[4816]: I0311 12:02:31.842857 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Mar 11 12:02:31 crc kubenswrapper[4816]: I0311 12:02:31.854822 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 11 12:02:31 crc kubenswrapper[4816]: I0311 12:02:31.935810 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 11 12:02:32 crc kubenswrapper[4816]: I0311 12:02:32.014972 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 11 12:02:32 crc kubenswrapper[4816]: I0311 12:02:32.017021 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 11 12:02:32 crc kubenswrapper[4816]: I0311 12:02:32.067906 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 11 12:02:32 crc kubenswrapper[4816]: I0311 12:02:32.187478 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 11 12:02:32 crc kubenswrapper[4816]: I0311 12:02:32.191941 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 11 12:02:32 crc kubenswrapper[4816]: I0311 12:02:32.194410 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 11 12:02:32 crc kubenswrapper[4816]: I0311 12:02:32.204732 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 11 12:02:32 crc kubenswrapper[4816]: I0311 12:02:32.262884 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 11 12:02:32 crc kubenswrapper[4816]: I0311 12:02:32.421592 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 11 12:02:32 crc kubenswrapper[4816]: I0311 12:02:32.471454 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 11 12:02:32 crc kubenswrapper[4816]: I0311 12:02:32.497978 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 11 12:02:32 crc kubenswrapper[4816]: I0311 12:02:32.522978 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 11 12:02:32 crc kubenswrapper[4816]: I0311 12:02:32.596392 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 11 12:02:32 crc kubenswrapper[4816]: I0311 12:02:32.601923 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Mar 11 12:02:32 crc kubenswrapper[4816]: I0311 12:02:32.612554 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 11 12:02:32 crc kubenswrapper[4816]: I0311 12:02:32.646788 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 11 12:02:32 crc kubenswrapper[4816]: I0311 12:02:32.648945 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 11 12:02:32 crc kubenswrapper[4816]: I0311 12:02:32.683673 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 11 12:02:32 crc kubenswrapper[4816]: I0311 12:02:32.738927 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 11 12:02:32 crc kubenswrapper[4816]: I0311 12:02:32.796843 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 11 12:02:32 crc kubenswrapper[4816]: I0311 12:02:32.851301 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 11 12:02:33 crc kubenswrapper[4816]: I0311 12:02:33.181429 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 11 12:02:33 crc kubenswrapper[4816]: I0311 12:02:33.215998 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 11 12:02:33 crc kubenswrapper[4816]: I0311 12:02:33.219529 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 11 12:02:33 crc kubenswrapper[4816]: I0311 12:02:33.254431 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 11 12:02:33 crc kubenswrapper[4816]: I0311 12:02:33.283503 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 11 12:02:33 crc kubenswrapper[4816]: I0311 12:02:33.402888 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 11 12:02:33 crc kubenswrapper[4816]: I0311 12:02:33.425324 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 11 12:02:33 crc kubenswrapper[4816]: I0311 12:02:33.447888 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 11 12:02:33 crc kubenswrapper[4816]: I0311 12:02:33.474049 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 11 12:02:33 crc kubenswrapper[4816]: I0311 12:02:33.560337 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 11 12:02:33 crc kubenswrapper[4816]: I0311 12:02:33.628127 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 11 12:02:33 crc kubenswrapper[4816]: I0311 12:02:33.647045 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 11 12:02:33 crc kubenswrapper[4816]: I0311 12:02:33.681961 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 11 12:02:33 crc kubenswrapper[4816]: I0311 12:02:33.716215 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 11 12:02:33 crc kubenswrapper[4816]: I0311 12:02:33.775700 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 11 12:02:33 crc kubenswrapper[4816]: I0311 12:02:33.853585 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 11 12:02:33 crc kubenswrapper[4816]: I0311 12:02:33.863049 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 11 12:02:33 crc kubenswrapper[4816]: I0311 12:02:33.885622 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 11 12:02:33 crc kubenswrapper[4816]: I0311 12:02:33.927985 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Mar 11 12:02:34 crc kubenswrapper[4816]: I0311 12:02:34.090646 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 11 12:02:34 crc kubenswrapper[4816]: I0311 12:02:34.094538 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 11 12:02:34 crc kubenswrapper[4816]: I0311 12:02:34.161727 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 11 12:02:34 crc kubenswrapper[4816]: I0311 12:02:34.167678 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 11 12:02:34 crc kubenswrapper[4816]: I0311 12:02:34.295959 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 11 12:02:34 crc kubenswrapper[4816]: I0311 12:02:34.333837 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 11 12:02:34 crc kubenswrapper[4816]: I0311 12:02:34.337614 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Mar 11 12:02:34 crc kubenswrapper[4816]: I0311 12:02:34.367989 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 11 12:02:34 crc kubenswrapper[4816]: I0311 12:02:34.431594 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 11 12:02:34 crc kubenswrapper[4816]: I0311 12:02:34.441176 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 11 12:02:34 crc kubenswrapper[4816]: I0311 12:02:34.441348 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 11 12:02:34 crc kubenswrapper[4816]: I0311 12:02:34.466506 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 11 12:02:34 crc kubenswrapper[4816]: I0311 12:02:34.469995 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 11 12:02:34 crc kubenswrapper[4816]: I0311 12:02:34.692823 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 11 12:02:34 crc kubenswrapper[4816]: I0311 12:02:34.727774 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 11 12:02:34 crc kubenswrapper[4816]: I0311 12:02:34.794264 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 11 12:02:34 crc kubenswrapper[4816]: I0311 12:02:34.828581 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 11 12:02:34 crc kubenswrapper[4816]: I0311 12:02:34.847934 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 11 12:02:34 crc kubenswrapper[4816]: I0311 12:02:34.877127 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 11 12:02:34 crc kubenswrapper[4816]: I0311 12:02:34.902997 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 11 12:02:34 crc kubenswrapper[4816]: I0311 12:02:34.926098 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 11 12:02:34 crc kubenswrapper[4816]: I0311 12:02:34.977512 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 11 12:02:34 crc kubenswrapper[4816]: I0311 12:02:34.978533 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 11 12:02:35 crc kubenswrapper[4816]: I0311 12:02:35.203747 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Mar 11 12:02:35 crc kubenswrapper[4816]: I0311 12:02:35.208798 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 11 12:02:35 crc kubenswrapper[4816]: I0311 12:02:35.225074 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 11 12:02:35 crc kubenswrapper[4816]: I0311 12:02:35.256645 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 11 12:02:35 crc kubenswrapper[4816]: I0311 12:02:35.323086 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 11 12:02:35 crc kubenswrapper[4816]: I0311 12:02:35.418168 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 11 12:02:35 crc kubenswrapper[4816]: I0311 12:02:35.706609 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 11 12:02:35 crc kubenswrapper[4816]: I0311 12:02:35.747501 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 11 12:02:35 crc kubenswrapper[4816]: I0311 12:02:35.777148 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 11 12:02:35 crc kubenswrapper[4816]: I0311 12:02:35.780111 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 11 12:02:35 crc kubenswrapper[4816]: I0311 12:02:35.850861 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 11 12:02:35 crc kubenswrapper[4816]: I0311 12:02:35.865525 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 11 12:02:35 crc kubenswrapper[4816]: I0311 12:02:35.920664 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 11 12:02:36 crc kubenswrapper[4816]: I0311 12:02:36.033798 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 11 12:02:36 crc kubenswrapper[4816]: I0311 12:02:36.038901 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 11 12:02:36 crc kubenswrapper[4816]: I0311 12:02:36.098147 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 11 12:02:36 crc kubenswrapper[4816]: I0311 12:02:36.228438 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 11 12:02:36 crc kubenswrapper[4816]: I0311 12:02:36.278696 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 11 12:02:36 crc kubenswrapper[4816]: I0311 12:02:36.332001 4816 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 11 12:02:36 crc kubenswrapper[4816]: I0311 12:02:36.344831 4816 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 11 12:02:36 crc kubenswrapper[4816]: I0311 12:02:36.526284 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 11 12:02:36 crc kubenswrapper[4816]: I0311 12:02:36.597079 4816 reflector.go:368] Caches populated
for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 11 12:02:36 crc kubenswrapper[4816]: I0311 12:02:36.619937 4816 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 11 12:02:36 crc kubenswrapper[4816]: I0311 12:02:36.624444 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 11 12:02:36 crc kubenswrapper[4816]: I0311 12:02:36.624501 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 11 12:02:36 crc kubenswrapper[4816]: I0311 12:02:36.628391 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 12:02:36 crc kubenswrapper[4816]: I0311 12:02:36.645129 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=24.645113139 podStartE2EDuration="24.645113139s" podCreationTimestamp="2026-03-11 12:02:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:02:36.644752129 +0000 UTC m=+243.236016096" watchObservedRunningTime="2026-03-11 12:02:36.645113139 +0000 UTC m=+243.236377106" Mar 11 12:02:36 crc kubenswrapper[4816]: I0311 12:02:36.856606 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 11 12:02:36 crc kubenswrapper[4816]: I0311 12:02:36.868305 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 11 12:02:36 crc kubenswrapper[4816]: I0311 12:02:36.923005 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 11 12:02:36 crc kubenswrapper[4816]: I0311 12:02:36.960324 4816 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 11 12:02:36 crc kubenswrapper[4816]: I0311 12:02:36.965838 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 11 12:02:37 crc kubenswrapper[4816]: I0311 12:02:37.051925 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 11 12:02:37 crc kubenswrapper[4816]: I0311 12:02:37.123427 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 11 12:02:37 crc kubenswrapper[4816]: I0311 12:02:37.137946 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 11 12:02:37 crc kubenswrapper[4816]: I0311 12:02:37.154501 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 11 12:02:37 crc kubenswrapper[4816]: I0311 12:02:37.230140 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 11 12:02:37 crc kubenswrapper[4816]: I0311 12:02:37.300287 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 11 12:02:37 crc kubenswrapper[4816]: I0311 12:02:37.308340 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 11 12:02:37 crc kubenswrapper[4816]: I0311 12:02:37.457590 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 11 12:02:37 crc kubenswrapper[4816]: I0311 12:02:37.512202 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 11 
12:02:37 crc kubenswrapper[4816]: I0311 12:02:37.696070 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 11 12:02:37 crc kubenswrapper[4816]: I0311 12:02:37.705665 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 11 12:02:37 crc kubenswrapper[4816]: I0311 12:02:37.745816 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 11 12:02:37 crc kubenswrapper[4816]: I0311 12:02:37.777043 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 11 12:02:37 crc kubenswrapper[4816]: I0311 12:02:37.791177 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 11 12:02:37 crc kubenswrapper[4816]: I0311 12:02:37.862976 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 11 12:02:37 crc kubenswrapper[4816]: I0311 12:02:37.870088 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 11 12:02:38 crc kubenswrapper[4816]: I0311 12:02:38.329147 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 11 12:02:38 crc kubenswrapper[4816]: I0311 12:02:38.352889 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 11 12:02:38 crc kubenswrapper[4816]: I0311 12:02:38.407767 4816 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 11 12:02:38 crc kubenswrapper[4816]: I0311 12:02:38.449843 4816 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 11 12:02:38 crc kubenswrapper[4816]: I0311 12:02:38.688454 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 11 12:02:39 crc kubenswrapper[4816]: I0311 12:02:39.172028 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 11 12:02:39 crc kubenswrapper[4816]: I0311 12:02:39.199184 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 11 12:02:39 crc kubenswrapper[4816]: I0311 12:02:39.222072 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 11 12:02:39 crc kubenswrapper[4816]: I0311 12:02:39.325760 4816 ???:1] "http: TLS handshake error from 192.168.126.11:59192: no serving certificate available for the kubelet" Mar 11 12:02:39 crc kubenswrapper[4816]: I0311 12:02:39.514919 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:02:39 crc kubenswrapper[4816]: I0311 12:02:39.514984 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:02:39 crc kubenswrapper[4816]: I0311 12:02:39.533820 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553842-6xkxh"] Mar 11 12:02:39 crc kubenswrapper[4816]: E0311 12:02:39.534050 4816 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="106a80c4-7132-43b4-930f-bd886787437f" containerName="installer" Mar 11 12:02:39 crc kubenswrapper[4816]: I0311 12:02:39.534065 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="106a80c4-7132-43b4-930f-bd886787437f" containerName="installer" Mar 11 12:02:39 crc kubenswrapper[4816]: I0311 12:02:39.534183 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="106a80c4-7132-43b4-930f-bd886787437f" containerName="installer" Mar 11 12:02:39 crc kubenswrapper[4816]: I0311 12:02:39.534552 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553842-6xkxh" Mar 11 12:02:39 crc kubenswrapper[4816]: I0311 12:02:39.536588 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 12:02:39 crc kubenswrapper[4816]: I0311 12:02:39.536626 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 12:02:39 crc kubenswrapper[4816]: I0311 12:02:39.537320 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 12:02:39 crc kubenswrapper[4816]: I0311 12:02:39.543130 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553842-6xkxh"] Mar 11 12:02:39 crc kubenswrapper[4816]: I0311 12:02:39.647447 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcdsm\" (UniqueName: \"kubernetes.io/projected/ba5c6602-69d6-46be-a23b-fb4d6290a974-kube-api-access-vcdsm\") pod \"auto-csr-approver-29553842-6xkxh\" (UID: \"ba5c6602-69d6-46be-a23b-fb4d6290a974\") " pod="openshift-infra/auto-csr-approver-29553842-6xkxh" Mar 11 12:02:39 crc kubenswrapper[4816]: I0311 12:02:39.748657 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-vcdsm\" (UniqueName: \"kubernetes.io/projected/ba5c6602-69d6-46be-a23b-fb4d6290a974-kube-api-access-vcdsm\") pod \"auto-csr-approver-29553842-6xkxh\" (UID: \"ba5c6602-69d6-46be-a23b-fb4d6290a974\") " pod="openshift-infra/auto-csr-approver-29553842-6xkxh" Mar 11 12:02:39 crc kubenswrapper[4816]: I0311 12:02:39.766481 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcdsm\" (UniqueName: \"kubernetes.io/projected/ba5c6602-69d6-46be-a23b-fb4d6290a974-kube-api-access-vcdsm\") pod \"auto-csr-approver-29553842-6xkxh\" (UID: \"ba5c6602-69d6-46be-a23b-fb4d6290a974\") " pod="openshift-infra/auto-csr-approver-29553842-6xkxh" Mar 11 12:02:39 crc kubenswrapper[4816]: I0311 12:02:39.840100 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 11 12:02:39 crc kubenswrapper[4816]: I0311 12:02:39.849507 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553842-6xkxh" Mar 11 12:02:40 crc kubenswrapper[4816]: I0311 12:02:40.220397 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553842-6xkxh"] Mar 11 12:02:40 crc kubenswrapper[4816]: I0311 12:02:40.245854 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 11 12:02:40 crc kubenswrapper[4816]: I0311 12:02:40.250380 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 11 12:02:40 crc kubenswrapper[4816]: I0311 12:02:40.378762 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 11 12:02:40 crc kubenswrapper[4816]: I0311 12:02:40.512277 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553842-6xkxh" 
event={"ID":"ba5c6602-69d6-46be-a23b-fb4d6290a974","Type":"ContainerStarted","Data":"e209630d2c6430ce88eea44392e08d1ea6502f314f9a6b9b81af4242ac59ed97"} Mar 11 12:02:41 crc kubenswrapper[4816]: I0311 12:02:41.589260 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 11 12:02:42 crc kubenswrapper[4816]: I0311 12:02:42.403633 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 11 12:02:42 crc kubenswrapper[4816]: I0311 12:02:42.636566 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 11 12:02:45 crc kubenswrapper[4816]: I0311 12:02:45.946812 4816 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 11 12:02:45 crc kubenswrapper[4816]: I0311 12:02:45.947144 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://010629f9f0bfab0e230041f23562998b4b2ec0dfca10bba643ba9582fca68b1a" gracePeriod=5 Mar 11 12:02:46 crc kubenswrapper[4816]: I0311 12:02:46.531494 4816 csr.go:261] certificate signing request csr-mqscd is approved, waiting to be issued Mar 11 12:02:46 crc kubenswrapper[4816]: I0311 12:02:46.554133 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553842-6xkxh" event={"ID":"ba5c6602-69d6-46be-a23b-fb4d6290a974","Type":"ContainerStarted","Data":"2cfac82b0530dfec9409f269e3ee40d7a556b84403bec8f94f82329b0208a810"} Mar 11 12:02:46 crc kubenswrapper[4816]: I0311 12:02:46.556147 4816 csr.go:257] certificate signing request csr-mqscd is issued Mar 11 12:02:46 crc kubenswrapper[4816]: I0311 12:02:46.570059 4816 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553842-6xkxh" podStartSLOduration=8.836376387 podStartE2EDuration="14.570039008s" podCreationTimestamp="2026-03-11 12:02:32 +0000 UTC" firstStartedPulling="2026-03-11 12:02:40.23119362 +0000 UTC m=+246.822457587" lastFinishedPulling="2026-03-11 12:02:45.964856241 +0000 UTC m=+252.556120208" observedRunningTime="2026-03-11 12:02:46.567514595 +0000 UTC m=+253.158778562" watchObservedRunningTime="2026-03-11 12:02:46.570039008 +0000 UTC m=+253.161302975" Mar 11 12:02:47 crc kubenswrapper[4816]: I0311 12:02:47.558058 4816 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-04 14:38:11.924945389 +0000 UTC Mar 11 12:02:47 crc kubenswrapper[4816]: I0311 12:02:47.558356 4816 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6434h35m24.366592592s for next certificate rotation Mar 11 12:02:47 crc kubenswrapper[4816]: I0311 12:02:47.560483 4816 generic.go:334] "Generic (PLEG): container finished" podID="ba5c6602-69d6-46be-a23b-fb4d6290a974" containerID="2cfac82b0530dfec9409f269e3ee40d7a556b84403bec8f94f82329b0208a810" exitCode=0 Mar 11 12:02:47 crc kubenswrapper[4816]: I0311 12:02:47.560520 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553842-6xkxh" event={"ID":"ba5c6602-69d6-46be-a23b-fb4d6290a974","Type":"ContainerDied","Data":"2cfac82b0530dfec9409f269e3ee40d7a556b84403bec8f94f82329b0208a810"} Mar 11 12:02:48 crc kubenswrapper[4816]: I0311 12:02:48.558961 4816 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-15 01:29:14.479290278 +0000 UTC Mar 11 12:02:48 crc kubenswrapper[4816]: I0311 12:02:48.559014 4816 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7429h26m25.920279579s for next certificate 
rotation Mar 11 12:02:48 crc kubenswrapper[4816]: I0311 12:02:48.804585 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553842-6xkxh" Mar 11 12:02:48 crc kubenswrapper[4816]: I0311 12:02:48.976116 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcdsm\" (UniqueName: \"kubernetes.io/projected/ba5c6602-69d6-46be-a23b-fb4d6290a974-kube-api-access-vcdsm\") pod \"ba5c6602-69d6-46be-a23b-fb4d6290a974\" (UID: \"ba5c6602-69d6-46be-a23b-fb4d6290a974\") " Mar 11 12:02:48 crc kubenswrapper[4816]: I0311 12:02:48.983187 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba5c6602-69d6-46be-a23b-fb4d6290a974-kube-api-access-vcdsm" (OuterVolumeSpecName: "kube-api-access-vcdsm") pod "ba5c6602-69d6-46be-a23b-fb4d6290a974" (UID: "ba5c6602-69d6-46be-a23b-fb4d6290a974"). InnerVolumeSpecName "kube-api-access-vcdsm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:02:49 crc kubenswrapper[4816]: I0311 12:02:49.077606 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcdsm\" (UniqueName: \"kubernetes.io/projected/ba5c6602-69d6-46be-a23b-fb4d6290a974-kube-api-access-vcdsm\") on node \"crc\" DevicePath \"\"" Mar 11 12:02:49 crc kubenswrapper[4816]: I0311 12:02:49.572989 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553842-6xkxh" event={"ID":"ba5c6602-69d6-46be-a23b-fb4d6290a974","Type":"ContainerDied","Data":"e209630d2c6430ce88eea44392e08d1ea6502f314f9a6b9b81af4242ac59ed97"} Mar 11 12:02:49 crc kubenswrapper[4816]: I0311 12:02:49.573041 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e209630d2c6430ce88eea44392e08d1ea6502f314f9a6b9b81af4242ac59ed97" Mar 11 12:02:49 crc kubenswrapper[4816]: I0311 12:02:49.573119 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553842-6xkxh" Mar 11 12:02:51 crc kubenswrapper[4816]: I0311 12:02:51.517094 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 11 12:02:51 crc kubenswrapper[4816]: I0311 12:02:51.517417 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 12:02:51 crc kubenswrapper[4816]: I0311 12:02:51.582271 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 11 12:02:51 crc kubenswrapper[4816]: I0311 12:02:51.582314 4816 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="010629f9f0bfab0e230041f23562998b4b2ec0dfca10bba643ba9582fca68b1a" exitCode=137 Mar 11 12:02:51 crc kubenswrapper[4816]: I0311 12:02:51.582357 4816 scope.go:117] "RemoveContainer" containerID="010629f9f0bfab0e230041f23562998b4b2ec0dfca10bba643ba9582fca68b1a" Mar 11 12:02:51 crc kubenswrapper[4816]: I0311 12:02:51.582394 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 12:02:51 crc kubenswrapper[4816]: I0311 12:02:51.596668 4816 scope.go:117] "RemoveContainer" containerID="010629f9f0bfab0e230041f23562998b4b2ec0dfca10bba643ba9582fca68b1a" Mar 11 12:02:51 crc kubenswrapper[4816]: E0311 12:02:51.597191 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"010629f9f0bfab0e230041f23562998b4b2ec0dfca10bba643ba9582fca68b1a\": container with ID starting with 010629f9f0bfab0e230041f23562998b4b2ec0dfca10bba643ba9582fca68b1a not found: ID does not exist" containerID="010629f9f0bfab0e230041f23562998b4b2ec0dfca10bba643ba9582fca68b1a" Mar 11 12:02:51 crc kubenswrapper[4816]: I0311 12:02:51.597270 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"010629f9f0bfab0e230041f23562998b4b2ec0dfca10bba643ba9582fca68b1a"} err="failed to get container status \"010629f9f0bfab0e230041f23562998b4b2ec0dfca10bba643ba9582fca68b1a\": rpc error: code = NotFound desc = could not find container \"010629f9f0bfab0e230041f23562998b4b2ec0dfca10bba643ba9582fca68b1a\": container with ID starting with 010629f9f0bfab0e230041f23562998b4b2ec0dfca10bba643ba9582fca68b1a not found: ID does not exist" Mar 11 12:02:51 crc kubenswrapper[4816]: I0311 12:02:51.608668 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 11 12:02:51 crc kubenswrapper[4816]: I0311 12:02:51.608714 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 11 
12:02:51 crc kubenswrapper[4816]: I0311 12:02:51.608770 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 11 12:02:51 crc kubenswrapper[4816]: I0311 12:02:51.608817 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 11 12:02:51 crc kubenswrapper[4816]: I0311 12:02:51.608839 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 11 12:02:51 crc kubenswrapper[4816]: I0311 12:02:51.609089 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:02:51 crc kubenswrapper[4816]: I0311 12:02:51.609130 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:02:51 crc kubenswrapper[4816]: I0311 12:02:51.609154 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:02:51 crc kubenswrapper[4816]: I0311 12:02:51.609175 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:02:51 crc kubenswrapper[4816]: I0311 12:02:51.614413 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:02:51 crc kubenswrapper[4816]: I0311 12:02:51.710267 4816 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 11 12:02:51 crc kubenswrapper[4816]: I0311 12:02:51.710307 4816 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 11 12:02:51 crc kubenswrapper[4816]: I0311 12:02:51.710320 4816 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 11 12:02:51 crc kubenswrapper[4816]: I0311 12:02:51.710332 4816 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 11 12:02:51 crc kubenswrapper[4816]: I0311 12:02:51.710346 4816 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 11 12:02:52 crc kubenswrapper[4816]: I0311 12:02:52.137546 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 11 12:03:01 crc kubenswrapper[4816]: I0311 12:03:01.638044 4816 generic.go:334] "Generic (PLEG): container finished" podID="1f8d6149-c5b0-4088-9db5-eeed2eef6ce6" containerID="8656da7afe12612a591590b5842a75afd40668d9dd72d7b01fcb55c35787a0e1" exitCode=0 Mar 11 12:03:01 crc kubenswrapper[4816]: I0311 12:03:01.638173 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4" event={"ID":"1f8d6149-c5b0-4088-9db5-eeed2eef6ce6","Type":"ContainerDied","Data":"8656da7afe12612a591590b5842a75afd40668d9dd72d7b01fcb55c35787a0e1"} Mar 11 12:03:01 crc kubenswrapper[4816]: I0311 12:03:01.638902 4816 scope.go:117] "RemoveContainer" containerID="8656da7afe12612a591590b5842a75afd40668d9dd72d7b01fcb55c35787a0e1" Mar 11 12:03:02 crc kubenswrapper[4816]: I0311 12:03:02.646737 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4" event={"ID":"1f8d6149-c5b0-4088-9db5-eeed2eef6ce6","Type":"ContainerStarted","Data":"a866a70345c8481e25e2f58460d24a8a5c95dd9260f5acf809685fe8295ea5eb"} Mar 11 12:03:02 crc kubenswrapper[4816]: I0311 12:03:02.647149 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4" Mar 11 12:03:02 crc kubenswrapper[4816]: I0311 12:03:02.648406 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4" Mar 11 12:03:09 crc kubenswrapper[4816]: I0311 12:03:09.514957 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:03:09 crc kubenswrapper[4816]: I0311 12:03:09.515694 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:03:09 crc kubenswrapper[4816]: I0311 12:03:09.515764 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:03:09 crc kubenswrapper[4816]: I0311 12:03:09.516548 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fcc062c271cd12993a2f94302ad7910d23ab33f9e9c36dd18bc3d6cf66582bc2"} pod="openshift-machine-config-operator/machine-config-daemon-b4v82" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 12:03:09 crc kubenswrapper[4816]: I0311 12:03:09.516605 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" containerID="cri-o://fcc062c271cd12993a2f94302ad7910d23ab33f9e9c36dd18bc3d6cf66582bc2" gracePeriod=600 Mar 11 12:03:09 crc kubenswrapper[4816]: I0311 12:03:09.684511 4816 generic.go:334] "Generic (PLEG): container finished" podID="7fdff21c-644f-4443-a268-f98c91ea120a" containerID="fcc062c271cd12993a2f94302ad7910d23ab33f9e9c36dd18bc3d6cf66582bc2" exitCode=0 Mar 11 12:03:09 crc kubenswrapper[4816]: I0311 12:03:09.684592 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerDied","Data":"fcc062c271cd12993a2f94302ad7910d23ab33f9e9c36dd18bc3d6cf66582bc2"} Mar 11 12:03:10 crc kubenswrapper[4816]: I0311 12:03:10.692466 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerStarted","Data":"a1d4bcf565d8188182640fd6dfae19dbf3e118e4747e8a92039a28e6b5c3c95c"} Mar 11 12:03:54 crc kubenswrapper[4816]: I0311 12:03:54.977177 4816 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-image-registry/image-registry-66df7c8f76-p4bcz"] Mar 11 12:03:54 crc kubenswrapper[4816]: E0311 12:03:54.977929 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 11 12:03:54 crc kubenswrapper[4816]: I0311 12:03:54.977941 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 11 12:03:54 crc kubenswrapper[4816]: E0311 12:03:54.977965 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba5c6602-69d6-46be-a23b-fb4d6290a974" containerName="oc" Mar 11 12:03:54 crc kubenswrapper[4816]: I0311 12:03:54.977972 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba5c6602-69d6-46be-a23b-fb4d6290a974" containerName="oc" Mar 11 12:03:54 crc kubenswrapper[4816]: I0311 12:03:54.978062 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba5c6602-69d6-46be-a23b-fb4d6290a974" containerName="oc" Mar 11 12:03:54 crc kubenswrapper[4816]: I0311 12:03:54.978074 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 11 12:03:54 crc kubenswrapper[4816]: I0311 12:03:54.978469 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz" Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.019748 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-p4bcz"] Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.105547 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3cb18b94-7487-4088-9435-6c312a8727c0-registry-tls\") pod \"image-registry-66df7c8f76-p4bcz\" (UID: \"3cb18b94-7487-4088-9435-6c312a8727c0\") " pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz" Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.105596 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3cb18b94-7487-4088-9435-6c312a8727c0-trusted-ca\") pod \"image-registry-66df7c8f76-p4bcz\" (UID: \"3cb18b94-7487-4088-9435-6c312a8727c0\") " pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz" Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.105623 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3cb18b94-7487-4088-9435-6c312a8727c0-registry-certificates\") pod \"image-registry-66df7c8f76-p4bcz\" (UID: \"3cb18b94-7487-4088-9435-6c312a8727c0\") " pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz" Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.105733 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9jkk\" (UniqueName: \"kubernetes.io/projected/3cb18b94-7487-4088-9435-6c312a8727c0-kube-api-access-q9jkk\") pod \"image-registry-66df7c8f76-p4bcz\" (UID: \"3cb18b94-7487-4088-9435-6c312a8727c0\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz" Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.105785 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-p4bcz\" (UID: \"3cb18b94-7487-4088-9435-6c312a8727c0\") " pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz" Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.105827 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3cb18b94-7487-4088-9435-6c312a8727c0-installation-pull-secrets\") pod \"image-registry-66df7c8f76-p4bcz\" (UID: \"3cb18b94-7487-4088-9435-6c312a8727c0\") " pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz" Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.105853 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3cb18b94-7487-4088-9435-6c312a8727c0-bound-sa-token\") pod \"image-registry-66df7c8f76-p4bcz\" (UID: \"3cb18b94-7487-4088-9435-6c312a8727c0\") " pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz" Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.105877 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3cb18b94-7487-4088-9435-6c312a8727c0-ca-trust-extracted\") pod \"image-registry-66df7c8f76-p4bcz\" (UID: \"3cb18b94-7487-4088-9435-6c312a8727c0\") " pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz" Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.133031 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-p4bcz\" (UID: \"3cb18b94-7487-4088-9435-6c312a8727c0\") " pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz" Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.207206 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3cb18b94-7487-4088-9435-6c312a8727c0-installation-pull-secrets\") pod \"image-registry-66df7c8f76-p4bcz\" (UID: \"3cb18b94-7487-4088-9435-6c312a8727c0\") " pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz" Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.207266 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3cb18b94-7487-4088-9435-6c312a8727c0-bound-sa-token\") pod \"image-registry-66df7c8f76-p4bcz\" (UID: \"3cb18b94-7487-4088-9435-6c312a8727c0\") " pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz" Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.207289 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3cb18b94-7487-4088-9435-6c312a8727c0-ca-trust-extracted\") pod \"image-registry-66df7c8f76-p4bcz\" (UID: \"3cb18b94-7487-4088-9435-6c312a8727c0\") " pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz" Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.207318 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3cb18b94-7487-4088-9435-6c312a8727c0-registry-tls\") pod \"image-registry-66df7c8f76-p4bcz\" (UID: \"3cb18b94-7487-4088-9435-6c312a8727c0\") " pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz" Mar 11 12:03:55 crc 
kubenswrapper[4816]: I0311 12:03:55.207341 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3cb18b94-7487-4088-9435-6c312a8727c0-trusted-ca\") pod \"image-registry-66df7c8f76-p4bcz\" (UID: \"3cb18b94-7487-4088-9435-6c312a8727c0\") " pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz" Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.207366 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3cb18b94-7487-4088-9435-6c312a8727c0-registry-certificates\") pod \"image-registry-66df7c8f76-p4bcz\" (UID: \"3cb18b94-7487-4088-9435-6c312a8727c0\") " pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz" Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.207404 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9jkk\" (UniqueName: \"kubernetes.io/projected/3cb18b94-7487-4088-9435-6c312a8727c0-kube-api-access-q9jkk\") pod \"image-registry-66df7c8f76-p4bcz\" (UID: \"3cb18b94-7487-4088-9435-6c312a8727c0\") " pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz" Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.207899 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3cb18b94-7487-4088-9435-6c312a8727c0-ca-trust-extracted\") pod \"image-registry-66df7c8f76-p4bcz\" (UID: \"3cb18b94-7487-4088-9435-6c312a8727c0\") " pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz" Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.209088 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3cb18b94-7487-4088-9435-6c312a8727c0-trusted-ca\") pod \"image-registry-66df7c8f76-p4bcz\" (UID: \"3cb18b94-7487-4088-9435-6c312a8727c0\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz" Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.209403 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3cb18b94-7487-4088-9435-6c312a8727c0-registry-certificates\") pod \"image-registry-66df7c8f76-p4bcz\" (UID: \"3cb18b94-7487-4088-9435-6c312a8727c0\") " pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz" Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.214384 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3cb18b94-7487-4088-9435-6c312a8727c0-installation-pull-secrets\") pod \"image-registry-66df7c8f76-p4bcz\" (UID: \"3cb18b94-7487-4088-9435-6c312a8727c0\") " pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz" Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.216860 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3cb18b94-7487-4088-9435-6c312a8727c0-registry-tls\") pod \"image-registry-66df7c8f76-p4bcz\" (UID: \"3cb18b94-7487-4088-9435-6c312a8727c0\") " pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz" Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.225725 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9jkk\" (UniqueName: \"kubernetes.io/projected/3cb18b94-7487-4088-9435-6c312a8727c0-kube-api-access-q9jkk\") pod \"image-registry-66df7c8f76-p4bcz\" (UID: \"3cb18b94-7487-4088-9435-6c312a8727c0\") " pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz" Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.231039 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3cb18b94-7487-4088-9435-6c312a8727c0-bound-sa-token\") pod 
\"image-registry-66df7c8f76-p4bcz\" (UID: \"3cb18b94-7487-4088-9435-6c312a8727c0\") " pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz" Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.305775 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz" Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.720793 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-p4bcz"] Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.974915 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz" event={"ID":"3cb18b94-7487-4088-9435-6c312a8727c0","Type":"ContainerStarted","Data":"23df1dfd8b0ca574380b93362401165bb7788016f363855e168a1a69cd2ff738"} Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.975053 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz" event={"ID":"3cb18b94-7487-4088-9435-6c312a8727c0","Type":"ContainerStarted","Data":"cb4411c5ac3bd87f28176dc09c2ef533cb18b2299541a493db052cf7bb9ccf20"} Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.977474 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz" Mar 11 12:03:56 crc kubenswrapper[4816]: I0311 12:03:56.012450 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz" podStartSLOduration=2.012435744 podStartE2EDuration="2.012435744s" podCreationTimestamp="2026-03-11 12:03:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:03:56.009751685 +0000 UTC m=+322.601015642" watchObservedRunningTime="2026-03-11 12:03:56.012435744 +0000 UTC m=+322.603699711" Mar 11 12:03:59 crc 
kubenswrapper[4816]: I0311 12:03:59.532785 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9fv28"] Mar 11 12:03:59 crc kubenswrapper[4816]: I0311 12:03:59.534085 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9fv28" podUID="8d6e662d-8633-4e55-baf3-50a2c4d179a1" containerName="registry-server" containerID="cri-o://362163a58c3530fa9e11ca63e8340195a0a89db0eb885f3d4d89779e7907bf98" gracePeriod=30 Mar 11 12:03:59 crc kubenswrapper[4816]: I0311 12:03:59.548769 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jwq6f"] Mar 11 12:03:59 crc kubenswrapper[4816]: I0311 12:03:59.549129 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jwq6f" podUID="fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e" containerName="registry-server" containerID="cri-o://33fa1abaf83df4647d38f4486b6eeacba9e46e0cce2fe298d46d9eed8b730783" gracePeriod=30 Mar 11 12:03:59 crc kubenswrapper[4816]: I0311 12:03:59.569329 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8gcm4"] Mar 11 12:03:59 crc kubenswrapper[4816]: I0311 12:03:59.569530 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4" podUID="1f8d6149-c5b0-4088-9db5-eeed2eef6ce6" containerName="marketplace-operator" containerID="cri-o://a866a70345c8481e25e2f58460d24a8a5c95dd9260f5acf809685fe8295ea5eb" gracePeriod=30 Mar 11 12:03:59 crc kubenswrapper[4816]: I0311 12:03:59.576655 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rlvrz"] Mar 11 12:03:59 crc kubenswrapper[4816]: I0311 12:03:59.576877 4816 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-marketplace-rlvrz" podUID="e94af1b5-09ef-433f-91e6-7b352836273d" containerName="registry-server" containerID="cri-o://37976cadeeaaabfe64d7c991892ed58c96dbf0539d9df222d8bb3ac68e91cc17" gracePeriod=30 Mar 11 12:03:59 crc kubenswrapper[4816]: I0311 12:03:59.588793 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jtm2c"] Mar 11 12:03:59 crc kubenswrapper[4816]: I0311 12:03:59.591625 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jtm2c" podUID="ce281163-d6c0-444b-ba55-b488dd77b853" containerName="registry-server" containerID="cri-o://b4ab0057fec3813a8eba57d93db34dca15d692fb5d18b567388c379b9637e53f" gracePeriod=30 Mar 11 12:03:59 crc kubenswrapper[4816]: I0311 12:03:59.604604 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-m586v"] Mar 11 12:03:59 crc kubenswrapper[4816]: I0311 12:03:59.605300 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-m586v" Mar 11 12:03:59 crc kubenswrapper[4816]: I0311 12:03:59.614234 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-m586v"] Mar 11 12:03:59 crc kubenswrapper[4816]: I0311 12:03:59.677901 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e86ee6f4-c5ee-40dd-8e60-977add936dc1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-m586v\" (UID: \"e86ee6f4-c5ee-40dd-8e60-977add936dc1\") " pod="openshift-marketplace/marketplace-operator-79b997595-m586v" Mar 11 12:03:59 crc kubenswrapper[4816]: I0311 12:03:59.677975 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fs29h\" (UniqueName: \"kubernetes.io/projected/e86ee6f4-c5ee-40dd-8e60-977add936dc1-kube-api-access-fs29h\") pod \"marketplace-operator-79b997595-m586v\" (UID: \"e86ee6f4-c5ee-40dd-8e60-977add936dc1\") " pod="openshift-marketplace/marketplace-operator-79b997595-m586v" Mar 11 12:03:59 crc kubenswrapper[4816]: I0311 12:03:59.677999 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e86ee6f4-c5ee-40dd-8e60-977add936dc1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-m586v\" (UID: \"e86ee6f4-c5ee-40dd-8e60-977add936dc1\") " pod="openshift-marketplace/marketplace-operator-79b997595-m586v" Mar 11 12:03:59 crc kubenswrapper[4816]: I0311 12:03:59.779587 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e86ee6f4-c5ee-40dd-8e60-977add936dc1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-m586v\" (UID: 
\"e86ee6f4-c5ee-40dd-8e60-977add936dc1\") " pod="openshift-marketplace/marketplace-operator-79b997595-m586v" Mar 11 12:03:59 crc kubenswrapper[4816]: I0311 12:03:59.780171 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fs29h\" (UniqueName: \"kubernetes.io/projected/e86ee6f4-c5ee-40dd-8e60-977add936dc1-kube-api-access-fs29h\") pod \"marketplace-operator-79b997595-m586v\" (UID: \"e86ee6f4-c5ee-40dd-8e60-977add936dc1\") " pod="openshift-marketplace/marketplace-operator-79b997595-m586v" Mar 11 12:03:59 crc kubenswrapper[4816]: I0311 12:03:59.780956 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e86ee6f4-c5ee-40dd-8e60-977add936dc1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-m586v\" (UID: \"e86ee6f4-c5ee-40dd-8e60-977add936dc1\") " pod="openshift-marketplace/marketplace-operator-79b997595-m586v" Mar 11 12:03:59 crc kubenswrapper[4816]: I0311 12:03:59.782108 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e86ee6f4-c5ee-40dd-8e60-977add936dc1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-m586v\" (UID: \"e86ee6f4-c5ee-40dd-8e60-977add936dc1\") " pod="openshift-marketplace/marketplace-operator-79b997595-m586v" Mar 11 12:03:59 crc kubenswrapper[4816]: I0311 12:03:59.785385 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e86ee6f4-c5ee-40dd-8e60-977add936dc1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-m586v\" (UID: \"e86ee6f4-c5ee-40dd-8e60-977add936dc1\") " pod="openshift-marketplace/marketplace-operator-79b997595-m586v" Mar 11 12:03:59 crc kubenswrapper[4816]: I0311 12:03:59.800422 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-fs29h\" (UniqueName: \"kubernetes.io/projected/e86ee6f4-c5ee-40dd-8e60-977add936dc1-kube-api-access-fs29h\") pod \"marketplace-operator-79b997595-m586v\" (UID: \"e86ee6f4-c5ee-40dd-8e60-977add936dc1\") " pod="openshift-marketplace/marketplace-operator-79b997595-m586v" Mar 11 12:03:59 crc kubenswrapper[4816]: I0311 12:03:59.949551 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-m586v" Mar 11 12:03:59 crc kubenswrapper[4816]: I0311 12:03:59.961914 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9fv28" Mar 11 12:03:59 crc kubenswrapper[4816]: I0311 12:03:59.996127 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rlvrz" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.007187 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jwq6f" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.008057 4816 generic.go:334] "Generic (PLEG): container finished" podID="e94af1b5-09ef-433f-91e6-7b352836273d" containerID="37976cadeeaaabfe64d7c991892ed58c96dbf0539d9df222d8bb3ac68e91cc17" exitCode=0 Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.008115 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rlvrz" event={"ID":"e94af1b5-09ef-433f-91e6-7b352836273d","Type":"ContainerDied","Data":"37976cadeeaaabfe64d7c991892ed58c96dbf0539d9df222d8bb3ac68e91cc17"} Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.008138 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rlvrz" event={"ID":"e94af1b5-09ef-433f-91e6-7b352836273d","Type":"ContainerDied","Data":"a8cafecc50e94d07fe579d21307c54f31a39be731f47fedd9b733a84b5d89387"} Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.008156 4816 scope.go:117] "RemoveContainer" containerID="37976cadeeaaabfe64d7c991892ed58c96dbf0539d9df222d8bb3ac68e91cc17" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.008294 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rlvrz" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.022089 4816 generic.go:334] "Generic (PLEG): container finished" podID="fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e" containerID="33fa1abaf83df4647d38f4486b6eeacba9e46e0cce2fe298d46d9eed8b730783" exitCode=0 Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.022404 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jwq6f" event={"ID":"fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e","Type":"ContainerDied","Data":"33fa1abaf83df4647d38f4486b6eeacba9e46e0cce2fe298d46d9eed8b730783"} Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.022436 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jwq6f" event={"ID":"fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e","Type":"ContainerDied","Data":"499f7962c1697f289517091d9831d7c624088927518036ee83a281ffd5b62905"} Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.022510 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jwq6f" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.025096 4816 generic.go:334] "Generic (PLEG): container finished" podID="8d6e662d-8633-4e55-baf3-50a2c4d179a1" containerID="362163a58c3530fa9e11ca63e8340195a0a89db0eb885f3d4d89779e7907bf98" exitCode=0 Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.025329 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9fv28" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.025328 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9fv28" event={"ID":"8d6e662d-8633-4e55-baf3-50a2c4d179a1","Type":"ContainerDied","Data":"362163a58c3530fa9e11ca63e8340195a0a89db0eb885f3d4d89779e7907bf98"} Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.033051 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9fv28" event={"ID":"8d6e662d-8633-4e55-baf3-50a2c4d179a1","Type":"ContainerDied","Data":"e00a61b1b339e0c135f2f8629c96ed94976ec15fddfa98352c7a50768117327d"} Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.041296 4816 scope.go:117] "RemoveContainer" containerID="e4f153012f5a35062f99ba998ca453204f279e596452667abbcfbcf6da4a9941" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.045013 4816 generic.go:334] "Generic (PLEG): container finished" podID="ce281163-d6c0-444b-ba55-b488dd77b853" containerID="b4ab0057fec3813a8eba57d93db34dca15d692fb5d18b567388c379b9637e53f" exitCode=0 Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.045111 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtm2c" event={"ID":"ce281163-d6c0-444b-ba55-b488dd77b853","Type":"ContainerDied","Data":"b4ab0057fec3813a8eba57d93db34dca15d692fb5d18b567388c379b9637e53f"} Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.047225 4816 generic.go:334] "Generic (PLEG): container finished" podID="1f8d6149-c5b0-4088-9db5-eeed2eef6ce6" containerID="a866a70345c8481e25e2f58460d24a8a5c95dd9260f5acf809685fe8295ea5eb" exitCode=0 Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.047262 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4" 
event={"ID":"1f8d6149-c5b0-4088-9db5-eeed2eef6ce6","Type":"ContainerDied","Data":"a866a70345c8481e25e2f58460d24a8a5c95dd9260f5acf809685fe8295ea5eb"} Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.053822 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jtm2c" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.059229 4816 scope.go:117] "RemoveContainer" containerID="d8fabca9b2997290f11fbf07232f8d58b3654ac767d5341fa694844063002fdc" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.079475 4816 scope.go:117] "RemoveContainer" containerID="37976cadeeaaabfe64d7c991892ed58c96dbf0539d9df222d8bb3ac68e91cc17" Mar 11 12:04:00 crc kubenswrapper[4816]: E0311 12:04:00.080337 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37976cadeeaaabfe64d7c991892ed58c96dbf0539d9df222d8bb3ac68e91cc17\": container with ID starting with 37976cadeeaaabfe64d7c991892ed58c96dbf0539d9df222d8bb3ac68e91cc17 not found: ID does not exist" containerID="37976cadeeaaabfe64d7c991892ed58c96dbf0539d9df222d8bb3ac68e91cc17" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.080378 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37976cadeeaaabfe64d7c991892ed58c96dbf0539d9df222d8bb3ac68e91cc17"} err="failed to get container status \"37976cadeeaaabfe64d7c991892ed58c96dbf0539d9df222d8bb3ac68e91cc17\": rpc error: code = NotFound desc = could not find container \"37976cadeeaaabfe64d7c991892ed58c96dbf0539d9df222d8bb3ac68e91cc17\": container with ID starting with 37976cadeeaaabfe64d7c991892ed58c96dbf0539d9df222d8bb3ac68e91cc17 not found: ID does not exist" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.080400 4816 scope.go:117] "RemoveContainer" containerID="e4f153012f5a35062f99ba998ca453204f279e596452667abbcfbcf6da4a9941" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 
12:04:00.080549 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4" Mar 11 12:04:00 crc kubenswrapper[4816]: E0311 12:04:00.080816 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4f153012f5a35062f99ba998ca453204f279e596452667abbcfbcf6da4a9941\": container with ID starting with e4f153012f5a35062f99ba998ca453204f279e596452667abbcfbcf6da4a9941 not found: ID does not exist" containerID="e4f153012f5a35062f99ba998ca453204f279e596452667abbcfbcf6da4a9941" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.080846 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4f153012f5a35062f99ba998ca453204f279e596452667abbcfbcf6da4a9941"} err="failed to get container status \"e4f153012f5a35062f99ba998ca453204f279e596452667abbcfbcf6da4a9941\": rpc error: code = NotFound desc = could not find container \"e4f153012f5a35062f99ba998ca453204f279e596452667abbcfbcf6da4a9941\": container with ID starting with e4f153012f5a35062f99ba998ca453204f279e596452667abbcfbcf6da4a9941 not found: ID does not exist" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.080860 4816 scope.go:117] "RemoveContainer" containerID="d8fabca9b2997290f11fbf07232f8d58b3654ac767d5341fa694844063002fdc" Mar 11 12:04:00 crc kubenswrapper[4816]: E0311 12:04:00.081226 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8fabca9b2997290f11fbf07232f8d58b3654ac767d5341fa694844063002fdc\": container with ID starting with d8fabca9b2997290f11fbf07232f8d58b3654ac767d5341fa694844063002fdc not found: ID does not exist" containerID="d8fabca9b2997290f11fbf07232f8d58b3654ac767d5341fa694844063002fdc" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.081283 4816 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d8fabca9b2997290f11fbf07232f8d58b3654ac767d5341fa694844063002fdc"} err="failed to get container status \"d8fabca9b2997290f11fbf07232f8d58b3654ac767d5341fa694844063002fdc\": rpc error: code = NotFound desc = could not find container \"d8fabca9b2997290f11fbf07232f8d58b3654ac767d5341fa694844063002fdc\": container with ID starting with d8fabca9b2997290f11fbf07232f8d58b3654ac767d5341fa694844063002fdc not found: ID does not exist" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.081315 4816 scope.go:117] "RemoveContainer" containerID="33fa1abaf83df4647d38f4486b6eeacba9e46e0cce2fe298d46d9eed8b730783" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.086569 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fz9fg\" (UniqueName: \"kubernetes.io/projected/8d6e662d-8633-4e55-baf3-50a2c4d179a1-kube-api-access-fz9fg\") pod \"8d6e662d-8633-4e55-baf3-50a2c4d179a1\" (UID: \"8d6e662d-8633-4e55-baf3-50a2c4d179a1\") " Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.086647 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xchpf\" (UniqueName: \"kubernetes.io/projected/fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e-kube-api-access-xchpf\") pod \"fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e\" (UID: \"fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e\") " Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.086673 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lvwq\" (UniqueName: \"kubernetes.io/projected/e94af1b5-09ef-433f-91e6-7b352836273d-kube-api-access-5lvwq\") pod \"e94af1b5-09ef-433f-91e6-7b352836273d\" (UID: \"e94af1b5-09ef-433f-91e6-7b352836273d\") " Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.086710 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/e94af1b5-09ef-433f-91e6-7b352836273d-catalog-content\") pod \"e94af1b5-09ef-433f-91e6-7b352836273d\" (UID: \"e94af1b5-09ef-433f-91e6-7b352836273d\") " Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.086746 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d6e662d-8633-4e55-baf3-50a2c4d179a1-catalog-content\") pod \"8d6e662d-8633-4e55-baf3-50a2c4d179a1\" (UID: \"8d6e662d-8633-4e55-baf3-50a2c4d179a1\") " Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.086790 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e-catalog-content\") pod \"fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e\" (UID: \"fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e\") " Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.086810 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e-utilities\") pod \"fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e\" (UID: \"fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e\") " Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.086827 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e94af1b5-09ef-433f-91e6-7b352836273d-utilities\") pod \"e94af1b5-09ef-433f-91e6-7b352836273d\" (UID: \"e94af1b5-09ef-433f-91e6-7b352836273d\") " Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.086852 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d6e662d-8633-4e55-baf3-50a2c4d179a1-utilities\") pod \"8d6e662d-8633-4e55-baf3-50a2c4d179a1\" (UID: \"8d6e662d-8633-4e55-baf3-50a2c4d179a1\") " Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.089858 4816 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d6e662d-8633-4e55-baf3-50a2c4d179a1-utilities" (OuterVolumeSpecName: "utilities") pod "8d6e662d-8633-4e55-baf3-50a2c4d179a1" (UID: "8d6e662d-8633-4e55-baf3-50a2c4d179a1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.091211 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e-utilities" (OuterVolumeSpecName: "utilities") pod "fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e" (UID: "fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.093909 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e94af1b5-09ef-433f-91e6-7b352836273d-utilities" (OuterVolumeSpecName: "utilities") pod "e94af1b5-09ef-433f-91e6-7b352836273d" (UID: "e94af1b5-09ef-433f-91e6-7b352836273d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.094822 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d6e662d-8633-4e55-baf3-50a2c4d179a1-kube-api-access-fz9fg" (OuterVolumeSpecName: "kube-api-access-fz9fg") pod "8d6e662d-8633-4e55-baf3-50a2c4d179a1" (UID: "8d6e662d-8633-4e55-baf3-50a2c4d179a1"). InnerVolumeSpecName "kube-api-access-fz9fg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.097296 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e-kube-api-access-xchpf" (OuterVolumeSpecName: "kube-api-access-xchpf") pod "fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e" (UID: "fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e"). InnerVolumeSpecName "kube-api-access-xchpf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.118319 4816 scope.go:117] "RemoveContainer" containerID="af230f77d632f2ef7272588c7105b3e41503277008887f3bbb0bdc946fb02247" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.140688 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e94af1b5-09ef-433f-91e6-7b352836273d-kube-api-access-5lvwq" (OuterVolumeSpecName: "kube-api-access-5lvwq") pod "e94af1b5-09ef-433f-91e6-7b352836273d" (UID: "e94af1b5-09ef-433f-91e6-7b352836273d"). InnerVolumeSpecName "kube-api-access-5lvwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.156833 4816 scope.go:117] "RemoveContainer" containerID="3f3e0d0db447a1ebe4e030b046c6446226abeed16eb18f4857b3f5d5fca2fdbd" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.164328 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e94af1b5-09ef-433f-91e6-7b352836273d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e94af1b5-09ef-433f-91e6-7b352836273d" (UID: "e94af1b5-09ef-433f-91e6-7b352836273d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.177845 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d6e662d-8633-4e55-baf3-50a2c4d179a1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8d6e662d-8633-4e55-baf3-50a2c4d179a1" (UID: "8d6e662d-8633-4e55-baf3-50a2c4d179a1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.181342 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553844-df99q"] Mar 11 12:04:00 crc kubenswrapper[4816]: E0311 12:04:00.181619 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e" containerName="extract-utilities" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.181636 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e" containerName="extract-utilities" Mar 11 12:04:00 crc kubenswrapper[4816]: E0311 12:04:00.181653 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce281163-d6c0-444b-ba55-b488dd77b853" containerName="extract-utilities" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.181661 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce281163-d6c0-444b-ba55-b488dd77b853" containerName="extract-utilities" Mar 11 12:04:00 crc kubenswrapper[4816]: E0311 12:04:00.181672 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f8d6149-c5b0-4088-9db5-eeed2eef6ce6" containerName="marketplace-operator" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.181680 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f8d6149-c5b0-4088-9db5-eeed2eef6ce6" containerName="marketplace-operator" Mar 11 12:04:00 crc kubenswrapper[4816]: E0311 12:04:00.181692 4816 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="8d6e662d-8633-4e55-baf3-50a2c4d179a1" containerName="registry-server" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.181702 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d6e662d-8633-4e55-baf3-50a2c4d179a1" containerName="registry-server" Mar 11 12:04:00 crc kubenswrapper[4816]: E0311 12:04:00.181711 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e94af1b5-09ef-433f-91e6-7b352836273d" containerName="registry-server" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.181718 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="e94af1b5-09ef-433f-91e6-7b352836273d" containerName="registry-server" Mar 11 12:04:00 crc kubenswrapper[4816]: E0311 12:04:00.181728 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f8d6149-c5b0-4088-9db5-eeed2eef6ce6" containerName="marketplace-operator" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.181735 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f8d6149-c5b0-4088-9db5-eeed2eef6ce6" containerName="marketplace-operator" Mar 11 12:04:00 crc kubenswrapper[4816]: E0311 12:04:00.181746 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e" containerName="registry-server" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.181753 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e" containerName="registry-server" Mar 11 12:04:00 crc kubenswrapper[4816]: E0311 12:04:00.181761 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce281163-d6c0-444b-ba55-b488dd77b853" containerName="extract-content" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.181769 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce281163-d6c0-444b-ba55-b488dd77b853" containerName="extract-content" Mar 11 12:04:00 crc kubenswrapper[4816]: E0311 12:04:00.181817 4816 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="ce281163-d6c0-444b-ba55-b488dd77b853" containerName="registry-server" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.181826 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce281163-d6c0-444b-ba55-b488dd77b853" containerName="registry-server" Mar 11 12:04:00 crc kubenswrapper[4816]: E0311 12:04:00.181837 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d6e662d-8633-4e55-baf3-50a2c4d179a1" containerName="extract-utilities" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.181845 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d6e662d-8633-4e55-baf3-50a2c4d179a1" containerName="extract-utilities" Mar 11 12:04:00 crc kubenswrapper[4816]: E0311 12:04:00.181856 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e" containerName="extract-content" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.181865 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e" containerName="extract-content" Mar 11 12:04:00 crc kubenswrapper[4816]: E0311 12:04:00.181877 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d6e662d-8633-4e55-baf3-50a2c4d179a1" containerName="extract-content" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.181885 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d6e662d-8633-4e55-baf3-50a2c4d179a1" containerName="extract-content" Mar 11 12:04:00 crc kubenswrapper[4816]: E0311 12:04:00.181893 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e94af1b5-09ef-433f-91e6-7b352836273d" containerName="extract-content" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.181902 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="e94af1b5-09ef-433f-91e6-7b352836273d" containerName="extract-content" Mar 11 12:04:00 crc kubenswrapper[4816]: E0311 12:04:00.181912 4816 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e94af1b5-09ef-433f-91e6-7b352836273d" containerName="extract-utilities" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.181920 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="e94af1b5-09ef-433f-91e6-7b352836273d" containerName="extract-utilities" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.182033 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f8d6149-c5b0-4088-9db5-eeed2eef6ce6" containerName="marketplace-operator" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.182046 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d6e662d-8633-4e55-baf3-50a2c4d179a1" containerName="registry-server" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.182061 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce281163-d6c0-444b-ba55-b488dd77b853" containerName="registry-server" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.182074 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="e94af1b5-09ef-433f-91e6-7b352836273d" containerName="registry-server" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.182088 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e" containerName="registry-server" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.182567 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553844-df99q" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.184850 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.185118 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.185273 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.187183 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553844-df99q"] Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.187870 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thth9\" (UniqueName: \"kubernetes.io/projected/ce281163-d6c0-444b-ba55-b488dd77b853-kube-api-access-thth9\") pod \"ce281163-d6c0-444b-ba55-b488dd77b853\" (UID: \"ce281163-d6c0-444b-ba55-b488dd77b853\") " Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.187938 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce281163-d6c0-444b-ba55-b488dd77b853-utilities\") pod \"ce281163-d6c0-444b-ba55-b488dd77b853\" (UID: \"ce281163-d6c0-444b-ba55-b488dd77b853\") " Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.187976 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1f8d6149-c5b0-4088-9db5-eeed2eef6ce6-marketplace-operator-metrics\") pod \"1f8d6149-c5b0-4088-9db5-eeed2eef6ce6\" (UID: \"1f8d6149-c5b0-4088-9db5-eeed2eef6ce6\") " Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.188008 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1f8d6149-c5b0-4088-9db5-eeed2eef6ce6-marketplace-trusted-ca\") pod \"1f8d6149-c5b0-4088-9db5-eeed2eef6ce6\" (UID: \"1f8d6149-c5b0-4088-9db5-eeed2eef6ce6\") " Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.188049 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce281163-d6c0-444b-ba55-b488dd77b853-catalog-content\") pod \"ce281163-d6c0-444b-ba55-b488dd77b853\" (UID: \"ce281163-d6c0-444b-ba55-b488dd77b853\") " Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.188073 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86nkz\" (UniqueName: \"kubernetes.io/projected/1f8d6149-c5b0-4088-9db5-eeed2eef6ce6-kube-api-access-86nkz\") pod \"1f8d6149-c5b0-4088-9db5-eeed2eef6ce6\" (UID: \"1f8d6149-c5b0-4088-9db5-eeed2eef6ce6\") " Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.188373 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xchpf\" (UniqueName: \"kubernetes.io/projected/fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e-kube-api-access-xchpf\") on node \"crc\" DevicePath \"\"" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.188389 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lvwq\" (UniqueName: \"kubernetes.io/projected/e94af1b5-09ef-433f-91e6-7b352836273d-kube-api-access-5lvwq\") on node \"crc\" DevicePath \"\"" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.188401 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e94af1b5-09ef-433f-91e6-7b352836273d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.188412 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/8d6e662d-8633-4e55-baf3-50a2c4d179a1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.188422 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.188433 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e94af1b5-09ef-433f-91e6-7b352836273d-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.188443 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d6e662d-8633-4e55-baf3-50a2c4d179a1-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.188454 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fz9fg\" (UniqueName: \"kubernetes.io/projected/8d6e662d-8633-4e55-baf3-50a2c4d179a1-kube-api-access-fz9fg\") on node \"crc\" DevicePath \"\"" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.191093 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f8d6149-c5b0-4088-9db5-eeed2eef6ce6-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "1f8d6149-c5b0-4088-9db5-eeed2eef6ce6" (UID: "1f8d6149-c5b0-4088-9db5-eeed2eef6ce6"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.191642 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f8d6149-c5b0-4088-9db5-eeed2eef6ce6-kube-api-access-86nkz" (OuterVolumeSpecName: "kube-api-access-86nkz") pod "1f8d6149-c5b0-4088-9db5-eeed2eef6ce6" (UID: "1f8d6149-c5b0-4088-9db5-eeed2eef6ce6"). InnerVolumeSpecName "kube-api-access-86nkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.192123 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f8d6149-c5b0-4088-9db5-eeed2eef6ce6-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "1f8d6149-c5b0-4088-9db5-eeed2eef6ce6" (UID: "1f8d6149-c5b0-4088-9db5-eeed2eef6ce6"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.192229 4816 scope.go:117] "RemoveContainer" containerID="33fa1abaf83df4647d38f4486b6eeacba9e46e0cce2fe298d46d9eed8b730783" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.192300 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce281163-d6c0-444b-ba55-b488dd77b853-utilities" (OuterVolumeSpecName: "utilities") pod "ce281163-d6c0-444b-ba55-b488dd77b853" (UID: "ce281163-d6c0-444b-ba55-b488dd77b853"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.199673 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e" (UID: "fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:04:00 crc kubenswrapper[4816]: E0311 12:04:00.200174 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33fa1abaf83df4647d38f4486b6eeacba9e46e0cce2fe298d46d9eed8b730783\": container with ID starting with 33fa1abaf83df4647d38f4486b6eeacba9e46e0cce2fe298d46d9eed8b730783 not found: ID does not exist" containerID="33fa1abaf83df4647d38f4486b6eeacba9e46e0cce2fe298d46d9eed8b730783" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.200199 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce281163-d6c0-444b-ba55-b488dd77b853-kube-api-access-thth9" (OuterVolumeSpecName: "kube-api-access-thth9") pod "ce281163-d6c0-444b-ba55-b488dd77b853" (UID: "ce281163-d6c0-444b-ba55-b488dd77b853"). InnerVolumeSpecName "kube-api-access-thth9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.200263 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33fa1abaf83df4647d38f4486b6eeacba9e46e0cce2fe298d46d9eed8b730783"} err="failed to get container status \"33fa1abaf83df4647d38f4486b6eeacba9e46e0cce2fe298d46d9eed8b730783\": rpc error: code = NotFound desc = could not find container \"33fa1abaf83df4647d38f4486b6eeacba9e46e0cce2fe298d46d9eed8b730783\": container with ID starting with 33fa1abaf83df4647d38f4486b6eeacba9e46e0cce2fe298d46d9eed8b730783 not found: ID does not exist" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.200290 4816 scope.go:117] "RemoveContainer" containerID="af230f77d632f2ef7272588c7105b3e41503277008887f3bbb0bdc946fb02247" Mar 11 12:04:00 crc kubenswrapper[4816]: E0311 12:04:00.200876 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"af230f77d632f2ef7272588c7105b3e41503277008887f3bbb0bdc946fb02247\": container with ID starting with af230f77d632f2ef7272588c7105b3e41503277008887f3bbb0bdc946fb02247 not found: ID does not exist" containerID="af230f77d632f2ef7272588c7105b3e41503277008887f3bbb0bdc946fb02247" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.200910 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af230f77d632f2ef7272588c7105b3e41503277008887f3bbb0bdc946fb02247"} err="failed to get container status \"af230f77d632f2ef7272588c7105b3e41503277008887f3bbb0bdc946fb02247\": rpc error: code = NotFound desc = could not find container \"af230f77d632f2ef7272588c7105b3e41503277008887f3bbb0bdc946fb02247\": container with ID starting with af230f77d632f2ef7272588c7105b3e41503277008887f3bbb0bdc946fb02247 not found: ID does not exist" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.200939 4816 scope.go:117] "RemoveContainer" containerID="3f3e0d0db447a1ebe4e030b046c6446226abeed16eb18f4857b3f5d5fca2fdbd" Mar 11 12:04:00 crc kubenswrapper[4816]: E0311 12:04:00.201230 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f3e0d0db447a1ebe4e030b046c6446226abeed16eb18f4857b3f5d5fca2fdbd\": container with ID starting with 3f3e0d0db447a1ebe4e030b046c6446226abeed16eb18f4857b3f5d5fca2fdbd not found: ID does not exist" containerID="3f3e0d0db447a1ebe4e030b046c6446226abeed16eb18f4857b3f5d5fca2fdbd" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.201295 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f3e0d0db447a1ebe4e030b046c6446226abeed16eb18f4857b3f5d5fca2fdbd"} err="failed to get container status \"3f3e0d0db447a1ebe4e030b046c6446226abeed16eb18f4857b3f5d5fca2fdbd\": rpc error: code = NotFound desc = could not find container \"3f3e0d0db447a1ebe4e030b046c6446226abeed16eb18f4857b3f5d5fca2fdbd\": container with ID 
starting with 3f3e0d0db447a1ebe4e030b046c6446226abeed16eb18f4857b3f5d5fca2fdbd not found: ID does not exist" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.201309 4816 scope.go:117] "RemoveContainer" containerID="362163a58c3530fa9e11ca63e8340195a0a89db0eb885f3d4d89779e7907bf98" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.218995 4816 scope.go:117] "RemoveContainer" containerID="4105f5db45375627892a708c76ac931ccf2827e2e9b64e788c33b62ca6ee5c17" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.238529 4816 scope.go:117] "RemoveContainer" containerID="1fff89a6c486a4c24e56579ae8348f4ab713b43ed67023000094d7aea36a80cb" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.255170 4816 scope.go:117] "RemoveContainer" containerID="362163a58c3530fa9e11ca63e8340195a0a89db0eb885f3d4d89779e7907bf98" Mar 11 12:04:00 crc kubenswrapper[4816]: E0311 12:04:00.255772 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"362163a58c3530fa9e11ca63e8340195a0a89db0eb885f3d4d89779e7907bf98\": container with ID starting with 362163a58c3530fa9e11ca63e8340195a0a89db0eb885f3d4d89779e7907bf98 not found: ID does not exist" containerID="362163a58c3530fa9e11ca63e8340195a0a89db0eb885f3d4d89779e7907bf98" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.255801 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"362163a58c3530fa9e11ca63e8340195a0a89db0eb885f3d4d89779e7907bf98"} err="failed to get container status \"362163a58c3530fa9e11ca63e8340195a0a89db0eb885f3d4d89779e7907bf98\": rpc error: code = NotFound desc = could not find container \"362163a58c3530fa9e11ca63e8340195a0a89db0eb885f3d4d89779e7907bf98\": container with ID starting with 362163a58c3530fa9e11ca63e8340195a0a89db0eb885f3d4d89779e7907bf98 not found: ID does not exist" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.255820 4816 scope.go:117] "RemoveContainer" 
containerID="4105f5db45375627892a708c76ac931ccf2827e2e9b64e788c33b62ca6ee5c17" Mar 11 12:04:00 crc kubenswrapper[4816]: E0311 12:04:00.256074 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4105f5db45375627892a708c76ac931ccf2827e2e9b64e788c33b62ca6ee5c17\": container with ID starting with 4105f5db45375627892a708c76ac931ccf2827e2e9b64e788c33b62ca6ee5c17 not found: ID does not exist" containerID="4105f5db45375627892a708c76ac931ccf2827e2e9b64e788c33b62ca6ee5c17" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.256348 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4105f5db45375627892a708c76ac931ccf2827e2e9b64e788c33b62ca6ee5c17"} err="failed to get container status \"4105f5db45375627892a708c76ac931ccf2827e2e9b64e788c33b62ca6ee5c17\": rpc error: code = NotFound desc = could not find container \"4105f5db45375627892a708c76ac931ccf2827e2e9b64e788c33b62ca6ee5c17\": container with ID starting with 4105f5db45375627892a708c76ac931ccf2827e2e9b64e788c33b62ca6ee5c17 not found: ID does not exist" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.256362 4816 scope.go:117] "RemoveContainer" containerID="1fff89a6c486a4c24e56579ae8348f4ab713b43ed67023000094d7aea36a80cb" Mar 11 12:04:00 crc kubenswrapper[4816]: E0311 12:04:00.256610 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fff89a6c486a4c24e56579ae8348f4ab713b43ed67023000094d7aea36a80cb\": container with ID starting with 1fff89a6c486a4c24e56579ae8348f4ab713b43ed67023000094d7aea36a80cb not found: ID does not exist" containerID="1fff89a6c486a4c24e56579ae8348f4ab713b43ed67023000094d7aea36a80cb" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.256631 4816 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1fff89a6c486a4c24e56579ae8348f4ab713b43ed67023000094d7aea36a80cb"} err="failed to get container status \"1fff89a6c486a4c24e56579ae8348f4ab713b43ed67023000094d7aea36a80cb\": rpc error: code = NotFound desc = could not find container \"1fff89a6c486a4c24e56579ae8348f4ab713b43ed67023000094d7aea36a80cb\": container with ID starting with 1fff89a6c486a4c24e56579ae8348f4ab713b43ed67023000094d7aea36a80cb not found: ID does not exist" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.256643 4816 scope.go:117] "RemoveContainer" containerID="8656da7afe12612a591590b5842a75afd40668d9dd72d7b01fcb55c35787a0e1" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.290162 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67fg6\" (UniqueName: \"kubernetes.io/projected/16125795-8697-470d-bc37-1ab8f6e31af1-kube-api-access-67fg6\") pod \"auto-csr-approver-29553844-df99q\" (UID: \"16125795-8697-470d-bc37-1ab8f6e31af1\") " pod="openshift-infra/auto-csr-approver-29553844-df99q" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.290235 4816 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1f8d6149-c5b0-4088-9db5-eeed2eef6ce6-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.290258 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86nkz\" (UniqueName: \"kubernetes.io/projected/1f8d6149-c5b0-4088-9db5-eeed2eef6ce6-kube-api-access-86nkz\") on node \"crc\" DevicePath \"\"" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.290267 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thth9\" (UniqueName: \"kubernetes.io/projected/ce281163-d6c0-444b-ba55-b488dd77b853-kube-api-access-thth9\") on node \"crc\" DevicePath \"\"" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.290279 4816 
reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce281163-d6c0-444b-ba55-b488dd77b853-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.290287 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.290296 4816 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1f8d6149-c5b0-4088-9db5-eeed2eef6ce6-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.336306 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rlvrz"] Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.336536 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rlvrz"] Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.348602 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce281163-d6c0-444b-ba55-b488dd77b853-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ce281163-d6c0-444b-ba55-b488dd77b853" (UID: "ce281163-d6c0-444b-ba55-b488dd77b853"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.351165 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jwq6f"] Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.354695 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jwq6f"] Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.389927 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9fv28"] Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.391829 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67fg6\" (UniqueName: \"kubernetes.io/projected/16125795-8697-470d-bc37-1ab8f6e31af1-kube-api-access-67fg6\") pod \"auto-csr-approver-29553844-df99q\" (UID: \"16125795-8697-470d-bc37-1ab8f6e31af1\") " pod="openshift-infra/auto-csr-approver-29553844-df99q" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.391896 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce281163-d6c0-444b-ba55-b488dd77b853-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.391965 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9fv28"] Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.409999 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67fg6\" (UniqueName: \"kubernetes.io/projected/16125795-8697-470d-bc37-1ab8f6e31af1-kube-api-access-67fg6\") pod \"auto-csr-approver-29553844-df99q\" (UID: \"16125795-8697-470d-bc37-1ab8f6e31af1\") " pod="openshift-infra/auto-csr-approver-29553844-df99q" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.479397 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/marketplace-operator-79b997595-m586v"] Mar 11 12:04:00 crc kubenswrapper[4816]: W0311 12:04:00.483172 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode86ee6f4_c5ee_40dd_8e60_977add936dc1.slice/crio-ba0e1127a9976371e0a2243c7d72c560e01aaec629769241e8e2aab44dfcf7ee WatchSource:0}: Error finding container ba0e1127a9976371e0a2243c7d72c560e01aaec629769241e8e2aab44dfcf7ee: Status 404 returned error can't find the container with id ba0e1127a9976371e0a2243c7d72c560e01aaec629769241e8e2aab44dfcf7ee Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.502443 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553844-df99q" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.688796 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553844-df99q"] Mar 11 12:04:00 crc kubenswrapper[4816]: W0311 12:04:00.695456 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16125795_8697_470d_bc37_1ab8f6e31af1.slice/crio-ed45a5fe576914ea5f97f6cf2f2568eb5a9c46841daef8cba1a2cce70da3e5e9 WatchSource:0}: Error finding container ed45a5fe576914ea5f97f6cf2f2568eb5a9c46841daef8cba1a2cce70da3e5e9: Status 404 returned error can't find the container with id ed45a5fe576914ea5f97f6cf2f2568eb5a9c46841daef8cba1a2cce70da3e5e9 Mar 11 12:04:01 crc kubenswrapper[4816]: I0311 12:04:01.056913 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtm2c" event={"ID":"ce281163-d6c0-444b-ba55-b488dd77b853","Type":"ContainerDied","Data":"7e759bdc79b60bb3704deb1a705f04eeaaf47ec7245e831ed02fd21393a8ffe0"} Mar 11 12:04:01 crc kubenswrapper[4816]: I0311 12:04:01.056960 4816 scope.go:117] "RemoveContainer" 
containerID="b4ab0057fec3813a8eba57d93db34dca15d692fb5d18b567388c379b9637e53f" Mar 11 12:04:01 crc kubenswrapper[4816]: I0311 12:04:01.056978 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jtm2c" Mar 11 12:04:01 crc kubenswrapper[4816]: I0311 12:04:01.059629 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4" Mar 11 12:04:01 crc kubenswrapper[4816]: I0311 12:04:01.059930 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4" event={"ID":"1f8d6149-c5b0-4088-9db5-eeed2eef6ce6","Type":"ContainerDied","Data":"18da590f53c2a68db8ccc3639b30699431b029db82a4def3280157c1b87bba73"} Mar 11 12:04:01 crc kubenswrapper[4816]: I0311 12:04:01.062121 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553844-df99q" event={"ID":"16125795-8697-470d-bc37-1ab8f6e31af1","Type":"ContainerStarted","Data":"ed45a5fe576914ea5f97f6cf2f2568eb5a9c46841daef8cba1a2cce70da3e5e9"} Mar 11 12:04:01 crc kubenswrapper[4816]: I0311 12:04:01.065373 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-m586v" event={"ID":"e86ee6f4-c5ee-40dd-8e60-977add936dc1","Type":"ContainerStarted","Data":"027caf2d729990c1d9676988eefc8343957b31187ff5d9808f12331ab5090d22"} Mar 11 12:04:01 crc kubenswrapper[4816]: I0311 12:04:01.065399 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-m586v" event={"ID":"e86ee6f4-c5ee-40dd-8e60-977add936dc1","Type":"ContainerStarted","Data":"ba0e1127a9976371e0a2243c7d72c560e01aaec629769241e8e2aab44dfcf7ee"} Mar 11 12:04:01 crc kubenswrapper[4816]: I0311 12:04:01.065592 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/marketplace-operator-79b997595-m586v" Mar 11 12:04:01 crc kubenswrapper[4816]: I0311 12:04:01.071265 4816 scope.go:117] "RemoveContainer" containerID="a7c62a5bd8897be83a617ee46b4e99b960de1d3c06824d144ebcc6c092953124" Mar 11 12:04:01 crc kubenswrapper[4816]: I0311 12:04:01.071414 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-m586v" Mar 11 12:04:01 crc kubenswrapper[4816]: I0311 12:04:01.096348 4816 scope.go:117] "RemoveContainer" containerID="3ab1f4b901f51b92d05dc18c4be8f53411d27fe11dfae52c40ff6b519e7e0cea" Mar 11 12:04:01 crc kubenswrapper[4816]: I0311 12:04:01.108270 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-m586v" podStartSLOduration=2.108228275 podStartE2EDuration="2.108228275s" podCreationTimestamp="2026-03-11 12:03:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:04:01.098578984 +0000 UTC m=+327.689842951" watchObservedRunningTime="2026-03-11 12:04:01.108228275 +0000 UTC m=+327.699492242" Mar 11 12:04:01 crc kubenswrapper[4816]: I0311 12:04:01.121410 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jtm2c"] Mar 11 12:04:01 crc kubenswrapper[4816]: I0311 12:04:01.134236 4816 scope.go:117] "RemoveContainer" containerID="a866a70345c8481e25e2f58460d24a8a5c95dd9260f5acf809685fe8295ea5eb" Mar 11 12:04:01 crc kubenswrapper[4816]: I0311 12:04:01.151144 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jtm2c"] Mar 11 12:04:01 crc kubenswrapper[4816]: I0311 12:04:01.157820 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8gcm4"] Mar 11 12:04:01 crc kubenswrapper[4816]: I0311 12:04:01.161463 4816 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8gcm4"] Mar 11 12:04:02 crc kubenswrapper[4816]: I0311 12:04:02.138286 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f8d6149-c5b0-4088-9db5-eeed2eef6ce6" path="/var/lib/kubelet/pods/1f8d6149-c5b0-4088-9db5-eeed2eef6ce6/volumes" Mar 11 12:04:02 crc kubenswrapper[4816]: I0311 12:04:02.139051 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d6e662d-8633-4e55-baf3-50a2c4d179a1" path="/var/lib/kubelet/pods/8d6e662d-8633-4e55-baf3-50a2c4d179a1/volumes" Mar 11 12:04:02 crc kubenswrapper[4816]: I0311 12:04:02.139658 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce281163-d6c0-444b-ba55-b488dd77b853" path="/var/lib/kubelet/pods/ce281163-d6c0-444b-ba55-b488dd77b853/volumes" Mar 11 12:04:02 crc kubenswrapper[4816]: I0311 12:04:02.140786 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e94af1b5-09ef-433f-91e6-7b352836273d" path="/var/lib/kubelet/pods/e94af1b5-09ef-433f-91e6-7b352836273d/volumes" Mar 11 12:04:02 crc kubenswrapper[4816]: I0311 12:04:02.141412 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e" path="/var/lib/kubelet/pods/fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e/volumes" Mar 11 12:04:03 crc kubenswrapper[4816]: I0311 12:04:03.081908 4816 generic.go:334] "Generic (PLEG): container finished" podID="16125795-8697-470d-bc37-1ab8f6e31af1" containerID="3d89e5845eb14d7e6c90a432b751164398e20a4fe55d6026ce8f8ec622962660" exitCode=0 Mar 11 12:04:03 crc kubenswrapper[4816]: I0311 12:04:03.081990 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553844-df99q" event={"ID":"16125795-8697-470d-bc37-1ab8f6e31af1","Type":"ContainerDied","Data":"3d89e5845eb14d7e6c90a432b751164398e20a4fe55d6026ce8f8ec622962660"} Mar 11 12:04:04 crc kubenswrapper[4816]: I0311 
12:04:04.303311 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553844-df99q" Mar 11 12:04:04 crc kubenswrapper[4816]: I0311 12:04:04.451817 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67fg6\" (UniqueName: \"kubernetes.io/projected/16125795-8697-470d-bc37-1ab8f6e31af1-kube-api-access-67fg6\") pod \"16125795-8697-470d-bc37-1ab8f6e31af1\" (UID: \"16125795-8697-470d-bc37-1ab8f6e31af1\") " Mar 11 12:04:04 crc kubenswrapper[4816]: I0311 12:04:04.457469 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16125795-8697-470d-bc37-1ab8f6e31af1-kube-api-access-67fg6" (OuterVolumeSpecName: "kube-api-access-67fg6") pod "16125795-8697-470d-bc37-1ab8f6e31af1" (UID: "16125795-8697-470d-bc37-1ab8f6e31af1"). InnerVolumeSpecName "kube-api-access-67fg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:04:04 crc kubenswrapper[4816]: I0311 12:04:04.553428 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67fg6\" (UniqueName: \"kubernetes.io/projected/16125795-8697-470d-bc37-1ab8f6e31af1-kube-api-access-67fg6\") on node \"crc\" DevicePath \"\"" Mar 11 12:04:05 crc kubenswrapper[4816]: I0311 12:04:05.092693 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553844-df99q" event={"ID":"16125795-8697-470d-bc37-1ab8f6e31af1","Type":"ContainerDied","Data":"ed45a5fe576914ea5f97f6cf2f2568eb5a9c46841daef8cba1a2cce70da3e5e9"} Mar 11 12:04:05 crc kubenswrapper[4816]: I0311 12:04:05.092748 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed45a5fe576914ea5f97f6cf2f2568eb5a9c46841daef8cba1a2cce70da3e5e9" Mar 11 12:04:05 crc kubenswrapper[4816]: I0311 12:04:05.093074 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553844-df99q" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.170361 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qb5pd"] Mar 11 12:04:12 crc kubenswrapper[4816]: E0311 12:04:12.171205 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16125795-8697-470d-bc37-1ab8f6e31af1" containerName="oc" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.171221 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="16125795-8697-470d-bc37-1ab8f6e31af1" containerName="oc" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.171354 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f8d6149-c5b0-4088-9db5-eeed2eef6ce6" containerName="marketplace-operator" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.171370 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="16125795-8697-470d-bc37-1ab8f6e31af1" containerName="oc" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.172368 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qb5pd" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.174793 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.184105 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qb5pd"] Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.256994 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/963d27c0-f203-4997-aa60-ac73d2a54cc0-utilities\") pod \"community-operators-qb5pd\" (UID: \"963d27c0-f203-4997-aa60-ac73d2a54cc0\") " pod="openshift-marketplace/community-operators-qb5pd" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.257054 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/963d27c0-f203-4997-aa60-ac73d2a54cc0-catalog-content\") pod \"community-operators-qb5pd\" (UID: \"963d27c0-f203-4997-aa60-ac73d2a54cc0\") " pod="openshift-marketplace/community-operators-qb5pd" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.257098 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f44xj\" (UniqueName: \"kubernetes.io/projected/963d27c0-f203-4997-aa60-ac73d2a54cc0-kube-api-access-f44xj\") pod \"community-operators-qb5pd\" (UID: \"963d27c0-f203-4997-aa60-ac73d2a54cc0\") " pod="openshift-marketplace/community-operators-qb5pd" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.359815 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/963d27c0-f203-4997-aa60-ac73d2a54cc0-utilities\") pod \"community-operators-qb5pd\" (UID: 
\"963d27c0-f203-4997-aa60-ac73d2a54cc0\") " pod="openshift-marketplace/community-operators-qb5pd" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.359881 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/963d27c0-f203-4997-aa60-ac73d2a54cc0-catalog-content\") pod \"community-operators-qb5pd\" (UID: \"963d27c0-f203-4997-aa60-ac73d2a54cc0\") " pod="openshift-marketplace/community-operators-qb5pd" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.359934 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f44xj\" (UniqueName: \"kubernetes.io/projected/963d27c0-f203-4997-aa60-ac73d2a54cc0-kube-api-access-f44xj\") pod \"community-operators-qb5pd\" (UID: \"963d27c0-f203-4997-aa60-ac73d2a54cc0\") " pod="openshift-marketplace/community-operators-qb5pd" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.361012 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/963d27c0-f203-4997-aa60-ac73d2a54cc0-utilities\") pod \"community-operators-qb5pd\" (UID: \"963d27c0-f203-4997-aa60-ac73d2a54cc0\") " pod="openshift-marketplace/community-operators-qb5pd" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.361405 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/963d27c0-f203-4997-aa60-ac73d2a54cc0-catalog-content\") pod \"community-operators-qb5pd\" (UID: \"963d27c0-f203-4997-aa60-ac73d2a54cc0\") " pod="openshift-marketplace/community-operators-qb5pd" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.375840 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wlx2d"] Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.381329 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wlx2d" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.385644 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.387454 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f44xj\" (UniqueName: \"kubernetes.io/projected/963d27c0-f203-4997-aa60-ac73d2a54cc0-kube-api-access-f44xj\") pod \"community-operators-qb5pd\" (UID: \"963d27c0-f203-4997-aa60-ac73d2a54cc0\") " pod="openshift-marketplace/community-operators-qb5pd" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.393241 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wlx2d"] Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.460998 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d456b988-0480-49fc-9667-03c56b871abe-catalog-content\") pod \"certified-operators-wlx2d\" (UID: \"d456b988-0480-49fc-9667-03c56b871abe\") " pod="openshift-marketplace/certified-operators-wlx2d" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.461073 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d456b988-0480-49fc-9667-03c56b871abe-utilities\") pod \"certified-operators-wlx2d\" (UID: \"d456b988-0480-49fc-9667-03c56b871abe\") " pod="openshift-marketplace/certified-operators-wlx2d" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.461170 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfctf\" (UniqueName: \"kubernetes.io/projected/d456b988-0480-49fc-9667-03c56b871abe-kube-api-access-kfctf\") pod \"certified-operators-wlx2d\" (UID: 
\"d456b988-0480-49fc-9667-03c56b871abe\") " pod="openshift-marketplace/certified-operators-wlx2d" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.503238 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qb5pd" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.562039 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d456b988-0480-49fc-9667-03c56b871abe-catalog-content\") pod \"certified-operators-wlx2d\" (UID: \"d456b988-0480-49fc-9667-03c56b871abe\") " pod="openshift-marketplace/certified-operators-wlx2d" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.562510 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d456b988-0480-49fc-9667-03c56b871abe-utilities\") pod \"certified-operators-wlx2d\" (UID: \"d456b988-0480-49fc-9667-03c56b871abe\") " pod="openshift-marketplace/certified-operators-wlx2d" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.562664 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfctf\" (UniqueName: \"kubernetes.io/projected/d456b988-0480-49fc-9667-03c56b871abe-kube-api-access-kfctf\") pod \"certified-operators-wlx2d\" (UID: \"d456b988-0480-49fc-9667-03c56b871abe\") " pod="openshift-marketplace/certified-operators-wlx2d" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.562576 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d456b988-0480-49fc-9667-03c56b871abe-catalog-content\") pod \"certified-operators-wlx2d\" (UID: \"d456b988-0480-49fc-9667-03c56b871abe\") " pod="openshift-marketplace/certified-operators-wlx2d" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.562918 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d456b988-0480-49fc-9667-03c56b871abe-utilities\") pod \"certified-operators-wlx2d\" (UID: \"d456b988-0480-49fc-9667-03c56b871abe\") " pod="openshift-marketplace/certified-operators-wlx2d" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.588178 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfctf\" (UniqueName: \"kubernetes.io/projected/d456b988-0480-49fc-9667-03c56b871abe-kube-api-access-kfctf\") pod \"certified-operators-wlx2d\" (UID: \"d456b988-0480-49fc-9667-03c56b871abe\") " pod="openshift-marketplace/certified-operators-wlx2d" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.718391 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wlx2d" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.898852 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qb5pd"] Mar 11 12:04:12 crc kubenswrapper[4816]: W0311 12:04:12.900995 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod963d27c0_f203_4997_aa60_ac73d2a54cc0.slice/crio-5d89076f0fdd1a586d2d1d9d12f836502df9b389006d64897e25f5fabea5fa22 WatchSource:0}: Error finding container 5d89076f0fdd1a586d2d1d9d12f836502df9b389006d64897e25f5fabea5fa22: Status 404 returned error can't find the container with id 5d89076f0fdd1a586d2d1d9d12f836502df9b389006d64897e25f5fabea5fa22 Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.901315 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wlx2d"] Mar 11 12:04:12 crc kubenswrapper[4816]: W0311 12:04:12.907815 4816 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd456b988_0480_49fc_9667_03c56b871abe.slice/crio-e45e20ab411467d564ca8cae5d6389cdfcd6bd45b4eceaf7b963fe5e1fca9258 WatchSource:0}: Error finding container e45e20ab411467d564ca8cae5d6389cdfcd6bd45b4eceaf7b963fe5e1fca9258: Status 404 returned error can't find the container with id e45e20ab411467d564ca8cae5d6389cdfcd6bd45b4eceaf7b963fe5e1fca9258 Mar 11 12:04:13 crc kubenswrapper[4816]: I0311 12:04:13.136971 4816 generic.go:334] "Generic (PLEG): container finished" podID="d456b988-0480-49fc-9667-03c56b871abe" containerID="a3fc9fd4ff3edc6c5e4c1aac00b48ef8939da87b568fc529a3373bc8dbd6d7bf" exitCode=0 Mar 11 12:04:13 crc kubenswrapper[4816]: I0311 12:04:13.137045 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wlx2d" event={"ID":"d456b988-0480-49fc-9667-03c56b871abe","Type":"ContainerDied","Data":"a3fc9fd4ff3edc6c5e4c1aac00b48ef8939da87b568fc529a3373bc8dbd6d7bf"} Mar 11 12:04:13 crc kubenswrapper[4816]: I0311 12:04:13.137122 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wlx2d" event={"ID":"d456b988-0480-49fc-9667-03c56b871abe","Type":"ContainerStarted","Data":"e45e20ab411467d564ca8cae5d6389cdfcd6bd45b4eceaf7b963fe5e1fca9258"} Mar 11 12:04:13 crc kubenswrapper[4816]: I0311 12:04:13.139725 4816 generic.go:334] "Generic (PLEG): container finished" podID="963d27c0-f203-4997-aa60-ac73d2a54cc0" containerID="6e58f19a27ae3010beb47e8be328d7c7ee7c8f14b5f34d2213706b6f25097290" exitCode=0 Mar 11 12:04:13 crc kubenswrapper[4816]: I0311 12:04:13.139773 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qb5pd" event={"ID":"963d27c0-f203-4997-aa60-ac73d2a54cc0","Type":"ContainerDied","Data":"6e58f19a27ae3010beb47e8be328d7c7ee7c8f14b5f34d2213706b6f25097290"} Mar 11 12:04:13 crc kubenswrapper[4816]: I0311 12:04:13.139803 4816 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-qb5pd" event={"ID":"963d27c0-f203-4997-aa60-ac73d2a54cc0","Type":"ContainerStarted","Data":"5d89076f0fdd1a586d2d1d9d12f836502df9b389006d64897e25f5fabea5fa22"} Mar 11 12:04:14 crc kubenswrapper[4816]: I0311 12:04:14.147431 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qb5pd" event={"ID":"963d27c0-f203-4997-aa60-ac73d2a54cc0","Type":"ContainerStarted","Data":"8c235b1052133359e398ac00a2eee490f7a085338a2901f71eac6e872bda6cbf"} Mar 11 12:04:14 crc kubenswrapper[4816]: I0311 12:04:14.151198 4816 generic.go:334] "Generic (PLEG): container finished" podID="d456b988-0480-49fc-9667-03c56b871abe" containerID="6a05710b8cde6897fca0c639bd670ffbfb46d506d87933dfb04ba42f513fcb60" exitCode=0 Mar 11 12:04:14 crc kubenswrapper[4816]: I0311 12:04:14.151319 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wlx2d" event={"ID":"d456b988-0480-49fc-9667-03c56b871abe","Type":"ContainerDied","Data":"6a05710b8cde6897fca0c639bd670ffbfb46d506d87933dfb04ba42f513fcb60"} Mar 11 12:04:14 crc kubenswrapper[4816]: I0311 12:04:14.772126 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9f9jq"] Mar 11 12:04:14 crc kubenswrapper[4816]: I0311 12:04:14.775455 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9f9jq" Mar 11 12:04:14 crc kubenswrapper[4816]: I0311 12:04:14.779350 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 11 12:04:14 crc kubenswrapper[4816]: I0311 12:04:14.785308 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9f9jq"] Mar 11 12:04:14 crc kubenswrapper[4816]: I0311 12:04:14.898633 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/991327ed-0ad5-4161-a218-598e50bbafe9-utilities\") pod \"redhat-marketplace-9f9jq\" (UID: \"991327ed-0ad5-4161-a218-598e50bbafe9\") " pod="openshift-marketplace/redhat-marketplace-9f9jq" Mar 11 12:04:14 crc kubenswrapper[4816]: I0311 12:04:14.898725 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hkbz\" (UniqueName: \"kubernetes.io/projected/991327ed-0ad5-4161-a218-598e50bbafe9-kube-api-access-6hkbz\") pod \"redhat-marketplace-9f9jq\" (UID: \"991327ed-0ad5-4161-a218-598e50bbafe9\") " pod="openshift-marketplace/redhat-marketplace-9f9jq" Mar 11 12:04:14 crc kubenswrapper[4816]: I0311 12:04:14.898870 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/991327ed-0ad5-4161-a218-598e50bbafe9-catalog-content\") pod \"redhat-marketplace-9f9jq\" (UID: \"991327ed-0ad5-4161-a218-598e50bbafe9\") " pod="openshift-marketplace/redhat-marketplace-9f9jq" Mar 11 12:04:14 crc kubenswrapper[4816]: I0311 12:04:14.970527 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4czr8"] Mar 11 12:04:14 crc kubenswrapper[4816]: I0311 12:04:14.972077 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4czr8" Mar 11 12:04:14 crc kubenswrapper[4816]: I0311 12:04:14.976554 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 11 12:04:14 crc kubenswrapper[4816]: I0311 12:04:14.980007 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4czr8"] Mar 11 12:04:15 crc kubenswrapper[4816]: I0311 12:04:15.000637 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/991327ed-0ad5-4161-a218-598e50bbafe9-catalog-content\") pod \"redhat-marketplace-9f9jq\" (UID: \"991327ed-0ad5-4161-a218-598e50bbafe9\") " pod="openshift-marketplace/redhat-marketplace-9f9jq" Mar 11 12:04:15 crc kubenswrapper[4816]: I0311 12:04:15.000743 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/991327ed-0ad5-4161-a218-598e50bbafe9-utilities\") pod \"redhat-marketplace-9f9jq\" (UID: \"991327ed-0ad5-4161-a218-598e50bbafe9\") " pod="openshift-marketplace/redhat-marketplace-9f9jq" Mar 11 12:04:15 crc kubenswrapper[4816]: I0311 12:04:15.000789 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hkbz\" (UniqueName: \"kubernetes.io/projected/991327ed-0ad5-4161-a218-598e50bbafe9-kube-api-access-6hkbz\") pod \"redhat-marketplace-9f9jq\" (UID: \"991327ed-0ad5-4161-a218-598e50bbafe9\") " pod="openshift-marketplace/redhat-marketplace-9f9jq" Mar 11 12:04:15 crc kubenswrapper[4816]: I0311 12:04:15.001287 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/991327ed-0ad5-4161-a218-598e50bbafe9-catalog-content\") pod \"redhat-marketplace-9f9jq\" (UID: \"991327ed-0ad5-4161-a218-598e50bbafe9\") " 
pod="openshift-marketplace/redhat-marketplace-9f9jq" Mar 11 12:04:15 crc kubenswrapper[4816]: I0311 12:04:15.001410 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/991327ed-0ad5-4161-a218-598e50bbafe9-utilities\") pod \"redhat-marketplace-9f9jq\" (UID: \"991327ed-0ad5-4161-a218-598e50bbafe9\") " pod="openshift-marketplace/redhat-marketplace-9f9jq" Mar 11 12:04:15 crc kubenswrapper[4816]: I0311 12:04:15.036806 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hkbz\" (UniqueName: \"kubernetes.io/projected/991327ed-0ad5-4161-a218-598e50bbafe9-kube-api-access-6hkbz\") pod \"redhat-marketplace-9f9jq\" (UID: \"991327ed-0ad5-4161-a218-598e50bbafe9\") " pod="openshift-marketplace/redhat-marketplace-9f9jq" Mar 11 12:04:15 crc kubenswrapper[4816]: I0311 12:04:15.094659 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9f9jq" Mar 11 12:04:15 crc kubenswrapper[4816]: I0311 12:04:15.102023 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fpm7\" (UniqueName: \"kubernetes.io/projected/08bf2596-9393-42d3-9b76-461be3ee0c22-kube-api-access-7fpm7\") pod \"redhat-operators-4czr8\" (UID: \"08bf2596-9393-42d3-9b76-461be3ee0c22\") " pod="openshift-marketplace/redhat-operators-4czr8" Mar 11 12:04:15 crc kubenswrapper[4816]: I0311 12:04:15.102115 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08bf2596-9393-42d3-9b76-461be3ee0c22-utilities\") pod \"redhat-operators-4czr8\" (UID: \"08bf2596-9393-42d3-9b76-461be3ee0c22\") " pod="openshift-marketplace/redhat-operators-4czr8" Mar 11 12:04:15 crc kubenswrapper[4816]: I0311 12:04:15.102134 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08bf2596-9393-42d3-9b76-461be3ee0c22-catalog-content\") pod \"redhat-operators-4czr8\" (UID: \"08bf2596-9393-42d3-9b76-461be3ee0c22\") " pod="openshift-marketplace/redhat-operators-4czr8" Mar 11 12:04:15 crc kubenswrapper[4816]: I0311 12:04:15.165541 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wlx2d" event={"ID":"d456b988-0480-49fc-9667-03c56b871abe","Type":"ContainerStarted","Data":"034af007a8b56505d39ecedb921ade4c270666769f440b8289cd57067df8758f"} Mar 11 12:04:15 crc kubenswrapper[4816]: I0311 12:04:15.168825 4816 generic.go:334] "Generic (PLEG): container finished" podID="963d27c0-f203-4997-aa60-ac73d2a54cc0" containerID="8c235b1052133359e398ac00a2eee490f7a085338a2901f71eac6e872bda6cbf" exitCode=0 Mar 11 12:04:15 crc kubenswrapper[4816]: I0311 12:04:15.168881 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qb5pd" event={"ID":"963d27c0-f203-4997-aa60-ac73d2a54cc0","Type":"ContainerDied","Data":"8c235b1052133359e398ac00a2eee490f7a085338a2901f71eac6e872bda6cbf"} Mar 11 12:04:15 crc kubenswrapper[4816]: I0311 12:04:15.185178 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wlx2d" podStartSLOduration=1.6543980729999999 podStartE2EDuration="3.185161604s" podCreationTimestamp="2026-03-11 12:04:12 +0000 UTC" firstStartedPulling="2026-03-11 12:04:13.138604745 +0000 UTC m=+339.729868712" lastFinishedPulling="2026-03-11 12:04:14.669368276 +0000 UTC m=+341.260632243" observedRunningTime="2026-03-11 12:04:15.184749052 +0000 UTC m=+341.776013019" watchObservedRunningTime="2026-03-11 12:04:15.185161604 +0000 UTC m=+341.776425571" Mar 11 12:04:15 crc kubenswrapper[4816]: I0311 12:04:15.207355 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/08bf2596-9393-42d3-9b76-461be3ee0c22-utilities\") pod \"redhat-operators-4czr8\" (UID: \"08bf2596-9393-42d3-9b76-461be3ee0c22\") " pod="openshift-marketplace/redhat-operators-4czr8" Mar 11 12:04:15 crc kubenswrapper[4816]: I0311 12:04:15.207392 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08bf2596-9393-42d3-9b76-461be3ee0c22-catalog-content\") pod \"redhat-operators-4czr8\" (UID: \"08bf2596-9393-42d3-9b76-461be3ee0c22\") " pod="openshift-marketplace/redhat-operators-4czr8" Mar 11 12:04:15 crc kubenswrapper[4816]: I0311 12:04:15.207430 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fpm7\" (UniqueName: \"kubernetes.io/projected/08bf2596-9393-42d3-9b76-461be3ee0c22-kube-api-access-7fpm7\") pod \"redhat-operators-4czr8\" (UID: \"08bf2596-9393-42d3-9b76-461be3ee0c22\") " pod="openshift-marketplace/redhat-operators-4czr8" Mar 11 12:04:15 crc kubenswrapper[4816]: I0311 12:04:15.208206 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08bf2596-9393-42d3-9b76-461be3ee0c22-utilities\") pod \"redhat-operators-4czr8\" (UID: \"08bf2596-9393-42d3-9b76-461be3ee0c22\") " pod="openshift-marketplace/redhat-operators-4czr8" Mar 11 12:04:15 crc kubenswrapper[4816]: I0311 12:04:15.208434 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08bf2596-9393-42d3-9b76-461be3ee0c22-catalog-content\") pod \"redhat-operators-4czr8\" (UID: \"08bf2596-9393-42d3-9b76-461be3ee0c22\") " pod="openshift-marketplace/redhat-operators-4czr8" Mar 11 12:04:15 crc kubenswrapper[4816]: I0311 12:04:15.246434 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fpm7\" (UniqueName: 
\"kubernetes.io/projected/08bf2596-9393-42d3-9b76-461be3ee0c22-kube-api-access-7fpm7\") pod \"redhat-operators-4czr8\" (UID: \"08bf2596-9393-42d3-9b76-461be3ee0c22\") " pod="openshift-marketplace/redhat-operators-4czr8" Mar 11 12:04:15 crc kubenswrapper[4816]: I0311 12:04:15.315820 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz" Mar 11 12:04:15 crc kubenswrapper[4816]: I0311 12:04:15.347303 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4czr8" Mar 11 12:04:15 crc kubenswrapper[4816]: I0311 12:04:15.372692 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-p426k"] Mar 11 12:04:15 crc kubenswrapper[4816]: I0311 12:04:15.563957 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4czr8"] Mar 11 12:04:15 crc kubenswrapper[4816]: I0311 12:04:15.596507 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9f9jq"] Mar 11 12:04:15 crc kubenswrapper[4816]: W0311 12:04:15.602775 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod991327ed_0ad5_4161_a218_598e50bbafe9.slice/crio-d0036afadbd6ee40c4d716a9f9bbe951341dd42bc2d4900d369407d3de2aaca0 WatchSource:0}: Error finding container d0036afadbd6ee40c4d716a9f9bbe951341dd42bc2d4900d369407d3de2aaca0: Status 404 returned error can't find the container with id d0036afadbd6ee40c4d716a9f9bbe951341dd42bc2d4900d369407d3de2aaca0 Mar 11 12:04:16 crc kubenswrapper[4816]: I0311 12:04:16.179353 4816 generic.go:334] "Generic (PLEG): container finished" podID="991327ed-0ad5-4161-a218-598e50bbafe9" containerID="276b5a20d3bef5cecb98f046cf1ab8c761311214dde470e9bf10f1198b21e2c2" exitCode=0 Mar 11 12:04:16 crc kubenswrapper[4816]: I0311 12:04:16.179637 4816 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9f9jq" event={"ID":"991327ed-0ad5-4161-a218-598e50bbafe9","Type":"ContainerDied","Data":"276b5a20d3bef5cecb98f046cf1ab8c761311214dde470e9bf10f1198b21e2c2"} Mar 11 12:04:16 crc kubenswrapper[4816]: I0311 12:04:16.179804 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9f9jq" event={"ID":"991327ed-0ad5-4161-a218-598e50bbafe9","Type":"ContainerStarted","Data":"d0036afadbd6ee40c4d716a9f9bbe951341dd42bc2d4900d369407d3de2aaca0"} Mar 11 12:04:16 crc kubenswrapper[4816]: I0311 12:04:16.184024 4816 generic.go:334] "Generic (PLEG): container finished" podID="08bf2596-9393-42d3-9b76-461be3ee0c22" containerID="f6e708c07beedc927d88fe860160fe134356afe7b3a445a39a8afeb3d7fe107a" exitCode=0 Mar 11 12:04:16 crc kubenswrapper[4816]: I0311 12:04:16.184244 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4czr8" event={"ID":"08bf2596-9393-42d3-9b76-461be3ee0c22","Type":"ContainerDied","Data":"f6e708c07beedc927d88fe860160fe134356afe7b3a445a39a8afeb3d7fe107a"} Mar 11 12:04:16 crc kubenswrapper[4816]: I0311 12:04:16.184326 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4czr8" event={"ID":"08bf2596-9393-42d3-9b76-461be3ee0c22","Type":"ContainerStarted","Data":"0bfb7b65242e9132f4eeb09297c51306c673c57d0632f40983551ce70feb2ca5"} Mar 11 12:04:17 crc kubenswrapper[4816]: I0311 12:04:17.191071 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qb5pd" event={"ID":"963d27c0-f203-4997-aa60-ac73d2a54cc0","Type":"ContainerStarted","Data":"df036ebc1022629bd7df15b57ae8610b239015cc838a88645d459c84c864e336"} Mar 11 12:04:17 crc kubenswrapper[4816]: I0311 12:04:17.193101 4816 generic.go:334] "Generic (PLEG): container finished" podID="991327ed-0ad5-4161-a218-598e50bbafe9" 
containerID="3a5a38f455f732668a196f091e014f4ac2c18dd8d3f055e916189b87f4ab5984" exitCode=0 Mar 11 12:04:17 crc kubenswrapper[4816]: I0311 12:04:17.193153 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9f9jq" event={"ID":"991327ed-0ad5-4161-a218-598e50bbafe9","Type":"ContainerDied","Data":"3a5a38f455f732668a196f091e014f4ac2c18dd8d3f055e916189b87f4ab5984"} Mar 11 12:04:17 crc kubenswrapper[4816]: I0311 12:04:17.209268 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qb5pd" podStartSLOduration=2.228415326 podStartE2EDuration="5.209235396s" podCreationTimestamp="2026-03-11 12:04:12 +0000 UTC" firstStartedPulling="2026-03-11 12:04:13.141173712 +0000 UTC m=+339.732437679" lastFinishedPulling="2026-03-11 12:04:16.121993782 +0000 UTC m=+342.713257749" observedRunningTime="2026-03-11 12:04:17.206313797 +0000 UTC m=+343.797577764" watchObservedRunningTime="2026-03-11 12:04:17.209235396 +0000 UTC m=+343.800499363" Mar 11 12:04:18 crc kubenswrapper[4816]: I0311 12:04:18.200475 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9f9jq" event={"ID":"991327ed-0ad5-4161-a218-598e50bbafe9","Type":"ContainerStarted","Data":"1dcfedce7d5dd95fbbcd4585d201214260e6231ba9c368710f19185b969935ea"} Mar 11 12:04:18 crc kubenswrapper[4816]: I0311 12:04:18.202069 4816 generic.go:334] "Generic (PLEG): container finished" podID="08bf2596-9393-42d3-9b76-461be3ee0c22" containerID="779fac97e6262ed086b95c9507f877a519a9ebc4041e2dc8f6025e304e1b6964" exitCode=0 Mar 11 12:04:18 crc kubenswrapper[4816]: I0311 12:04:18.202125 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4czr8" event={"ID":"08bf2596-9393-42d3-9b76-461be3ee0c22","Type":"ContainerDied","Data":"779fac97e6262ed086b95c9507f877a519a9ebc4041e2dc8f6025e304e1b6964"} Mar 11 12:04:18 crc kubenswrapper[4816]: I0311 
12:04:18.217492 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9f9jq" podStartSLOduration=2.798062164 podStartE2EDuration="4.217473297s" podCreationTimestamp="2026-03-11 12:04:14 +0000 UTC" firstStartedPulling="2026-03-11 12:04:16.181394874 +0000 UTC m=+342.772658851" lastFinishedPulling="2026-03-11 12:04:17.600806017 +0000 UTC m=+344.192069984" observedRunningTime="2026-03-11 12:04:18.216146767 +0000 UTC m=+344.807410734" watchObservedRunningTime="2026-03-11 12:04:18.217473297 +0000 UTC m=+344.808737264" Mar 11 12:04:19 crc kubenswrapper[4816]: I0311 12:04:19.209361 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4czr8" event={"ID":"08bf2596-9393-42d3-9b76-461be3ee0c22","Type":"ContainerStarted","Data":"d99aa505afb58c2dbaf5d0203d020f8764555084a9ed6334aa20ae5cb8b3b88c"} Mar 11 12:04:19 crc kubenswrapper[4816]: I0311 12:04:19.228191 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4czr8" podStartSLOduration=2.7697303399999997 podStartE2EDuration="5.228177503s" podCreationTimestamp="2026-03-11 12:04:14 +0000 UTC" firstStartedPulling="2026-03-11 12:04:16.18624547 +0000 UTC m=+342.777509477" lastFinishedPulling="2026-03-11 12:04:18.644692683 +0000 UTC m=+345.235956640" observedRunningTime="2026-03-11 12:04:19.227014018 +0000 UTC m=+345.818277985" watchObservedRunningTime="2026-03-11 12:04:19.228177503 +0000 UTC m=+345.819441470" Mar 11 12:04:22 crc kubenswrapper[4816]: I0311 12:04:22.503636 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qb5pd" Mar 11 12:04:22 crc kubenswrapper[4816]: I0311 12:04:22.505407 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qb5pd" Mar 11 12:04:22 crc kubenswrapper[4816]: I0311 12:04:22.565696 4816 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qb5pd" Mar 11 12:04:22 crc kubenswrapper[4816]: I0311 12:04:22.718636 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wlx2d" Mar 11 12:04:22 crc kubenswrapper[4816]: I0311 12:04:22.718694 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wlx2d" Mar 11 12:04:22 crc kubenswrapper[4816]: I0311 12:04:22.759321 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wlx2d" Mar 11 12:04:23 crc kubenswrapper[4816]: I0311 12:04:23.273222 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wlx2d" Mar 11 12:04:23 crc kubenswrapper[4816]: I0311 12:04:23.275261 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qb5pd" Mar 11 12:04:25 crc kubenswrapper[4816]: I0311 12:04:25.095438 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9f9jq" Mar 11 12:04:25 crc kubenswrapper[4816]: I0311 12:04:25.095710 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9f9jq" Mar 11 12:04:25 crc kubenswrapper[4816]: I0311 12:04:25.141729 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9f9jq" Mar 11 12:04:25 crc kubenswrapper[4816]: I0311 12:04:25.286235 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9f9jq" Mar 11 12:04:25 crc kubenswrapper[4816]: I0311 12:04:25.348257 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-4czr8" Mar 11 12:04:25 crc kubenswrapper[4816]: I0311 12:04:25.348344 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4czr8" Mar 11 12:04:26 crc kubenswrapper[4816]: I0311 12:04:26.392641 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4czr8" podUID="08bf2596-9393-42d3-9b76-461be3ee0c22" containerName="registry-server" probeResult="failure" output=< Mar 11 12:04:26 crc kubenswrapper[4816]: timeout: failed to connect service ":50051" within 1s Mar 11 12:04:26 crc kubenswrapper[4816]: > Mar 11 12:04:35 crc kubenswrapper[4816]: I0311 12:04:35.382506 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4czr8" Mar 11 12:04:35 crc kubenswrapper[4816]: I0311 12:04:35.432483 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4czr8" Mar 11 12:04:40 crc kubenswrapper[4816]: I0311 12:04:40.408853 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-p426k" podUID="9a7e3709-d407-4679-add6-375a835421be" containerName="registry" containerID="cri-o://29a5575a4698992467da37317f3822ef73493ffefd321500908342f4c01a8fdf" gracePeriod=30 Mar 11 12:04:40 crc kubenswrapper[4816]: I0311 12:04:40.735593 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:04:40 crc kubenswrapper[4816]: I0311 12:04:40.866033 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9a7e3709-d407-4679-add6-375a835421be-ca-trust-extracted\") pod \"9a7e3709-d407-4679-add6-375a835421be\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " Mar 11 12:04:40 crc kubenswrapper[4816]: I0311 12:04:40.866076 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9a7e3709-d407-4679-add6-375a835421be-trusted-ca\") pod \"9a7e3709-d407-4679-add6-375a835421be\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " Mar 11 12:04:40 crc kubenswrapper[4816]: I0311 12:04:40.866101 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9a7e3709-d407-4679-add6-375a835421be-installation-pull-secrets\") pod \"9a7e3709-d407-4679-add6-375a835421be\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " Mar 11 12:04:40 crc kubenswrapper[4816]: I0311 12:04:40.866286 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"9a7e3709-d407-4679-add6-375a835421be\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " Mar 11 12:04:40 crc kubenswrapper[4816]: I0311 12:04:40.866344 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9a7e3709-d407-4679-add6-375a835421be-registry-tls\") pod \"9a7e3709-d407-4679-add6-375a835421be\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " Mar 11 12:04:40 crc kubenswrapper[4816]: I0311 12:04:40.866386 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9a7e3709-d407-4679-add6-375a835421be-bound-sa-token\") pod \"9a7e3709-d407-4679-add6-375a835421be\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " Mar 11 12:04:40 crc kubenswrapper[4816]: I0311 12:04:40.866446 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qwwd\" (UniqueName: \"kubernetes.io/projected/9a7e3709-d407-4679-add6-375a835421be-kube-api-access-6qwwd\") pod \"9a7e3709-d407-4679-add6-375a835421be\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " Mar 11 12:04:40 crc kubenswrapper[4816]: I0311 12:04:40.867088 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9a7e3709-d407-4679-add6-375a835421be-registry-certificates\") pod \"9a7e3709-d407-4679-add6-375a835421be\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " Mar 11 12:04:40 crc kubenswrapper[4816]: I0311 12:04:40.868112 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a7e3709-d407-4679-add6-375a835421be-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9a7e3709-d407-4679-add6-375a835421be" (UID: "9a7e3709-d407-4679-add6-375a835421be"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:04:40 crc kubenswrapper[4816]: I0311 12:04:40.868398 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a7e3709-d407-4679-add6-375a835421be-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "9a7e3709-d407-4679-add6-375a835421be" (UID: "9a7e3709-d407-4679-add6-375a835421be"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:04:40 crc kubenswrapper[4816]: I0311 12:04:40.873571 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a7e3709-d407-4679-add6-375a835421be-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "9a7e3709-d407-4679-add6-375a835421be" (UID: "9a7e3709-d407-4679-add6-375a835421be"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:04:40 crc kubenswrapper[4816]: I0311 12:04:40.874137 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a7e3709-d407-4679-add6-375a835421be-kube-api-access-6qwwd" (OuterVolumeSpecName: "kube-api-access-6qwwd") pod "9a7e3709-d407-4679-add6-375a835421be" (UID: "9a7e3709-d407-4679-add6-375a835421be"). InnerVolumeSpecName "kube-api-access-6qwwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:04:40 crc kubenswrapper[4816]: I0311 12:04:40.875798 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a7e3709-d407-4679-add6-375a835421be-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "9a7e3709-d407-4679-add6-375a835421be" (UID: "9a7e3709-d407-4679-add6-375a835421be"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:04:40 crc kubenswrapper[4816]: I0311 12:04:40.877331 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a7e3709-d407-4679-add6-375a835421be-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "9a7e3709-d407-4679-add6-375a835421be" (UID: "9a7e3709-d407-4679-add6-375a835421be"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:04:40 crc kubenswrapper[4816]: I0311 12:04:40.877660 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "9a7e3709-d407-4679-add6-375a835421be" (UID: "9a7e3709-d407-4679-add6-375a835421be"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 11 12:04:40 crc kubenswrapper[4816]: I0311 12:04:40.882283 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a7e3709-d407-4679-add6-375a835421be-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "9a7e3709-d407-4679-add6-375a835421be" (UID: "9a7e3709-d407-4679-add6-375a835421be"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:04:40 crc kubenswrapper[4816]: I0311 12:04:40.968844 4816 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9a7e3709-d407-4679-add6-375a835421be-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 11 12:04:40 crc kubenswrapper[4816]: I0311 12:04:40.968872 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qwwd\" (UniqueName: \"kubernetes.io/projected/9a7e3709-d407-4679-add6-375a835421be-kube-api-access-6qwwd\") on node \"crc\" DevicePath \"\"" Mar 11 12:04:40 crc kubenswrapper[4816]: I0311 12:04:40.968882 4816 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9a7e3709-d407-4679-add6-375a835421be-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 11 12:04:40 crc kubenswrapper[4816]: I0311 12:04:40.968893 4816 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/9a7e3709-d407-4679-add6-375a835421be-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 11 12:04:40 crc kubenswrapper[4816]: I0311 12:04:40.968904 4816 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9a7e3709-d407-4679-add6-375a835421be-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 11 12:04:40 crc kubenswrapper[4816]: I0311 12:04:40.968915 4816 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9a7e3709-d407-4679-add6-375a835421be-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 11 12:04:40 crc kubenswrapper[4816]: I0311 12:04:40.968923 4816 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9a7e3709-d407-4679-add6-375a835421be-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 11 12:04:41 crc kubenswrapper[4816]: I0311 12:04:41.338920 4816 generic.go:334] "Generic (PLEG): container finished" podID="9a7e3709-d407-4679-add6-375a835421be" containerID="29a5575a4698992467da37317f3822ef73493ffefd321500908342f4c01a8fdf" exitCode=0 Mar 11 12:04:41 crc kubenswrapper[4816]: I0311 12:04:41.338964 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:04:41 crc kubenswrapper[4816]: I0311 12:04:41.338976 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-p426k" event={"ID":"9a7e3709-d407-4679-add6-375a835421be","Type":"ContainerDied","Data":"29a5575a4698992467da37317f3822ef73493ffefd321500908342f4c01a8fdf"} Mar 11 12:04:41 crc kubenswrapper[4816]: I0311 12:04:41.339027 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-p426k" event={"ID":"9a7e3709-d407-4679-add6-375a835421be","Type":"ContainerDied","Data":"ec23157cec86a7144fad1cf7ce6f1de12230714b1e857a2199a9972f099db0a1"} Mar 11 12:04:41 crc kubenswrapper[4816]: I0311 12:04:41.339049 4816 scope.go:117] "RemoveContainer" containerID="29a5575a4698992467da37317f3822ef73493ffefd321500908342f4c01a8fdf" Mar 11 12:04:41 crc kubenswrapper[4816]: I0311 12:04:41.355711 4816 scope.go:117] "RemoveContainer" containerID="29a5575a4698992467da37317f3822ef73493ffefd321500908342f4c01a8fdf" Mar 11 12:04:41 crc kubenswrapper[4816]: E0311 12:04:41.358096 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29a5575a4698992467da37317f3822ef73493ffefd321500908342f4c01a8fdf\": container with ID starting with 29a5575a4698992467da37317f3822ef73493ffefd321500908342f4c01a8fdf not found: ID does not exist" containerID="29a5575a4698992467da37317f3822ef73493ffefd321500908342f4c01a8fdf" Mar 11 12:04:41 crc kubenswrapper[4816]: I0311 12:04:41.358148 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29a5575a4698992467da37317f3822ef73493ffefd321500908342f4c01a8fdf"} err="failed to get container status \"29a5575a4698992467da37317f3822ef73493ffefd321500908342f4c01a8fdf\": rpc error: code = NotFound desc = could not find container 
\"29a5575a4698992467da37317f3822ef73493ffefd321500908342f4c01a8fdf\": container with ID starting with 29a5575a4698992467da37317f3822ef73493ffefd321500908342f4c01a8fdf not found: ID does not exist" Mar 11 12:04:41 crc kubenswrapper[4816]: I0311 12:04:41.364427 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-p426k"] Mar 11 12:04:41 crc kubenswrapper[4816]: I0311 12:04:41.367711 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-p426k"] Mar 11 12:04:42 crc kubenswrapper[4816]: I0311 12:04:42.137214 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a7e3709-d407-4679-add6-375a835421be" path="/var/lib/kubelet/pods/9a7e3709-d407-4679-add6-375a835421be/volumes" Mar 11 12:05:39 crc kubenswrapper[4816]: I0311 12:05:39.514838 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:05:39 crc kubenswrapper[4816]: I0311 12:05:39.515424 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:06:00 crc kubenswrapper[4816]: I0311 12:06:00.146540 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553846-v5vlq"] Mar 11 12:06:00 crc kubenswrapper[4816]: E0311 12:06:00.147430 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a7e3709-d407-4679-add6-375a835421be" containerName="registry" Mar 11 12:06:00 crc kubenswrapper[4816]: I0311 12:06:00.147454 4816 
state_mem.go:107] "Deleted CPUSet assignment" podUID="9a7e3709-d407-4679-add6-375a835421be" containerName="registry" Mar 11 12:06:00 crc kubenswrapper[4816]: I0311 12:06:00.147760 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a7e3709-d407-4679-add6-375a835421be" containerName="registry" Mar 11 12:06:00 crc kubenswrapper[4816]: I0311 12:06:00.148502 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553846-v5vlq" Mar 11 12:06:00 crc kubenswrapper[4816]: I0311 12:06:00.149912 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553846-v5vlq"] Mar 11 12:06:00 crc kubenswrapper[4816]: I0311 12:06:00.151623 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 12:06:00 crc kubenswrapper[4816]: I0311 12:06:00.151668 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 12:06:00 crc kubenswrapper[4816]: I0311 12:06:00.152014 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 12:06:00 crc kubenswrapper[4816]: I0311 12:06:00.214768 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj5vh\" (UniqueName: \"kubernetes.io/projected/8860b8d2-719a-4930-9df3-d0bc14d8de19-kube-api-access-dj5vh\") pod \"auto-csr-approver-29553846-v5vlq\" (UID: \"8860b8d2-719a-4930-9df3-d0bc14d8de19\") " pod="openshift-infra/auto-csr-approver-29553846-v5vlq" Mar 11 12:06:00 crc kubenswrapper[4816]: I0311 12:06:00.316170 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dj5vh\" (UniqueName: \"kubernetes.io/projected/8860b8d2-719a-4930-9df3-d0bc14d8de19-kube-api-access-dj5vh\") pod \"auto-csr-approver-29553846-v5vlq\" (UID: 
\"8860b8d2-719a-4930-9df3-d0bc14d8de19\") " pod="openshift-infra/auto-csr-approver-29553846-v5vlq" Mar 11 12:06:00 crc kubenswrapper[4816]: I0311 12:06:00.335817 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj5vh\" (UniqueName: \"kubernetes.io/projected/8860b8d2-719a-4930-9df3-d0bc14d8de19-kube-api-access-dj5vh\") pod \"auto-csr-approver-29553846-v5vlq\" (UID: \"8860b8d2-719a-4930-9df3-d0bc14d8de19\") " pod="openshift-infra/auto-csr-approver-29553846-v5vlq" Mar 11 12:06:00 crc kubenswrapper[4816]: I0311 12:06:00.468538 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553846-v5vlq" Mar 11 12:06:00 crc kubenswrapper[4816]: I0311 12:06:00.638937 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553846-v5vlq"] Mar 11 12:06:00 crc kubenswrapper[4816]: I0311 12:06:00.647108 4816 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 12:06:01 crc kubenswrapper[4816]: I0311 12:06:01.011111 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553846-v5vlq" event={"ID":"8860b8d2-719a-4930-9df3-d0bc14d8de19","Type":"ContainerStarted","Data":"09371881040e37ad815ab74ee53fb47977ad2f8c78f64b5e3d2d140a71ec6726"} Mar 11 12:06:02 crc kubenswrapper[4816]: I0311 12:06:02.017138 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553846-v5vlq" event={"ID":"8860b8d2-719a-4930-9df3-d0bc14d8de19","Type":"ContainerStarted","Data":"587884e89c7feed672cb66139bc979bafcbc72560d3687797e97ca922f238ebb"} Mar 11 12:06:02 crc kubenswrapper[4816]: I0311 12:06:02.035654 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553846-v5vlq" podStartSLOduration=1.070459283 podStartE2EDuration="2.035622694s" podCreationTimestamp="2026-03-11 12:06:00 +0000 
UTC" firstStartedPulling="2026-03-11 12:06:00.646888137 +0000 UTC m=+447.238152094" lastFinishedPulling="2026-03-11 12:06:01.612051508 +0000 UTC m=+448.203315505" observedRunningTime="2026-03-11 12:06:02.030331885 +0000 UTC m=+448.621595852" watchObservedRunningTime="2026-03-11 12:06:02.035622694 +0000 UTC m=+448.626886701" Mar 11 12:06:03 crc kubenswrapper[4816]: I0311 12:06:03.027765 4816 generic.go:334] "Generic (PLEG): container finished" podID="8860b8d2-719a-4930-9df3-d0bc14d8de19" containerID="587884e89c7feed672cb66139bc979bafcbc72560d3687797e97ca922f238ebb" exitCode=0 Mar 11 12:06:03 crc kubenswrapper[4816]: I0311 12:06:03.029089 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553846-v5vlq" event={"ID":"8860b8d2-719a-4930-9df3-d0bc14d8de19","Type":"ContainerDied","Data":"587884e89c7feed672cb66139bc979bafcbc72560d3687797e97ca922f238ebb"} Mar 11 12:06:04 crc kubenswrapper[4816]: I0311 12:06:04.292811 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553846-v5vlq" Mar 11 12:06:04 crc kubenswrapper[4816]: I0311 12:06:04.371690 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dj5vh\" (UniqueName: \"kubernetes.io/projected/8860b8d2-719a-4930-9df3-d0bc14d8de19-kube-api-access-dj5vh\") pod \"8860b8d2-719a-4930-9df3-d0bc14d8de19\" (UID: \"8860b8d2-719a-4930-9df3-d0bc14d8de19\") " Mar 11 12:06:04 crc kubenswrapper[4816]: I0311 12:06:04.383478 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8860b8d2-719a-4930-9df3-d0bc14d8de19-kube-api-access-dj5vh" (OuterVolumeSpecName: "kube-api-access-dj5vh") pod "8860b8d2-719a-4930-9df3-d0bc14d8de19" (UID: "8860b8d2-719a-4930-9df3-d0bc14d8de19"). InnerVolumeSpecName "kube-api-access-dj5vh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:06:04 crc kubenswrapper[4816]: I0311 12:06:04.474569 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dj5vh\" (UniqueName: \"kubernetes.io/projected/8860b8d2-719a-4930-9df3-d0bc14d8de19-kube-api-access-dj5vh\") on node \"crc\" DevicePath \"\"" Mar 11 12:06:05 crc kubenswrapper[4816]: I0311 12:06:05.048935 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553846-v5vlq" event={"ID":"8860b8d2-719a-4930-9df3-d0bc14d8de19","Type":"ContainerDied","Data":"09371881040e37ad815ab74ee53fb47977ad2f8c78f64b5e3d2d140a71ec6726"} Mar 11 12:06:05 crc kubenswrapper[4816]: I0311 12:06:05.048986 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09371881040e37ad815ab74ee53fb47977ad2f8c78f64b5e3d2d140a71ec6726" Mar 11 12:06:05 crc kubenswrapper[4816]: I0311 12:06:05.049045 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553846-v5vlq" Mar 11 12:06:09 crc kubenswrapper[4816]: I0311 12:06:09.515308 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:06:09 crc kubenswrapper[4816]: I0311 12:06:09.515647 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:06:39 crc kubenswrapper[4816]: I0311 12:06:39.515203 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:06:39 crc kubenswrapper[4816]: I0311 12:06:39.515993 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:06:39 crc kubenswrapper[4816]: I0311 12:06:39.516074 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:06:39 crc kubenswrapper[4816]: I0311 12:06:39.517100 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a1d4bcf565d8188182640fd6dfae19dbf3e118e4747e8a92039a28e6b5c3c95c"} pod="openshift-machine-config-operator/machine-config-daemon-b4v82" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 12:06:39 crc kubenswrapper[4816]: I0311 12:06:39.517235 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" containerID="cri-o://a1d4bcf565d8188182640fd6dfae19dbf3e118e4747e8a92039a28e6b5c3c95c" gracePeriod=600 Mar 11 12:06:39 crc kubenswrapper[4816]: E0311 12:06:39.633077 4816 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fdff21c_644f_4443_a268_f98c91ea120a.slice/crio-a1d4bcf565d8188182640fd6dfae19dbf3e118e4747e8a92039a28e6b5c3c95c.scope\": RecentStats: unable 
to find data in memory cache]" Mar 11 12:06:39 crc kubenswrapper[4816]: I0311 12:06:39.687807 4816 generic.go:334] "Generic (PLEG): container finished" podID="7fdff21c-644f-4443-a268-f98c91ea120a" containerID="a1d4bcf565d8188182640fd6dfae19dbf3e118e4747e8a92039a28e6b5c3c95c" exitCode=0 Mar 11 12:06:39 crc kubenswrapper[4816]: I0311 12:06:39.687872 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerDied","Data":"a1d4bcf565d8188182640fd6dfae19dbf3e118e4747e8a92039a28e6b5c3c95c"} Mar 11 12:06:39 crc kubenswrapper[4816]: I0311 12:06:39.688050 4816 scope.go:117] "RemoveContainer" containerID="fcc062c271cd12993a2f94302ad7910d23ab33f9e9c36dd18bc3d6cf66582bc2" Mar 11 12:06:40 crc kubenswrapper[4816]: I0311 12:06:40.694155 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerStarted","Data":"233fffa5de6ee1e762a8824b32dec71fe3b7403332cc2d914d3770d768c1fbca"} Mar 11 12:08:00 crc kubenswrapper[4816]: I0311 12:08:00.137580 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553848-blbgg"] Mar 11 12:08:00 crc kubenswrapper[4816]: E0311 12:08:00.138366 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8860b8d2-719a-4930-9df3-d0bc14d8de19" containerName="oc" Mar 11 12:08:00 crc kubenswrapper[4816]: I0311 12:08:00.138381 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="8860b8d2-719a-4930-9df3-d0bc14d8de19" containerName="oc" Mar 11 12:08:00 crc kubenswrapper[4816]: I0311 12:08:00.138517 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="8860b8d2-719a-4930-9df3-d0bc14d8de19" containerName="oc" Mar 11 12:08:00 crc kubenswrapper[4816]: I0311 12:08:00.138953 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553848-blbgg" Mar 11 12:08:00 crc kubenswrapper[4816]: I0311 12:08:00.140969 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 12:08:00 crc kubenswrapper[4816]: I0311 12:08:00.141943 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 12:08:00 crc kubenswrapper[4816]: I0311 12:08:00.142300 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 12:08:00 crc kubenswrapper[4816]: I0311 12:08:00.142868 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553848-blbgg"] Mar 11 12:08:00 crc kubenswrapper[4816]: I0311 12:08:00.247787 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5njsx\" (UniqueName: \"kubernetes.io/projected/cf7a354c-a3ec-44fe-8e27-028abd12d7d9-kube-api-access-5njsx\") pod \"auto-csr-approver-29553848-blbgg\" (UID: \"cf7a354c-a3ec-44fe-8e27-028abd12d7d9\") " pod="openshift-infra/auto-csr-approver-29553848-blbgg" Mar 11 12:08:00 crc kubenswrapper[4816]: I0311 12:08:00.349402 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5njsx\" (UniqueName: \"kubernetes.io/projected/cf7a354c-a3ec-44fe-8e27-028abd12d7d9-kube-api-access-5njsx\") pod \"auto-csr-approver-29553848-blbgg\" (UID: \"cf7a354c-a3ec-44fe-8e27-028abd12d7d9\") " pod="openshift-infra/auto-csr-approver-29553848-blbgg" Mar 11 12:08:00 crc kubenswrapper[4816]: I0311 12:08:00.367041 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5njsx\" (UniqueName: \"kubernetes.io/projected/cf7a354c-a3ec-44fe-8e27-028abd12d7d9-kube-api-access-5njsx\") pod \"auto-csr-approver-29553848-blbgg\" (UID: \"cf7a354c-a3ec-44fe-8e27-028abd12d7d9\") " 
pod="openshift-infra/auto-csr-approver-29553848-blbgg" Mar 11 12:08:00 crc kubenswrapper[4816]: I0311 12:08:00.457478 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553848-blbgg" Mar 11 12:08:00 crc kubenswrapper[4816]: I0311 12:08:00.627791 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553848-blbgg"] Mar 11 12:08:01 crc kubenswrapper[4816]: I0311 12:08:01.189175 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553848-blbgg" event={"ID":"cf7a354c-a3ec-44fe-8e27-028abd12d7d9","Type":"ContainerStarted","Data":"464cd99aebb8d1992d92bdb9f36912fa5157a5dd9a45577e3df7d1d25b868228"} Mar 11 12:08:02 crc kubenswrapper[4816]: I0311 12:08:02.210012 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553848-blbgg" event={"ID":"cf7a354c-a3ec-44fe-8e27-028abd12d7d9","Type":"ContainerStarted","Data":"ce753e08f0c29759ce4abeca1c2ba4ffc8217be9eee018a375b073d4682d5231"} Mar 11 12:08:02 crc kubenswrapper[4816]: I0311 12:08:02.227056 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553848-blbgg" podStartSLOduration=0.966789945 podStartE2EDuration="2.227037369s" podCreationTimestamp="2026-03-11 12:08:00 +0000 UTC" firstStartedPulling="2026-03-11 12:08:00.63686043 +0000 UTC m=+567.228124397" lastFinishedPulling="2026-03-11 12:08:01.897107854 +0000 UTC m=+568.488371821" observedRunningTime="2026-03-11 12:08:02.223103277 +0000 UTC m=+568.814367244" watchObservedRunningTime="2026-03-11 12:08:02.227037369 +0000 UTC m=+568.818301336" Mar 11 12:08:03 crc kubenswrapper[4816]: I0311 12:08:03.216426 4816 generic.go:334] "Generic (PLEG): container finished" podID="cf7a354c-a3ec-44fe-8e27-028abd12d7d9" containerID="ce753e08f0c29759ce4abeca1c2ba4ffc8217be9eee018a375b073d4682d5231" exitCode=0 Mar 11 12:08:03 crc 
kubenswrapper[4816]: I0311 12:08:03.216477 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553848-blbgg" event={"ID":"cf7a354c-a3ec-44fe-8e27-028abd12d7d9","Type":"ContainerDied","Data":"ce753e08f0c29759ce4abeca1c2ba4ffc8217be9eee018a375b073d4682d5231"} Mar 11 12:08:04 crc kubenswrapper[4816]: I0311 12:08:04.516607 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553848-blbgg" Mar 11 12:08:04 crc kubenswrapper[4816]: I0311 12:08:04.708135 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5njsx\" (UniqueName: \"kubernetes.io/projected/cf7a354c-a3ec-44fe-8e27-028abd12d7d9-kube-api-access-5njsx\") pod \"cf7a354c-a3ec-44fe-8e27-028abd12d7d9\" (UID: \"cf7a354c-a3ec-44fe-8e27-028abd12d7d9\") " Mar 11 12:08:04 crc kubenswrapper[4816]: I0311 12:08:04.717799 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf7a354c-a3ec-44fe-8e27-028abd12d7d9-kube-api-access-5njsx" (OuterVolumeSpecName: "kube-api-access-5njsx") pod "cf7a354c-a3ec-44fe-8e27-028abd12d7d9" (UID: "cf7a354c-a3ec-44fe-8e27-028abd12d7d9"). InnerVolumeSpecName "kube-api-access-5njsx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:08:04 crc kubenswrapper[4816]: I0311 12:08:04.809815 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5njsx\" (UniqueName: \"kubernetes.io/projected/cf7a354c-a3ec-44fe-8e27-028abd12d7d9-kube-api-access-5njsx\") on node \"crc\" DevicePath \"\"" Mar 11 12:08:05 crc kubenswrapper[4816]: I0311 12:08:05.229053 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553848-blbgg" event={"ID":"cf7a354c-a3ec-44fe-8e27-028abd12d7d9","Type":"ContainerDied","Data":"464cd99aebb8d1992d92bdb9f36912fa5157a5dd9a45577e3df7d1d25b868228"} Mar 11 12:08:05 crc kubenswrapper[4816]: I0311 12:08:05.229087 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="464cd99aebb8d1992d92bdb9f36912fa5157a5dd9a45577e3df7d1d25b868228" Mar 11 12:08:05 crc kubenswrapper[4816]: I0311 12:08:05.229091 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553848-blbgg" Mar 11 12:08:05 crc kubenswrapper[4816]: I0311 12:08:05.291732 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553842-6xkxh"] Mar 11 12:08:05 crc kubenswrapper[4816]: I0311 12:08:05.296421 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553842-6xkxh"] Mar 11 12:08:06 crc kubenswrapper[4816]: I0311 12:08:06.137284 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba5c6602-69d6-46be-a23b-fb4d6290a974" path="/var/lib/kubelet/pods/ba5c6602-69d6-46be-a23b-fb4d6290a974/volumes" Mar 11 12:08:39 crc kubenswrapper[4816]: I0311 12:08:39.516027 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 11 12:08:39 crc kubenswrapper[4816]: I0311 12:08:39.518595 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:09:09 crc kubenswrapper[4816]: I0311 12:09:09.515239 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:09:09 crc kubenswrapper[4816]: I0311 12:09:09.516514 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:09:34 crc kubenswrapper[4816]: I0311 12:09:34.426515 4816 scope.go:117] "RemoveContainer" containerID="2cfac82b0530dfec9409f269e3ee40d7a556b84403bec8f94f82329b0208a810" Mar 11 12:09:39 crc kubenswrapper[4816]: I0311 12:09:39.515445 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:09:39 crc kubenswrapper[4816]: I0311 12:09:39.515908 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:09:39 crc kubenswrapper[4816]: I0311 12:09:39.515976 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:09:39 crc kubenswrapper[4816]: I0311 12:09:39.517884 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"233fffa5de6ee1e762a8824b32dec71fe3b7403332cc2d914d3770d768c1fbca"} pod="openshift-machine-config-operator/machine-config-daemon-b4v82" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 12:09:39 crc kubenswrapper[4816]: I0311 12:09:39.517999 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" containerID="cri-o://233fffa5de6ee1e762a8824b32dec71fe3b7403332cc2d914d3770d768c1fbca" gracePeriod=600 Mar 11 12:09:39 crc kubenswrapper[4816]: I0311 12:09:39.821621 4816 generic.go:334] "Generic (PLEG): container finished" podID="7fdff21c-644f-4443-a268-f98c91ea120a" containerID="233fffa5de6ee1e762a8824b32dec71fe3b7403332cc2d914d3770d768c1fbca" exitCode=0 Mar 11 12:09:39 crc kubenswrapper[4816]: I0311 12:09:39.821707 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerDied","Data":"233fffa5de6ee1e762a8824b32dec71fe3b7403332cc2d914d3770d768c1fbca"} Mar 11 12:09:39 crc kubenswrapper[4816]: I0311 12:09:39.822016 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" 
event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerStarted","Data":"45ccbed932001dc629a77de7e08e04a9cce25a78ac1e00aed407f7f4e1fa93a3"} Mar 11 12:09:39 crc kubenswrapper[4816]: I0311 12:09:39.822042 4816 scope.go:117] "RemoveContainer" containerID="a1d4bcf565d8188182640fd6dfae19dbf3e118e4747e8a92039a28e6b5c3c95c" Mar 11 12:10:00 crc kubenswrapper[4816]: I0311 12:10:00.148003 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553850-v7tlf"] Mar 11 12:10:00 crc kubenswrapper[4816]: E0311 12:10:00.148821 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf7a354c-a3ec-44fe-8e27-028abd12d7d9" containerName="oc" Mar 11 12:10:00 crc kubenswrapper[4816]: I0311 12:10:00.148838 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf7a354c-a3ec-44fe-8e27-028abd12d7d9" containerName="oc" Mar 11 12:10:00 crc kubenswrapper[4816]: I0311 12:10:00.149002 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf7a354c-a3ec-44fe-8e27-028abd12d7d9" containerName="oc" Mar 11 12:10:00 crc kubenswrapper[4816]: I0311 12:10:00.149426 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553850-v7tlf" Mar 11 12:10:00 crc kubenswrapper[4816]: I0311 12:10:00.152023 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 12:10:00 crc kubenswrapper[4816]: I0311 12:10:00.152175 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 12:10:00 crc kubenswrapper[4816]: I0311 12:10:00.153987 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 12:10:00 crc kubenswrapper[4816]: I0311 12:10:00.159942 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553850-v7tlf"] Mar 11 12:10:00 crc kubenswrapper[4816]: I0311 12:10:00.310862 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-846l5\" (UniqueName: \"kubernetes.io/projected/5ac620ec-72d5-4603-852f-8ba3f1ad0e9b-kube-api-access-846l5\") pod \"auto-csr-approver-29553850-v7tlf\" (UID: \"5ac620ec-72d5-4603-852f-8ba3f1ad0e9b\") " pod="openshift-infra/auto-csr-approver-29553850-v7tlf" Mar 11 12:10:00 crc kubenswrapper[4816]: I0311 12:10:00.412550 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-846l5\" (UniqueName: \"kubernetes.io/projected/5ac620ec-72d5-4603-852f-8ba3f1ad0e9b-kube-api-access-846l5\") pod \"auto-csr-approver-29553850-v7tlf\" (UID: \"5ac620ec-72d5-4603-852f-8ba3f1ad0e9b\") " pod="openshift-infra/auto-csr-approver-29553850-v7tlf" Mar 11 12:10:00 crc kubenswrapper[4816]: I0311 12:10:00.440642 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-846l5\" (UniqueName: \"kubernetes.io/projected/5ac620ec-72d5-4603-852f-8ba3f1ad0e9b-kube-api-access-846l5\") pod \"auto-csr-approver-29553850-v7tlf\" (UID: \"5ac620ec-72d5-4603-852f-8ba3f1ad0e9b\") " 
pod="openshift-infra/auto-csr-approver-29553850-v7tlf" Mar 11 12:10:00 crc kubenswrapper[4816]: I0311 12:10:00.474004 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553850-v7tlf" Mar 11 12:10:00 crc kubenswrapper[4816]: I0311 12:10:00.682718 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553850-v7tlf"] Mar 11 12:10:00 crc kubenswrapper[4816]: W0311 12:10:00.687912 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ac620ec_72d5_4603_852f_8ba3f1ad0e9b.slice/crio-ca70e525a30b28a721f12c5d635dc0f8922cf4f327d7e56a5a1758282eb289bd WatchSource:0}: Error finding container ca70e525a30b28a721f12c5d635dc0f8922cf4f327d7e56a5a1758282eb289bd: Status 404 returned error can't find the container with id ca70e525a30b28a721f12c5d635dc0f8922cf4f327d7e56a5a1758282eb289bd Mar 11 12:10:00 crc kubenswrapper[4816]: I0311 12:10:00.953520 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553850-v7tlf" event={"ID":"5ac620ec-72d5-4603-852f-8ba3f1ad0e9b","Type":"ContainerStarted","Data":"ca70e525a30b28a721f12c5d635dc0f8922cf4f327d7e56a5a1758282eb289bd"} Mar 11 12:10:02 crc kubenswrapper[4816]: E0311 12:10:02.306507 4816 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ac620ec_72d5_4603_852f_8ba3f1ad0e9b.slice/crio-conmon-c3ad155fc5f3f7204d5fb77b61c79c6603bc6f42436d74dbc3171b2dbf21bbd2.scope\": RecentStats: unable to find data in memory cache]" Mar 11 12:10:02 crc kubenswrapper[4816]: I0311 12:10:02.967959 4816 generic.go:334] "Generic (PLEG): container finished" podID="5ac620ec-72d5-4603-852f-8ba3f1ad0e9b" containerID="c3ad155fc5f3f7204d5fb77b61c79c6603bc6f42436d74dbc3171b2dbf21bbd2" exitCode=0 Mar 11 12:10:02 crc 
kubenswrapper[4816]: I0311 12:10:02.968018 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553850-v7tlf" event={"ID":"5ac620ec-72d5-4603-852f-8ba3f1ad0e9b","Type":"ContainerDied","Data":"c3ad155fc5f3f7204d5fb77b61c79c6603bc6f42436d74dbc3171b2dbf21bbd2"} Mar 11 12:10:04 crc kubenswrapper[4816]: I0311 12:10:04.263921 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553850-v7tlf" Mar 11 12:10:04 crc kubenswrapper[4816]: I0311 12:10:04.364207 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-846l5\" (UniqueName: \"kubernetes.io/projected/5ac620ec-72d5-4603-852f-8ba3f1ad0e9b-kube-api-access-846l5\") pod \"5ac620ec-72d5-4603-852f-8ba3f1ad0e9b\" (UID: \"5ac620ec-72d5-4603-852f-8ba3f1ad0e9b\") " Mar 11 12:10:04 crc kubenswrapper[4816]: I0311 12:10:04.375576 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ac620ec-72d5-4603-852f-8ba3f1ad0e9b-kube-api-access-846l5" (OuterVolumeSpecName: "kube-api-access-846l5") pod "5ac620ec-72d5-4603-852f-8ba3f1ad0e9b" (UID: "5ac620ec-72d5-4603-852f-8ba3f1ad0e9b"). InnerVolumeSpecName "kube-api-access-846l5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:10:04 crc kubenswrapper[4816]: I0311 12:10:04.466270 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-846l5\" (UniqueName: \"kubernetes.io/projected/5ac620ec-72d5-4603-852f-8ba3f1ad0e9b-kube-api-access-846l5\") on node \"crc\" DevicePath \"\"" Mar 11 12:10:04 crc kubenswrapper[4816]: I0311 12:10:04.981627 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553850-v7tlf" event={"ID":"5ac620ec-72d5-4603-852f-8ba3f1ad0e9b","Type":"ContainerDied","Data":"ca70e525a30b28a721f12c5d635dc0f8922cf4f327d7e56a5a1758282eb289bd"} Mar 11 12:10:04 crc kubenswrapper[4816]: I0311 12:10:04.981668 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553850-v7tlf" Mar 11 12:10:04 crc kubenswrapper[4816]: I0311 12:10:04.981672 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca70e525a30b28a721f12c5d635dc0f8922cf4f327d7e56a5a1758282eb289bd" Mar 11 12:10:05 crc kubenswrapper[4816]: I0311 12:10:05.333652 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553844-df99q"] Mar 11 12:10:05 crc kubenswrapper[4816]: I0311 12:10:05.337548 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553844-df99q"] Mar 11 12:10:06 crc kubenswrapper[4816]: I0311 12:10:06.141412 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16125795-8697-470d-bc37-1ab8f6e31af1" path="/var/lib/kubelet/pods/16125795-8697-470d-bc37-1ab8f6e31af1/volumes" Mar 11 12:10:34 crc kubenswrapper[4816]: I0311 12:10:34.495516 4816 scope.go:117] "RemoveContainer" containerID="3d89e5845eb14d7e6c90a432b751164398e20a4fe55d6026ce8f8ec622962660" Mar 11 12:10:37 crc kubenswrapper[4816]: I0311 12:10:37.283835 4816 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" 
name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.228481 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-dkh2h"] Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.234378 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7" gracePeriod=30 Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.234404 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="nbdb" containerID="cri-o://45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63" gracePeriod=30 Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.234554 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="northd" containerID="cri-o://62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46" gracePeriod=30 Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.234785 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="kube-rbac-proxy-node" containerID="cri-o://9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2" gracePeriod=30 Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.234853 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="ovn-acl-logging" 
containerID="cri-o://bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47" gracePeriod=30 Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.234812 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="sbdb" containerID="cri-o://a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586" gracePeriod=30 Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.234312 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="ovn-controller" containerID="cri-o://ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b" gracePeriod=30 Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.277549 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="ovnkube-controller" containerID="cri-o://6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e" gracePeriod=30 Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.576992 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dkh2h_8fbe3bb6-8bf9-40b5-8f4f-0d136e285528/ovn-acl-logging/0.log" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.578130 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dkh2h_8fbe3bb6-8bf9-40b5-8f4f-0d136e285528/ovn-controller/0.log" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.578792 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658442 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-systemd-units\") pod \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658487 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-slash\") pod \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658504 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-run-openvswitch\") pod \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658519 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-cni-netd\") pod \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658538 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-run-netns\") pod \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658550 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-etc-openvswitch\") pod \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658568 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-run-ovn-kubernetes\") pod \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658587 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-node-log\") pod \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658605 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" (UID: "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658630 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dj5rk\" (UniqueName: \"kubernetes.io/projected/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-kube-api-access-dj5rk\") pod \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658602 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" (UID: "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658638 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" (UID: "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658663 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-var-lib-cni-networks-ovn-kubernetes\") pod \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658668 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-slash" (OuterVolumeSpecName: "host-slash") pod "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" (UID: "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658686 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-ovnkube-script-lib\") pod \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658729 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-run-ovn\") pod \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658750 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-log-socket\") pod \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 
12:10:48.658775 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-kubelet\") pod \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658800 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-ovn-node-metrics-cert\") pod \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658815 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-cni-bin\") pod \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658835 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-var-lib-openvswitch\") pod \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658687 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-node-log" (OuterVolumeSpecName: "node-log") pod "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" (UID: "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658883 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-log-socket" (OuterVolumeSpecName: "log-socket") pod "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" (UID: "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658910 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" (UID: "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658851 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-env-overrides\") pod \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.659033 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-ovnkube-config\") pod \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.659077 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-run-systemd\") pod \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " Mar 11 12:10:48 crc 
kubenswrapper[4816]: I0311 12:10:48.659132 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" (UID: "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.659160 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" (UID: "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658703 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" (UID: "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658735 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" (UID: "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.659175 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" (UID: "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658755 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" (UID: "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658808 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" (UID: "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658812 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" (UID: "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658859 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" (UID: "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.659415 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" (UID: "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.659843 4816 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.659883 4816 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.659898 4816 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.659913 4816 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-ovnkube-config\") on node \"crc\" 
DevicePath \"\"" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.659935 4816 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.659958 4816 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-slash\") on node \"crc\" DevicePath \"\"" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.659976 4816 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.659995 4816 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.660011 4816 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.660032 4816 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.660051 4816 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.660068 4816 reconciler_common.go:293] 
"Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-node-log\") on node \"crc\" DevicePath \"\"" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.660087 4816 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.660104 4816 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.660119 4816 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.660131 4816 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-log-socket\") on node \"crc\" DevicePath \"\"" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.660143 4816 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.668002 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-kube-api-access-dj5rk" (OuterVolumeSpecName: "kube-api-access-dj5rk") pod "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" (UID: "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528"). InnerVolumeSpecName "kube-api-access-dj5rk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.671443 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" (UID: "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.685415 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hhq62"] Mar 11 12:10:48 crc kubenswrapper[4816]: E0311 12:10:48.685649 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="nbdb" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.685664 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="nbdb" Mar 11 12:10:48 crc kubenswrapper[4816]: E0311 12:10:48.685674 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="ovnkube-controller" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.685680 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="ovnkube-controller" Mar 11 12:10:48 crc kubenswrapper[4816]: E0311 12:10:48.685688 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="kubecfg-setup" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.685695 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="kubecfg-setup" Mar 11 12:10:48 crc kubenswrapper[4816]: E0311 12:10:48.685706 4816 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="kube-rbac-proxy-ovn-metrics" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.685712 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="kube-rbac-proxy-ovn-metrics" Mar 11 12:10:48 crc kubenswrapper[4816]: E0311 12:10:48.685723 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ac620ec-72d5-4603-852f-8ba3f1ad0e9b" containerName="oc" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.685731 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ac620ec-72d5-4603-852f-8ba3f1ad0e9b" containerName="oc" Mar 11 12:10:48 crc kubenswrapper[4816]: E0311 12:10:48.685738 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="ovn-controller" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.685744 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="ovn-controller" Mar 11 12:10:48 crc kubenswrapper[4816]: E0311 12:10:48.685751 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="sbdb" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.685757 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="sbdb" Mar 11 12:10:48 crc kubenswrapper[4816]: E0311 12:10:48.685765 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="northd" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.685770 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="northd" Mar 11 12:10:48 crc kubenswrapper[4816]: E0311 12:10:48.685777 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" 
containerName="ovn-acl-logging" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.685783 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="ovn-acl-logging" Mar 11 12:10:48 crc kubenswrapper[4816]: E0311 12:10:48.685791 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="kube-rbac-proxy-node" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.685796 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="kube-rbac-proxy-node" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.685886 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="kube-rbac-proxy-ovn-metrics" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.685896 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="northd" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.685903 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="sbdb" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.685912 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="ovn-controller" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.685918 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="ovnkube-controller" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.685924 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ac620ec-72d5-4603-852f-8ba3f1ad0e9b" containerName="oc" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.685932 4816 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="ovn-acl-logging" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.685939 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="kube-rbac-proxy-node" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.685946 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="nbdb" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.687639 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.689778 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" (UID: "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.761202 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-etc-openvswitch\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.761259 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/37b06e28-edcf-42e0-b392-7a1bc070f06d-ovn-node-metrics-cert\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.761281 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-log-socket\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.761302 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/37b06e28-edcf-42e0-b392-7a1bc070f06d-ovnkube-script-lib\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.761412 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/37b06e28-edcf-42e0-b392-7a1bc070f06d-ovnkube-config\") pod \"ovnkube-node-hhq62\" (UID: 
\"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.761443 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-var-lib-openvswitch\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.761572 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-run-ovn\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.761654 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-host-kubelet\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.761691 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-host-run-netns\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.761706 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-run-systemd\") pod \"ovnkube-node-hhq62\" 
(UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.761737 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xlxm\" (UniqueName: \"kubernetes.io/projected/37b06e28-edcf-42e0-b392-7a1bc070f06d-kube-api-access-5xlxm\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.761798 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-host-slash\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.761843 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-run-openvswitch\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.761867 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-host-cni-bin\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.761889 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.761997 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-node-log\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.762027 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-host-run-ovn-kubernetes\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.762142 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-host-cni-netd\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.762186 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/37b06e28-edcf-42e0-b392-7a1bc070f06d-env-overrides\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.762212 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-systemd-units\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.762329 4816 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.762359 4816 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.762373 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dj5rk\" (UniqueName: \"kubernetes.io/projected/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-kube-api-access-dj5rk\") on node \"crc\" DevicePath \"\"" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.863816 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-run-ovn\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.863884 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-host-kubelet\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.863912 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-host-run-netns\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.863925 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-run-systemd\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.863932 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-run-ovn\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.863999 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-host-run-netns\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.864006 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-host-kubelet\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.863946 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xlxm\" (UniqueName: \"kubernetes.io/projected/37b06e28-edcf-42e0-b392-7a1bc070f06d-kube-api-access-5xlxm\") pod \"ovnkube-node-hhq62\" (UID: 
\"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.864074 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-host-slash\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.864093 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-run-systemd\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.864141 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-run-openvswitch\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.864121 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-run-openvswitch\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.864173 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-host-slash\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc 
kubenswrapper[4816]: I0311 12:10:48.864183 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-host-cni-bin\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.864204 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.864235 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-node-log\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.864272 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-host-run-ovn-kubernetes\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.864295 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" 
Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.864310 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-node-log\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.864326 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-host-run-ovn-kubernetes\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.864345 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-host-cni-netd\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.864366 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/37b06e28-edcf-42e0-b392-7a1bc070f06d-env-overrides\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.864388 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-systemd-units\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.864418 4816 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-host-cni-netd\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.864407 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-host-cni-bin\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.864489 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-etc-openvswitch\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.864471 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-etc-openvswitch\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.864450 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-systemd-units\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.864684 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/37b06e28-edcf-42e0-b392-7a1bc070f06d-ovn-node-metrics-cert\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.864806 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-log-socket\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.864897 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/37b06e28-edcf-42e0-b392-7a1bc070f06d-ovnkube-script-lib\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.864935 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/37b06e28-edcf-42e0-b392-7a1bc070f06d-env-overrides\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.864948 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-log-socket\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.864955 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/37b06e28-edcf-42e0-b392-7a1bc070f06d-ovnkube-config\") pod 
\"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.865076 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-var-lib-openvswitch\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.865240 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-var-lib-openvswitch\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.865410 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/37b06e28-edcf-42e0-b392-7a1bc070f06d-ovnkube-script-lib\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.867004 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/37b06e28-edcf-42e0-b392-7a1bc070f06d-ovnkube-config\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.867622 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/37b06e28-edcf-42e0-b392-7a1bc070f06d-ovn-node-metrics-cert\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.881624 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xlxm\" (UniqueName: \"kubernetes.io/projected/37b06e28-edcf-42e0-b392-7a1bc070f06d-kube-api-access-5xlxm\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.011195 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.267048 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dkh2h_8fbe3bb6-8bf9-40b5-8f4f-0d136e285528/ovn-acl-logging/0.log" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268000 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dkh2h_8fbe3bb6-8bf9-40b5-8f4f-0d136e285528/ovn-controller/0.log" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268393 4816 generic.go:334] "Generic (PLEG): container finished" podID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerID="6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e" exitCode=0 Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268430 4816 generic.go:334] "Generic (PLEG): container finished" podID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerID="a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586" exitCode=0 Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268438 4816 generic.go:334] "Generic (PLEG): container finished" podID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerID="45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63" exitCode=0 Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268446 4816 generic.go:334] "Generic (PLEG): container finished" 
podID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerID="62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46" exitCode=0 Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268452 4816 generic.go:334] "Generic (PLEG): container finished" podID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerID="bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7" exitCode=0 Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268459 4816 generic.go:334] "Generic (PLEG): container finished" podID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerID="9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2" exitCode=0 Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268466 4816 generic.go:334] "Generic (PLEG): container finished" podID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerID="bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47" exitCode=143 Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268472 4816 generic.go:334] "Generic (PLEG): container finished" podID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerID="ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b" exitCode=143 Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268521 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" event={"ID":"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528","Type":"ContainerDied","Data":"6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268548 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" event={"ID":"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528","Type":"ContainerDied","Data":"a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268558 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" 
event={"ID":"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528","Type":"ContainerDied","Data":"45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268566 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" event={"ID":"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528","Type":"ContainerDied","Data":"62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268577 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" event={"ID":"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528","Type":"ContainerDied","Data":"bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268586 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" event={"ID":"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528","Type":"ContainerDied","Data":"9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268596 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268605 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268610 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268617 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" event={"ID":"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528","Type":"ContainerDied","Data":"bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268623 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268629 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268635 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268640 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268644 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268649 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268654 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 
12:10:49.268659 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268664 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268670 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" event={"ID":"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528","Type":"ContainerDied","Data":"ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268679 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268686 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268692 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268698 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268705 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268712 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268718 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268724 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268730 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268737 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" event={"ID":"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528","Type":"ContainerDied","Data":"062d70cdf9dcd40a3c2ebd1f383f192eaa42464d705740bb35123cc3c8899d9b"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268745 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268752 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268757 4816 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268763 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268768 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268773 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268778 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268783 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268788 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268801 4816 scope.go:117] "RemoveContainer" containerID="6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268952 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.283361 4816 generic.go:334] "Generic (PLEG): container finished" podID="37b06e28-edcf-42e0-b392-7a1bc070f06d" containerID="d3c3478146da9a34b25c91a65adac491a71029a1a952a7c80271260c570ded3a" exitCode=0 Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.283432 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" event={"ID":"37b06e28-edcf-42e0-b392-7a1bc070f06d","Type":"ContainerDied","Data":"d3c3478146da9a34b25c91a65adac491a71029a1a952a7c80271260c570ded3a"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.283463 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" event={"ID":"37b06e28-edcf-42e0-b392-7a1bc070f06d","Type":"ContainerStarted","Data":"29e047aa3cc913eb06b09bb0c8d06dd7acabc7b022d4b8b55f808f2caefbb5c4"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.285335 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mdbt5_a30d3e88-e081-4303-a202-1b7505629539/kube-multus/0.log" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.285406 4816 generic.go:334] "Generic (PLEG): container finished" podID="a30d3e88-e081-4303-a202-1b7505629539" containerID="cd735f189f4c35270aee2182d3a2eecb596607b294cc97d17559e7b5727e8dba" exitCode=2 Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.285464 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mdbt5" event={"ID":"a30d3e88-e081-4303-a202-1b7505629539","Type":"ContainerDied","Data":"cd735f189f4c35270aee2182d3a2eecb596607b294cc97d17559e7b5727e8dba"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.286310 4816 scope.go:117] "RemoveContainer" containerID="cd735f189f4c35270aee2182d3a2eecb596607b294cc97d17559e7b5727e8dba" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.300983 4816 
scope.go:117] "RemoveContainer" containerID="a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.305816 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-dkh2h"] Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.310418 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-dkh2h"] Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.334425 4816 scope.go:117] "RemoveContainer" containerID="45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.376142 4816 scope.go:117] "RemoveContainer" containerID="62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.401608 4816 scope.go:117] "RemoveContainer" containerID="bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.417024 4816 scope.go:117] "RemoveContainer" containerID="9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.435043 4816 scope.go:117] "RemoveContainer" containerID="bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.455719 4816 scope.go:117] "RemoveContainer" containerID="ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.473768 4816 scope.go:117] "RemoveContainer" containerID="8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.508478 4816 scope.go:117] "RemoveContainer" containerID="6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e" Mar 11 12:10:49 crc kubenswrapper[4816]: E0311 12:10:49.509598 4816 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e\": container with ID starting with 6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e not found: ID does not exist" containerID="6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.509649 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e"} err="failed to get container status \"6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e\": rpc error: code = NotFound desc = could not find container \"6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e\": container with ID starting with 6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.509681 4816 scope.go:117] "RemoveContainer" containerID="a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586" Mar 11 12:10:49 crc kubenswrapper[4816]: E0311 12:10:49.510001 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586\": container with ID starting with a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586 not found: ID does not exist" containerID="a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.510030 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586"} err="failed to get container status \"a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586\": rpc error: code = NotFound desc = could not find container 
\"a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586\": container with ID starting with a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586 not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.510043 4816 scope.go:117] "RemoveContainer" containerID="45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63" Mar 11 12:10:49 crc kubenswrapper[4816]: E0311 12:10:49.510537 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63\": container with ID starting with 45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63 not found: ID does not exist" containerID="45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.510554 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63"} err="failed to get container status \"45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63\": rpc error: code = NotFound desc = could not find container \"45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63\": container with ID starting with 45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63 not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.510568 4816 scope.go:117] "RemoveContainer" containerID="62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46" Mar 11 12:10:49 crc kubenswrapper[4816]: E0311 12:10:49.511362 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46\": container with ID starting with 62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46 not found: ID does not exist" 
containerID="62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.511397 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46"} err="failed to get container status \"62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46\": rpc error: code = NotFound desc = could not find container \"62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46\": container with ID starting with 62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46 not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.511421 4816 scope.go:117] "RemoveContainer" containerID="bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7" Mar 11 12:10:49 crc kubenswrapper[4816]: E0311 12:10:49.512380 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7\": container with ID starting with bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7 not found: ID does not exist" containerID="bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.512412 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7"} err="failed to get container status \"bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7\": rpc error: code = NotFound desc = could not find container \"bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7\": container with ID starting with bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7 not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.512430 4816 scope.go:117] 
"RemoveContainer" containerID="9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2" Mar 11 12:10:49 crc kubenswrapper[4816]: E0311 12:10:49.512724 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2\": container with ID starting with 9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2 not found: ID does not exist" containerID="9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.512779 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2"} err="failed to get container status \"9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2\": rpc error: code = NotFound desc = could not find container \"9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2\": container with ID starting with 9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2 not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.512813 4816 scope.go:117] "RemoveContainer" containerID="bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47" Mar 11 12:10:49 crc kubenswrapper[4816]: E0311 12:10:49.513109 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47\": container with ID starting with bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47 not found: ID does not exist" containerID="bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.513133 4816 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47"} err="failed to get container status \"bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47\": rpc error: code = NotFound desc = could not find container \"bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47\": container with ID starting with bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47 not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.513151 4816 scope.go:117] "RemoveContainer" containerID="ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b" Mar 11 12:10:49 crc kubenswrapper[4816]: E0311 12:10:49.513360 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b\": container with ID starting with ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b not found: ID does not exist" containerID="ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.513381 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b"} err="failed to get container status \"ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b\": rpc error: code = NotFound desc = could not find container \"ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b\": container with ID starting with ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.513397 4816 scope.go:117] "RemoveContainer" containerID="8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5" Mar 11 12:10:49 crc kubenswrapper[4816]: E0311 12:10:49.526907 4816 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5\": container with ID starting with 8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5 not found: ID does not exist" containerID="8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.526979 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5"} err="failed to get container status \"8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5\": rpc error: code = NotFound desc = could not find container \"8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5\": container with ID starting with 8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5 not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.527016 4816 scope.go:117] "RemoveContainer" containerID="6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.529661 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e"} err="failed to get container status \"6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e\": rpc error: code = NotFound desc = could not find container \"6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e\": container with ID starting with 6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.529718 4816 scope.go:117] "RemoveContainer" containerID="a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.530290 4816 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586"} err="failed to get container status \"a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586\": rpc error: code = NotFound desc = could not find container \"a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586\": container with ID starting with a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586 not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.530314 4816 scope.go:117] "RemoveContainer" containerID="45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.530704 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63"} err="failed to get container status \"45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63\": rpc error: code = NotFound desc = could not find container \"45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63\": container with ID starting with 45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63 not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.530721 4816 scope.go:117] "RemoveContainer" containerID="62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.530995 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46"} err="failed to get container status \"62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46\": rpc error: code = NotFound desc = could not find container \"62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46\": container with ID starting with 
62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46 not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.531013 4816 scope.go:117] "RemoveContainer" containerID="bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.531297 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7"} err="failed to get container status \"bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7\": rpc error: code = NotFound desc = could not find container \"bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7\": container with ID starting with bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7 not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.531316 4816 scope.go:117] "RemoveContainer" containerID="9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.531530 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2"} err="failed to get container status \"9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2\": rpc error: code = NotFound desc = could not find container \"9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2\": container with ID starting with 9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2 not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.531548 4816 scope.go:117] "RemoveContainer" containerID="bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.531824 4816 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47"} err="failed to get container status \"bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47\": rpc error: code = NotFound desc = could not find container \"bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47\": container with ID starting with bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47 not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.531846 4816 scope.go:117] "RemoveContainer" containerID="ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.532079 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b"} err="failed to get container status \"ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b\": rpc error: code = NotFound desc = could not find container \"ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b\": container with ID starting with ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.532095 4816 scope.go:117] "RemoveContainer" containerID="8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.532316 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5"} err="failed to get container status \"8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5\": rpc error: code = NotFound desc = could not find container \"8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5\": container with ID starting with 8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5 not found: ID does not 
exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.532332 4816 scope.go:117] "RemoveContainer" containerID="6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.532540 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e"} err="failed to get container status \"6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e\": rpc error: code = NotFound desc = could not find container \"6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e\": container with ID starting with 6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.532558 4816 scope.go:117] "RemoveContainer" containerID="a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.532756 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586"} err="failed to get container status \"a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586\": rpc error: code = NotFound desc = could not find container \"a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586\": container with ID starting with a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586 not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.532768 4816 scope.go:117] "RemoveContainer" containerID="45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.533455 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63"} err="failed to get container status 
\"45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63\": rpc error: code = NotFound desc = could not find container \"45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63\": container with ID starting with 45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63 not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.533473 4816 scope.go:117] "RemoveContainer" containerID="62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.533774 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46"} err="failed to get container status \"62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46\": rpc error: code = NotFound desc = could not find container \"62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46\": container with ID starting with 62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46 not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.533821 4816 scope.go:117] "RemoveContainer" containerID="bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.534107 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7"} err="failed to get container status \"bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7\": rpc error: code = NotFound desc = could not find container \"bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7\": container with ID starting with bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7 not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.534132 4816 scope.go:117] "RemoveContainer" 
containerID="9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.534390 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2"} err="failed to get container status \"9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2\": rpc error: code = NotFound desc = could not find container \"9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2\": container with ID starting with 9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2 not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.534409 4816 scope.go:117] "RemoveContainer" containerID="bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.534633 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47"} err="failed to get container status \"bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47\": rpc error: code = NotFound desc = could not find container \"bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47\": container with ID starting with bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47 not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.534647 4816 scope.go:117] "RemoveContainer" containerID="ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.534968 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b"} err="failed to get container status \"ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b\": rpc error: code = NotFound desc = could 
not find container \"ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b\": container with ID starting with ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.534988 4816 scope.go:117] "RemoveContainer" containerID="8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.535293 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5"} err="failed to get container status \"8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5\": rpc error: code = NotFound desc = could not find container \"8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5\": container with ID starting with 8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5 not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.535317 4816 scope.go:117] "RemoveContainer" containerID="6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.535687 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e"} err="failed to get container status \"6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e\": rpc error: code = NotFound desc = could not find container \"6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e\": container with ID starting with 6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.535705 4816 scope.go:117] "RemoveContainer" containerID="a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 
12:10:49.535981 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586"} err="failed to get container status \"a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586\": rpc error: code = NotFound desc = could not find container \"a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586\": container with ID starting with a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586 not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.535999 4816 scope.go:117] "RemoveContainer" containerID="45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.536262 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63"} err="failed to get container status \"45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63\": rpc error: code = NotFound desc = could not find container \"45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63\": container with ID starting with 45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63 not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.536283 4816 scope.go:117] "RemoveContainer" containerID="62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.538555 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46"} err="failed to get container status \"62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46\": rpc error: code = NotFound desc = could not find container \"62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46\": container with ID starting with 
62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46 not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.538585 4816 scope.go:117] "RemoveContainer" containerID="bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.539002 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7"} err="failed to get container status \"bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7\": rpc error: code = NotFound desc = could not find container \"bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7\": container with ID starting with bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7 not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.539063 4816 scope.go:117] "RemoveContainer" containerID="9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.539491 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2"} err="failed to get container status \"9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2\": rpc error: code = NotFound desc = could not find container \"9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2\": container with ID starting with 9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2 not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.539535 4816 scope.go:117] "RemoveContainer" containerID="bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.539923 4816 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47"} err="failed to get container status \"bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47\": rpc error: code = NotFound desc = could not find container \"bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47\": container with ID starting with bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47 not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.539944 4816 scope.go:117] "RemoveContainer" containerID="ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.540237 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b"} err="failed to get container status \"ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b\": rpc error: code = NotFound desc = could not find container \"ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b\": container with ID starting with ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.540284 4816 scope.go:117] "RemoveContainer" containerID="8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.540548 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5"} err="failed to get container status \"8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5\": rpc error: code = NotFound desc = could not find container \"8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5\": container with ID starting with 8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5 not found: ID does not 
exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.540569 4816 scope.go:117] "RemoveContainer" containerID="6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.540954 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e"} err="failed to get container status \"6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e\": rpc error: code = NotFound desc = could not find container \"6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e\": container with ID starting with 6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.541053 4816 scope.go:117] "RemoveContainer" containerID="a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.541395 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586"} err="failed to get container status \"a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586\": rpc error: code = NotFound desc = could not find container \"a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586\": container with ID starting with a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586 not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.541418 4816 scope.go:117] "RemoveContainer" containerID="45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.541698 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63"} err="failed to get container status 
\"45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63\": rpc error: code = NotFound desc = could not find container \"45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63\": container with ID starting with 45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63 not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.541772 4816 scope.go:117] "RemoveContainer" containerID="62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.542079 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46"} err="failed to get container status \"62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46\": rpc error: code = NotFound desc = could not find container \"62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46\": container with ID starting with 62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46 not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.542101 4816 scope.go:117] "RemoveContainer" containerID="bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.542389 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7"} err="failed to get container status \"bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7\": rpc error: code = NotFound desc = could not find container \"bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7\": container with ID starting with bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7 not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.542481 4816 scope.go:117] "RemoveContainer" 
containerID="9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.542793 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2"} err="failed to get container status \"9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2\": rpc error: code = NotFound desc = could not find container \"9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2\": container with ID starting with 9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2 not found: ID does not exist" Mar 11 12:10:50 crc kubenswrapper[4816]: I0311 12:10:50.140512 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" path="/var/lib/kubelet/pods/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528/volumes" Mar 11 12:10:50 crc kubenswrapper[4816]: I0311 12:10:50.300050 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mdbt5_a30d3e88-e081-4303-a202-1b7505629539/kube-multus/0.log" Mar 11 12:10:50 crc kubenswrapper[4816]: I0311 12:10:50.300210 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mdbt5" event={"ID":"a30d3e88-e081-4303-a202-1b7505629539","Type":"ContainerStarted","Data":"4d4d255d20dc4eee3b47010d5f77933f5ae0bf035b74f040a7ea1d371bea82d5"} Mar 11 12:10:50 crc kubenswrapper[4816]: I0311 12:10:50.305451 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" event={"ID":"37b06e28-edcf-42e0-b392-7a1bc070f06d","Type":"ContainerStarted","Data":"9737746047f66b88abbbbdc3e85f0bb5e80305c21ae69d4acf8709eaed03e483"} Mar 11 12:10:50 crc kubenswrapper[4816]: I0311 12:10:50.305481 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" 
event={"ID":"37b06e28-edcf-42e0-b392-7a1bc070f06d","Type":"ContainerStarted","Data":"afcba8530bd65f111d0abc28f8fe448dad135747684fb008363c43368a57a5a3"} Mar 11 12:10:50 crc kubenswrapper[4816]: I0311 12:10:50.305513 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" event={"ID":"37b06e28-edcf-42e0-b392-7a1bc070f06d","Type":"ContainerStarted","Data":"66cba12be44bdddf5b48d542c633822cfa39f6f6d09a7e4ee54d4fcf181fa63d"} Mar 11 12:10:50 crc kubenswrapper[4816]: I0311 12:10:50.305526 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" event={"ID":"37b06e28-edcf-42e0-b392-7a1bc070f06d","Type":"ContainerStarted","Data":"54b18462010b3fbe4d11a8e256bad80b1c6d1fdc265b08f1e48f0874413543aa"} Mar 11 12:10:50 crc kubenswrapper[4816]: I0311 12:10:50.305535 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" event={"ID":"37b06e28-edcf-42e0-b392-7a1bc070f06d","Type":"ContainerStarted","Data":"d27f9f31ccafad332da05428465c1af6015204ffb5d97aa20dbfdfe1c590d017"} Mar 11 12:10:50 crc kubenswrapper[4816]: I0311 12:10:50.305544 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" event={"ID":"37b06e28-edcf-42e0-b392-7a1bc070f06d","Type":"ContainerStarted","Data":"18c3ed790a7d28d7dc1850b94b9c3bbe9e1f81d84eab099b53d6bf1aad414c53"} Mar 11 12:10:52 crc kubenswrapper[4816]: I0311 12:10:52.329863 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" event={"ID":"37b06e28-edcf-42e0-b392-7a1bc070f06d","Type":"ContainerStarted","Data":"8b1d050c2e528cf69a04cd8dac85eefe5e81a78c693234353db0f29272f52c47"} Mar 11 12:10:53 crc kubenswrapper[4816]: I0311 12:10:53.937604 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-69hv5"] Mar 11 12:10:53 crc kubenswrapper[4816]: I0311 12:10:53.939203 4816 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-69hv5" Mar 11 12:10:53 crc kubenswrapper[4816]: I0311 12:10:53.941744 4816 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-zmgc9" Mar 11 12:10:53 crc kubenswrapper[4816]: I0311 12:10:53.941783 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Mar 11 12:10:53 crc kubenswrapper[4816]: I0311 12:10:53.941926 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Mar 11 12:10:53 crc kubenswrapper[4816]: I0311 12:10:53.942290 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Mar 11 12:10:54 crc kubenswrapper[4816]: I0311 12:10:54.048932 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7462073d-1852-4032-87bc-e0a4b973f92f-crc-storage\") pod \"crc-storage-crc-69hv5\" (UID: \"7462073d-1852-4032-87bc-e0a4b973f92f\") " pod="crc-storage/crc-storage-crc-69hv5" Mar 11 12:10:54 crc kubenswrapper[4816]: I0311 12:10:54.048981 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp8n8\" (UniqueName: \"kubernetes.io/projected/7462073d-1852-4032-87bc-e0a4b973f92f-kube-api-access-zp8n8\") pod \"crc-storage-crc-69hv5\" (UID: \"7462073d-1852-4032-87bc-e0a4b973f92f\") " pod="crc-storage/crc-storage-crc-69hv5" Mar 11 12:10:54 crc kubenswrapper[4816]: I0311 12:10:54.049027 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7462073d-1852-4032-87bc-e0a4b973f92f-node-mnt\") pod \"crc-storage-crc-69hv5\" (UID: \"7462073d-1852-4032-87bc-e0a4b973f92f\") " pod="crc-storage/crc-storage-crc-69hv5" Mar 11 12:10:54 crc kubenswrapper[4816]: I0311 
12:10:54.149801 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7462073d-1852-4032-87bc-e0a4b973f92f-node-mnt\") pod \"crc-storage-crc-69hv5\" (UID: \"7462073d-1852-4032-87bc-e0a4b973f92f\") " pod="crc-storage/crc-storage-crc-69hv5" Mar 11 12:10:54 crc kubenswrapper[4816]: I0311 12:10:54.149935 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7462073d-1852-4032-87bc-e0a4b973f92f-crc-storage\") pod \"crc-storage-crc-69hv5\" (UID: \"7462073d-1852-4032-87bc-e0a4b973f92f\") " pod="crc-storage/crc-storage-crc-69hv5" Mar 11 12:10:54 crc kubenswrapper[4816]: I0311 12:10:54.149984 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zp8n8\" (UniqueName: \"kubernetes.io/projected/7462073d-1852-4032-87bc-e0a4b973f92f-kube-api-access-zp8n8\") pod \"crc-storage-crc-69hv5\" (UID: \"7462073d-1852-4032-87bc-e0a4b973f92f\") " pod="crc-storage/crc-storage-crc-69hv5" Mar 11 12:10:54 crc kubenswrapper[4816]: I0311 12:10:54.150227 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7462073d-1852-4032-87bc-e0a4b973f92f-node-mnt\") pod \"crc-storage-crc-69hv5\" (UID: \"7462073d-1852-4032-87bc-e0a4b973f92f\") " pod="crc-storage/crc-storage-crc-69hv5" Mar 11 12:10:54 crc kubenswrapper[4816]: I0311 12:10:54.150995 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7462073d-1852-4032-87bc-e0a4b973f92f-crc-storage\") pod \"crc-storage-crc-69hv5\" (UID: \"7462073d-1852-4032-87bc-e0a4b973f92f\") " pod="crc-storage/crc-storage-crc-69hv5" Mar 11 12:10:54 crc kubenswrapper[4816]: I0311 12:10:54.173928 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp8n8\" (UniqueName: 
\"kubernetes.io/projected/7462073d-1852-4032-87bc-e0a4b973f92f-kube-api-access-zp8n8\") pod \"crc-storage-crc-69hv5\" (UID: \"7462073d-1852-4032-87bc-e0a4b973f92f\") " pod="crc-storage/crc-storage-crc-69hv5" Mar 11 12:10:54 crc kubenswrapper[4816]: I0311 12:10:54.262750 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-69hv5" Mar 11 12:10:54 crc kubenswrapper[4816]: E0311 12:10:54.303621 4816 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-69hv5_crc-storage_7462073d-1852-4032-87bc-e0a4b973f92f_0(82b8e2422033b55e0e30d21316f885f5b691a6fcebc8a4745a9f45991a879231): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 11 12:10:54 crc kubenswrapper[4816]: E0311 12:10:54.303793 4816 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-69hv5_crc-storage_7462073d-1852-4032-87bc-e0a4b973f92f_0(82b8e2422033b55e0e30d21316f885f5b691a6fcebc8a4745a9f45991a879231): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-69hv5" Mar 11 12:10:54 crc kubenswrapper[4816]: E0311 12:10:54.303845 4816 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-69hv5_crc-storage_7462073d-1852-4032-87bc-e0a4b973f92f_0(82b8e2422033b55e0e30d21316f885f5b691a6fcebc8a4745a9f45991a879231): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-69hv5" Mar 11 12:10:54 crc kubenswrapper[4816]: E0311 12:10:54.304039 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-69hv5_crc-storage(7462073d-1852-4032-87bc-e0a4b973f92f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-69hv5_crc-storage(7462073d-1852-4032-87bc-e0a4b973f92f)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-69hv5_crc-storage_7462073d-1852-4032-87bc-e0a4b973f92f_0(82b8e2422033b55e0e30d21316f885f5b691a6fcebc8a4745a9f45991a879231): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-69hv5" podUID="7462073d-1852-4032-87bc-e0a4b973f92f" Mar 11 12:10:55 crc kubenswrapper[4816]: I0311 12:10:55.366279 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" event={"ID":"37b06e28-edcf-42e0-b392-7a1bc070f06d","Type":"ContainerStarted","Data":"9f1b1c7e6daeeba219bc9ee757df3614a1d23c2e5b414924e156964abdb003ac"} Mar 11 12:10:55 crc kubenswrapper[4816]: I0311 12:10:55.367002 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:55 crc kubenswrapper[4816]: I0311 12:10:55.367043 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:55 crc kubenswrapper[4816]: I0311 12:10:55.367061 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:55 crc kubenswrapper[4816]: I0311 12:10:55.404534 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:55 crc kubenswrapper[4816]: I0311 12:10:55.408157 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:55 crc kubenswrapper[4816]: I0311 12:10:55.417092 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" podStartSLOduration=7.417065085 podStartE2EDuration="7.417065085s" podCreationTimestamp="2026-03-11 12:10:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:10:55.409013043 +0000 UTC m=+742.000277050" watchObservedRunningTime="2026-03-11 12:10:55.417065085 +0000 UTC m=+742.008329062" Mar 11 12:10:55 crc kubenswrapper[4816]: I0311 12:10:55.474225 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-69hv5"] Mar 11 12:10:55 crc kubenswrapper[4816]: I0311 12:10:55.474385 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-69hv5" Mar 11 12:10:55 crc kubenswrapper[4816]: I0311 12:10:55.474824 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-69hv5" Mar 11 12:10:55 crc kubenswrapper[4816]: E0311 12:10:55.502061 4816 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-69hv5_crc-storage_7462073d-1852-4032-87bc-e0a4b973f92f_0(d3e8a72107756c6a010a0556a446d446f255b7b50caf8d168e9fd1eb54845a1a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 11 12:10:55 crc kubenswrapper[4816]: E0311 12:10:55.502623 4816 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-69hv5_crc-storage_7462073d-1852-4032-87bc-e0a4b973f92f_0(d3e8a72107756c6a010a0556a446d446f255b7b50caf8d168e9fd1eb54845a1a): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="crc-storage/crc-storage-crc-69hv5" Mar 11 12:10:55 crc kubenswrapper[4816]: E0311 12:10:55.502655 4816 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-69hv5_crc-storage_7462073d-1852-4032-87bc-e0a4b973f92f_0(d3e8a72107756c6a010a0556a446d446f255b7b50caf8d168e9fd1eb54845a1a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-69hv5" Mar 11 12:10:55 crc kubenswrapper[4816]: E0311 12:10:55.502737 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-69hv5_crc-storage(7462073d-1852-4032-87bc-e0a4b973f92f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-69hv5_crc-storage(7462073d-1852-4032-87bc-e0a4b973f92f)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-69hv5_crc-storage_7462073d-1852-4032-87bc-e0a4b973f92f_0(d3e8a72107756c6a010a0556a446d446f255b7b50caf8d168e9fd1eb54845a1a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-69hv5" podUID="7462073d-1852-4032-87bc-e0a4b973f92f" Mar 11 12:11:07 crc kubenswrapper[4816]: I0311 12:11:07.130048 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-69hv5" Mar 11 12:11:07 crc kubenswrapper[4816]: I0311 12:11:07.131446 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-69hv5" Mar 11 12:11:07 crc kubenswrapper[4816]: I0311 12:11:07.350889 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-69hv5"] Mar 11 12:11:07 crc kubenswrapper[4816]: I0311 12:11:07.357099 4816 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 12:11:07 crc kubenswrapper[4816]: I0311 12:11:07.472693 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-69hv5" event={"ID":"7462073d-1852-4032-87bc-e0a4b973f92f","Type":"ContainerStarted","Data":"521bfddd61bb4df175bc1fb7dfc68b6912550738c88aba8b5a47a1a00c59a39f"} Mar 11 12:11:09 crc kubenswrapper[4816]: I0311 12:11:09.486718 4816 generic.go:334] "Generic (PLEG): container finished" podID="7462073d-1852-4032-87bc-e0a4b973f92f" containerID="0ee4f053b0c8963adb31e4e6ffaf9c7c100dafccbfa493c26f5254141c13917c" exitCode=0 Mar 11 12:11:09 crc kubenswrapper[4816]: I0311 12:11:09.486773 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-69hv5" event={"ID":"7462073d-1852-4032-87bc-e0a4b973f92f","Type":"ContainerDied","Data":"0ee4f053b0c8963adb31e4e6ffaf9c7c100dafccbfa493c26f5254141c13917c"} Mar 11 12:11:10 crc kubenswrapper[4816]: I0311 12:11:10.753565 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-69hv5" Mar 11 12:11:10 crc kubenswrapper[4816]: I0311 12:11:10.798039 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7462073d-1852-4032-87bc-e0a4b973f92f-crc-storage\") pod \"7462073d-1852-4032-87bc-e0a4b973f92f\" (UID: \"7462073d-1852-4032-87bc-e0a4b973f92f\") " Mar 11 12:11:10 crc kubenswrapper[4816]: I0311 12:11:10.798499 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zp8n8\" (UniqueName: \"kubernetes.io/projected/7462073d-1852-4032-87bc-e0a4b973f92f-kube-api-access-zp8n8\") pod \"7462073d-1852-4032-87bc-e0a4b973f92f\" (UID: \"7462073d-1852-4032-87bc-e0a4b973f92f\") " Mar 11 12:11:10 crc kubenswrapper[4816]: I0311 12:11:10.798581 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7462073d-1852-4032-87bc-e0a4b973f92f-node-mnt\") pod \"7462073d-1852-4032-87bc-e0a4b973f92f\" (UID: \"7462073d-1852-4032-87bc-e0a4b973f92f\") " Mar 11 12:11:10 crc kubenswrapper[4816]: I0311 12:11:10.798880 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7462073d-1852-4032-87bc-e0a4b973f92f-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "7462073d-1852-4032-87bc-e0a4b973f92f" (UID: "7462073d-1852-4032-87bc-e0a4b973f92f"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:11:10 crc kubenswrapper[4816]: I0311 12:11:10.803384 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7462073d-1852-4032-87bc-e0a4b973f92f-kube-api-access-zp8n8" (OuterVolumeSpecName: "kube-api-access-zp8n8") pod "7462073d-1852-4032-87bc-e0a4b973f92f" (UID: "7462073d-1852-4032-87bc-e0a4b973f92f"). InnerVolumeSpecName "kube-api-access-zp8n8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:11:10 crc kubenswrapper[4816]: I0311 12:11:10.810163 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7462073d-1852-4032-87bc-e0a4b973f92f-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "7462073d-1852-4032-87bc-e0a4b973f92f" (UID: "7462073d-1852-4032-87bc-e0a4b973f92f"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:11:10 crc kubenswrapper[4816]: I0311 12:11:10.900180 4816 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7462073d-1852-4032-87bc-e0a4b973f92f-node-mnt\") on node \"crc\" DevicePath \"\"" Mar 11 12:11:10 crc kubenswrapper[4816]: I0311 12:11:10.900227 4816 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7462073d-1852-4032-87bc-e0a4b973f92f-crc-storage\") on node \"crc\" DevicePath \"\"" Mar 11 12:11:10 crc kubenswrapper[4816]: I0311 12:11:10.900240 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zp8n8\" (UniqueName: \"kubernetes.io/projected/7462073d-1852-4032-87bc-e0a4b973f92f-kube-api-access-zp8n8\") on node \"crc\" DevicePath \"\"" Mar 11 12:11:11 crc kubenswrapper[4816]: I0311 12:11:11.503721 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-69hv5" event={"ID":"7462073d-1852-4032-87bc-e0a4b973f92f","Type":"ContainerDied","Data":"521bfddd61bb4df175bc1fb7dfc68b6912550738c88aba8b5a47a1a00c59a39f"} Mar 11 12:11:11 crc kubenswrapper[4816]: I0311 12:11:11.503781 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="521bfddd61bb4df175bc1fb7dfc68b6912550738c88aba8b5a47a1a00c59a39f" Mar 11 12:11:11 crc kubenswrapper[4816]: I0311 12:11:11.503849 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-69hv5" Mar 11 12:11:18 crc kubenswrapper[4816]: I0311 12:11:18.877319 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp"] Mar 11 12:11:18 crc kubenswrapper[4816]: E0311 12:11:18.878061 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7462073d-1852-4032-87bc-e0a4b973f92f" containerName="storage" Mar 11 12:11:18 crc kubenswrapper[4816]: I0311 12:11:18.878078 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="7462073d-1852-4032-87bc-e0a4b973f92f" containerName="storage" Mar 11 12:11:18 crc kubenswrapper[4816]: I0311 12:11:18.878212 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="7462073d-1852-4032-87bc-e0a4b973f92f" containerName="storage" Mar 11 12:11:18 crc kubenswrapper[4816]: I0311 12:11:18.879241 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp" Mar 11 12:11:18 crc kubenswrapper[4816]: I0311 12:11:18.882172 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 11 12:11:18 crc kubenswrapper[4816]: I0311 12:11:18.883773 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp"] Mar 11 12:11:18 crc kubenswrapper[4816]: I0311 12:11:18.903412 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2a03942f-8b0e-4041-8843-ad5e6cedc6b0-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp\" (UID: \"2a03942f-8b0e-4041-8843-ad5e6cedc6b0\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp" Mar 11 12:11:18 crc kubenswrapper[4816]: I0311 
12:11:18.903553 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jjk7\" (UniqueName: \"kubernetes.io/projected/2a03942f-8b0e-4041-8843-ad5e6cedc6b0-kube-api-access-4jjk7\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp\" (UID: \"2a03942f-8b0e-4041-8843-ad5e6cedc6b0\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp" Mar 11 12:11:18 crc kubenswrapper[4816]: I0311 12:11:18.903962 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2a03942f-8b0e-4041-8843-ad5e6cedc6b0-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp\" (UID: \"2a03942f-8b0e-4041-8843-ad5e6cedc6b0\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp" Mar 11 12:11:19 crc kubenswrapper[4816]: I0311 12:11:19.004484 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2a03942f-8b0e-4041-8843-ad5e6cedc6b0-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp\" (UID: \"2a03942f-8b0e-4041-8843-ad5e6cedc6b0\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp" Mar 11 12:11:19 crc kubenswrapper[4816]: I0311 12:11:19.004542 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jjk7\" (UniqueName: \"kubernetes.io/projected/2a03942f-8b0e-4041-8843-ad5e6cedc6b0-kube-api-access-4jjk7\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp\" (UID: \"2a03942f-8b0e-4041-8843-ad5e6cedc6b0\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp" Mar 11 12:11:19 crc kubenswrapper[4816]: I0311 12:11:19.004586 4816 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2a03942f-8b0e-4041-8843-ad5e6cedc6b0-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp\" (UID: \"2a03942f-8b0e-4041-8843-ad5e6cedc6b0\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp" Mar 11 12:11:19 crc kubenswrapper[4816]: I0311 12:11:19.005063 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2a03942f-8b0e-4041-8843-ad5e6cedc6b0-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp\" (UID: \"2a03942f-8b0e-4041-8843-ad5e6cedc6b0\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp" Mar 11 12:11:19 crc kubenswrapper[4816]: I0311 12:11:19.005132 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2a03942f-8b0e-4041-8843-ad5e6cedc6b0-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp\" (UID: \"2a03942f-8b0e-4041-8843-ad5e6cedc6b0\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp" Mar 11 12:11:19 crc kubenswrapper[4816]: I0311 12:11:19.032449 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:11:19 crc kubenswrapper[4816]: I0311 12:11:19.032716 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jjk7\" (UniqueName: \"kubernetes.io/projected/2a03942f-8b0e-4041-8843-ad5e6cedc6b0-kube-api-access-4jjk7\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp\" (UID: \"2a03942f-8b0e-4041-8843-ad5e6cedc6b0\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp" Mar 11 12:11:19 crc kubenswrapper[4816]: I0311 12:11:19.198703 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp" Mar 11 12:11:19 crc kubenswrapper[4816]: I0311 12:11:19.620030 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp"] Mar 11 12:11:20 crc kubenswrapper[4816]: I0311 12:11:20.567305 4816 generic.go:334] "Generic (PLEG): container finished" podID="2a03942f-8b0e-4041-8843-ad5e6cedc6b0" containerID="01ba9569641c593f0a0de55cac3b2f8df054eba1c3c74eeda2593404e7f454af" exitCode=0 Mar 11 12:11:20 crc kubenswrapper[4816]: I0311 12:11:20.567398 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp" event={"ID":"2a03942f-8b0e-4041-8843-ad5e6cedc6b0","Type":"ContainerDied","Data":"01ba9569641c593f0a0de55cac3b2f8df054eba1c3c74eeda2593404e7f454af"} Mar 11 12:11:20 crc kubenswrapper[4816]: I0311 12:11:20.567774 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp" event={"ID":"2a03942f-8b0e-4041-8843-ad5e6cedc6b0","Type":"ContainerStarted","Data":"fe0c8bc583ce4621ae0ee36bf084ea55f1012d84cd0125dadd104db67728e406"} Mar 11 12:11:20 crc kubenswrapper[4816]: I0311 12:11:20.899117 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-c29dr"] Mar 11 12:11:20 crc kubenswrapper[4816]: I0311 12:11:20.904300 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c29dr" Mar 11 12:11:20 crc kubenswrapper[4816]: I0311 12:11:20.915769 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c29dr"] Mar 11 12:11:20 crc kubenswrapper[4816]: I0311 12:11:20.930737 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/461ff7ea-1256-4928-9425-ed9840dc4eda-catalog-content\") pod \"redhat-operators-c29dr\" (UID: \"461ff7ea-1256-4928-9425-ed9840dc4eda\") " pod="openshift-marketplace/redhat-operators-c29dr" Mar 11 12:11:20 crc kubenswrapper[4816]: I0311 12:11:20.930810 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqbqq\" (UniqueName: \"kubernetes.io/projected/461ff7ea-1256-4928-9425-ed9840dc4eda-kube-api-access-kqbqq\") pod \"redhat-operators-c29dr\" (UID: \"461ff7ea-1256-4928-9425-ed9840dc4eda\") " pod="openshift-marketplace/redhat-operators-c29dr" Mar 11 12:11:20 crc kubenswrapper[4816]: I0311 12:11:20.930861 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/461ff7ea-1256-4928-9425-ed9840dc4eda-utilities\") pod \"redhat-operators-c29dr\" (UID: \"461ff7ea-1256-4928-9425-ed9840dc4eda\") " pod="openshift-marketplace/redhat-operators-c29dr" Mar 11 12:11:21 crc kubenswrapper[4816]: I0311 12:11:21.031691 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqbqq\" (UniqueName: \"kubernetes.io/projected/461ff7ea-1256-4928-9425-ed9840dc4eda-kube-api-access-kqbqq\") pod \"redhat-operators-c29dr\" (UID: \"461ff7ea-1256-4928-9425-ed9840dc4eda\") " pod="openshift-marketplace/redhat-operators-c29dr" Mar 11 12:11:21 crc kubenswrapper[4816]: I0311 12:11:21.031764 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/461ff7ea-1256-4928-9425-ed9840dc4eda-utilities\") pod \"redhat-operators-c29dr\" (UID: \"461ff7ea-1256-4928-9425-ed9840dc4eda\") " pod="openshift-marketplace/redhat-operators-c29dr" Mar 11 12:11:21 crc kubenswrapper[4816]: I0311 12:11:21.031842 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/461ff7ea-1256-4928-9425-ed9840dc4eda-catalog-content\") pod \"redhat-operators-c29dr\" (UID: \"461ff7ea-1256-4928-9425-ed9840dc4eda\") " pod="openshift-marketplace/redhat-operators-c29dr" Mar 11 12:11:21 crc kubenswrapper[4816]: I0311 12:11:21.032384 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/461ff7ea-1256-4928-9425-ed9840dc4eda-utilities\") pod \"redhat-operators-c29dr\" (UID: \"461ff7ea-1256-4928-9425-ed9840dc4eda\") " pod="openshift-marketplace/redhat-operators-c29dr" Mar 11 12:11:21 crc kubenswrapper[4816]: I0311 12:11:21.032416 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/461ff7ea-1256-4928-9425-ed9840dc4eda-catalog-content\") pod \"redhat-operators-c29dr\" (UID: \"461ff7ea-1256-4928-9425-ed9840dc4eda\") " pod="openshift-marketplace/redhat-operators-c29dr" Mar 11 12:11:21 crc kubenswrapper[4816]: I0311 12:11:21.068581 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqbqq\" (UniqueName: \"kubernetes.io/projected/461ff7ea-1256-4928-9425-ed9840dc4eda-kube-api-access-kqbqq\") pod \"redhat-operators-c29dr\" (UID: \"461ff7ea-1256-4928-9425-ed9840dc4eda\") " pod="openshift-marketplace/redhat-operators-c29dr" Mar 11 12:11:21 crc kubenswrapper[4816]: I0311 12:11:21.231764 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c29dr" Mar 11 12:11:22 crc kubenswrapper[4816]: I0311 12:11:22.326397 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c29dr"] Mar 11 12:11:22 crc kubenswrapper[4816]: I0311 12:11:22.580090 4816 generic.go:334] "Generic (PLEG): container finished" podID="461ff7ea-1256-4928-9425-ed9840dc4eda" containerID="003ac48122b295d5bfa357b7ac23c84d4d289972e7886ac49dd30b9a43ab8fd3" exitCode=0 Mar 11 12:11:22 crc kubenswrapper[4816]: I0311 12:11:22.580161 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c29dr" event={"ID":"461ff7ea-1256-4928-9425-ed9840dc4eda","Type":"ContainerDied","Data":"003ac48122b295d5bfa357b7ac23c84d4d289972e7886ac49dd30b9a43ab8fd3"} Mar 11 12:11:22 crc kubenswrapper[4816]: I0311 12:11:22.580195 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c29dr" event={"ID":"461ff7ea-1256-4928-9425-ed9840dc4eda","Type":"ContainerStarted","Data":"f55816d844fceb14f1b2e972a2c499707de36af2a78a8f7ba4306a161a99924b"} Mar 11 12:11:22 crc kubenswrapper[4816]: I0311 12:11:22.582631 4816 generic.go:334] "Generic (PLEG): container finished" podID="2a03942f-8b0e-4041-8843-ad5e6cedc6b0" containerID="b1b201d6c2d8d849456a7f9a96ac6a34ca3bbe4fc702ab0e66e8390510c6f970" exitCode=0 Mar 11 12:11:22 crc kubenswrapper[4816]: I0311 12:11:22.582661 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp" event={"ID":"2a03942f-8b0e-4041-8843-ad5e6cedc6b0","Type":"ContainerDied","Data":"b1b201d6c2d8d849456a7f9a96ac6a34ca3bbe4fc702ab0e66e8390510c6f970"} Mar 11 12:11:23 crc kubenswrapper[4816]: I0311 12:11:23.589660 4816 generic.go:334] "Generic (PLEG): container finished" podID="2a03942f-8b0e-4041-8843-ad5e6cedc6b0" 
containerID="a03404bb8b9128b37e1393dcbcd23ba56c893a0ec7a237e8c2ff7880bcb37b34" exitCode=0 Mar 11 12:11:23 crc kubenswrapper[4816]: I0311 12:11:23.589717 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp" event={"ID":"2a03942f-8b0e-4041-8843-ad5e6cedc6b0","Type":"ContainerDied","Data":"a03404bb8b9128b37e1393dcbcd23ba56c893a0ec7a237e8c2ff7880bcb37b34"} Mar 11 12:11:23 crc kubenswrapper[4816]: I0311 12:11:23.592208 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c29dr" event={"ID":"461ff7ea-1256-4928-9425-ed9840dc4eda","Type":"ContainerStarted","Data":"4f13015aaf50ff3e2d23b9c5a185d266297979cbb0afdf2142cff44ba6662d5a"} Mar 11 12:11:24 crc kubenswrapper[4816]: I0311 12:11:24.603113 4816 generic.go:334] "Generic (PLEG): container finished" podID="461ff7ea-1256-4928-9425-ed9840dc4eda" containerID="4f13015aaf50ff3e2d23b9c5a185d266297979cbb0afdf2142cff44ba6662d5a" exitCode=0 Mar 11 12:11:24 crc kubenswrapper[4816]: I0311 12:11:24.603344 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c29dr" event={"ID":"461ff7ea-1256-4928-9425-ed9840dc4eda","Type":"ContainerDied","Data":"4f13015aaf50ff3e2d23b9c5a185d266297979cbb0afdf2142cff44ba6662d5a"} Mar 11 12:11:24 crc kubenswrapper[4816]: I0311 12:11:24.948289 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp" Mar 11 12:11:24 crc kubenswrapper[4816]: I0311 12:11:24.994590 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jjk7\" (UniqueName: \"kubernetes.io/projected/2a03942f-8b0e-4041-8843-ad5e6cedc6b0-kube-api-access-4jjk7\") pod \"2a03942f-8b0e-4041-8843-ad5e6cedc6b0\" (UID: \"2a03942f-8b0e-4041-8843-ad5e6cedc6b0\") " Mar 11 12:11:24 crc kubenswrapper[4816]: I0311 12:11:24.994657 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2a03942f-8b0e-4041-8843-ad5e6cedc6b0-bundle\") pod \"2a03942f-8b0e-4041-8843-ad5e6cedc6b0\" (UID: \"2a03942f-8b0e-4041-8843-ad5e6cedc6b0\") " Mar 11 12:11:24 crc kubenswrapper[4816]: I0311 12:11:24.994707 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2a03942f-8b0e-4041-8843-ad5e6cedc6b0-util\") pod \"2a03942f-8b0e-4041-8843-ad5e6cedc6b0\" (UID: \"2a03942f-8b0e-4041-8843-ad5e6cedc6b0\") " Mar 11 12:11:24 crc kubenswrapper[4816]: I0311 12:11:24.995740 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a03942f-8b0e-4041-8843-ad5e6cedc6b0-bundle" (OuterVolumeSpecName: "bundle") pod "2a03942f-8b0e-4041-8843-ad5e6cedc6b0" (UID: "2a03942f-8b0e-4041-8843-ad5e6cedc6b0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:11:25 crc kubenswrapper[4816]: I0311 12:11:25.003648 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a03942f-8b0e-4041-8843-ad5e6cedc6b0-kube-api-access-4jjk7" (OuterVolumeSpecName: "kube-api-access-4jjk7") pod "2a03942f-8b0e-4041-8843-ad5e6cedc6b0" (UID: "2a03942f-8b0e-4041-8843-ad5e6cedc6b0"). InnerVolumeSpecName "kube-api-access-4jjk7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:11:25 crc kubenswrapper[4816]: I0311 12:11:25.097139 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jjk7\" (UniqueName: \"kubernetes.io/projected/2a03942f-8b0e-4041-8843-ad5e6cedc6b0-kube-api-access-4jjk7\") on node \"crc\" DevicePath \"\"" Mar 11 12:11:25 crc kubenswrapper[4816]: I0311 12:11:25.097168 4816 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2a03942f-8b0e-4041-8843-ad5e6cedc6b0-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:11:25 crc kubenswrapper[4816]: I0311 12:11:25.256010 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a03942f-8b0e-4041-8843-ad5e6cedc6b0-util" (OuterVolumeSpecName: "util") pod "2a03942f-8b0e-4041-8843-ad5e6cedc6b0" (UID: "2a03942f-8b0e-4041-8843-ad5e6cedc6b0"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:11:25 crc kubenswrapper[4816]: I0311 12:11:25.300211 4816 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2a03942f-8b0e-4041-8843-ad5e6cedc6b0-util\") on node \"crc\" DevicePath \"\"" Mar 11 12:11:25 crc kubenswrapper[4816]: I0311 12:11:25.616848 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp" event={"ID":"2a03942f-8b0e-4041-8843-ad5e6cedc6b0","Type":"ContainerDied","Data":"fe0c8bc583ce4621ae0ee36bf084ea55f1012d84cd0125dadd104db67728e406"} Mar 11 12:11:25 crc kubenswrapper[4816]: I0311 12:11:25.616904 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe0c8bc583ce4621ae0ee36bf084ea55f1012d84cd0125dadd104db67728e406" Mar 11 12:11:25 crc kubenswrapper[4816]: I0311 12:11:25.616938 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp" Mar 11 12:11:26 crc kubenswrapper[4816]: I0311 12:11:26.629089 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c29dr" event={"ID":"461ff7ea-1256-4928-9425-ed9840dc4eda","Type":"ContainerStarted","Data":"95dca1c91ddd21a7baf580c3a3e2aeb2e82668e3bf4d222da6549241684ccef9"} Mar 11 12:11:26 crc kubenswrapper[4816]: I0311 12:11:26.660608 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-c29dr" podStartSLOduration=3.642467768 podStartE2EDuration="6.660576774s" podCreationTimestamp="2026-03-11 12:11:20 +0000 UTC" firstStartedPulling="2026-03-11 12:11:22.58141548 +0000 UTC m=+769.172679457" lastFinishedPulling="2026-03-11 12:11:25.599524456 +0000 UTC m=+772.190788463" observedRunningTime="2026-03-11 12:11:26.653413108 +0000 UTC m=+773.244677155" watchObservedRunningTime="2026-03-11 12:11:26.660576774 +0000 UTC m=+773.251840781" Mar 11 12:11:29 crc kubenswrapper[4816]: I0311 12:11:29.240361 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-g59xq"] Mar 11 12:11:29 crc kubenswrapper[4816]: E0311 12:11:29.240926 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a03942f-8b0e-4041-8843-ad5e6cedc6b0" containerName="extract" Mar 11 12:11:29 crc kubenswrapper[4816]: I0311 12:11:29.240940 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a03942f-8b0e-4041-8843-ad5e6cedc6b0" containerName="extract" Mar 11 12:11:29 crc kubenswrapper[4816]: E0311 12:11:29.240960 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a03942f-8b0e-4041-8843-ad5e6cedc6b0" containerName="pull" Mar 11 12:11:29 crc kubenswrapper[4816]: I0311 12:11:29.240966 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a03942f-8b0e-4041-8843-ad5e6cedc6b0" containerName="pull" Mar 
11 12:11:29 crc kubenswrapper[4816]: E0311 12:11:29.240977 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a03942f-8b0e-4041-8843-ad5e6cedc6b0" containerName="util" Mar 11 12:11:29 crc kubenswrapper[4816]: I0311 12:11:29.240983 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a03942f-8b0e-4041-8843-ad5e6cedc6b0" containerName="util" Mar 11 12:11:29 crc kubenswrapper[4816]: I0311 12:11:29.241088 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a03942f-8b0e-4041-8843-ad5e6cedc6b0" containerName="extract" Mar 11 12:11:29 crc kubenswrapper[4816]: I0311 12:11:29.241478 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-g59xq" Mar 11 12:11:29 crc kubenswrapper[4816]: I0311 12:11:29.243795 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-w7sf4" Mar 11 12:11:29 crc kubenswrapper[4816]: I0311 12:11:29.250434 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 11 12:11:29 crc kubenswrapper[4816]: I0311 12:11:29.251545 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 11 12:11:29 crc kubenswrapper[4816]: I0311 12:11:29.258701 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-g59xq"] Mar 11 12:11:29 crc kubenswrapper[4816]: I0311 12:11:29.354740 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldlnf\" (UniqueName: \"kubernetes.io/projected/c1f09ebe-c0e1-415c-9ea9-42fc42240e94-kube-api-access-ldlnf\") pod \"nmstate-operator-796d4cfff4-g59xq\" (UID: \"c1f09ebe-c0e1-415c-9ea9-42fc42240e94\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-g59xq" Mar 11 12:11:29 crc kubenswrapper[4816]: I0311 12:11:29.456107 4816 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldlnf\" (UniqueName: \"kubernetes.io/projected/c1f09ebe-c0e1-415c-9ea9-42fc42240e94-kube-api-access-ldlnf\") pod \"nmstate-operator-796d4cfff4-g59xq\" (UID: \"c1f09ebe-c0e1-415c-9ea9-42fc42240e94\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-g59xq" Mar 11 12:11:29 crc kubenswrapper[4816]: I0311 12:11:29.480359 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldlnf\" (UniqueName: \"kubernetes.io/projected/c1f09ebe-c0e1-415c-9ea9-42fc42240e94-kube-api-access-ldlnf\") pod \"nmstate-operator-796d4cfff4-g59xq\" (UID: \"c1f09ebe-c0e1-415c-9ea9-42fc42240e94\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-g59xq" Mar 11 12:11:29 crc kubenswrapper[4816]: I0311 12:11:29.559879 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-g59xq" Mar 11 12:11:29 crc kubenswrapper[4816]: I0311 12:11:29.892898 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-g59xq"] Mar 11 12:11:30 crc kubenswrapper[4816]: I0311 12:11:30.653858 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-g59xq" event={"ID":"c1f09ebe-c0e1-415c-9ea9-42fc42240e94","Type":"ContainerStarted","Data":"368c9f058bcf11df278bc0ef15c7f90449d2b78f39b6a1fe4f8d949c96b6155a"} Mar 11 12:11:31 crc kubenswrapper[4816]: I0311 12:11:31.232676 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-c29dr" Mar 11 12:11:31 crc kubenswrapper[4816]: I0311 12:11:31.233068 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-c29dr" Mar 11 12:11:32 crc kubenswrapper[4816]: I0311 12:11:32.293966 4816 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-c29dr" podUID="461ff7ea-1256-4928-9425-ed9840dc4eda" containerName="registry-server" probeResult="failure" output=< Mar 11 12:11:32 crc kubenswrapper[4816]: timeout: failed to connect service ":50051" within 1s Mar 11 12:11:32 crc kubenswrapper[4816]: > Mar 11 12:11:33 crc kubenswrapper[4816]: I0311 12:11:33.698402 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-g59xq" event={"ID":"c1f09ebe-c0e1-415c-9ea9-42fc42240e94","Type":"ContainerStarted","Data":"154082c663c85dd386ba1f00d26dbf18b99c080627086950c172f7b5f6ec450a"} Mar 11 12:11:33 crc kubenswrapper[4816]: I0311 12:11:33.728973 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-g59xq" podStartSLOduration=1.275402976 podStartE2EDuration="4.72894315s" podCreationTimestamp="2026-03-11 12:11:29 +0000 UTC" firstStartedPulling="2026-03-11 12:11:29.902572693 +0000 UTC m=+776.493836660" lastFinishedPulling="2026-03-11 12:11:33.356112867 +0000 UTC m=+779.947376834" observedRunningTime="2026-03-11 12:11:33.725011658 +0000 UTC m=+780.316275625" watchObservedRunningTime="2026-03-11 12:11:33.72894315 +0000 UTC m=+780.320207127" Mar 11 12:11:38 crc kubenswrapper[4816]: I0311 12:11:38.888422 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-2snpd"] Mar 11 12:11:38 crc kubenswrapper[4816]: I0311 12:11:38.890071 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-2snpd" Mar 11 12:11:38 crc kubenswrapper[4816]: I0311 12:11:38.895144 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-v2wv2" Mar 11 12:11:38 crc kubenswrapper[4816]: I0311 12:11:38.900128 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-xq48v"] Mar 11 12:11:38 crc kubenswrapper[4816]: I0311 12:11:38.900969 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-xq48v" Mar 11 12:11:38 crc kubenswrapper[4816]: I0311 12:11:38.903361 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 11 12:11:38 crc kubenswrapper[4816]: I0311 12:11:38.914462 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-xq48v"] Mar 11 12:11:38 crc kubenswrapper[4816]: I0311 12:11:38.931179 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-47rs2"] Mar 11 12:11:38 crc kubenswrapper[4816]: I0311 12:11:38.932121 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-47rs2" Mar 11 12:11:38 crc kubenswrapper[4816]: I0311 12:11:38.953553 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-2snpd"] Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.019645 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/1b664fad-a0fa-4442-bed2-3316eafbb78c-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-xq48v\" (UID: \"1b664fad-a0fa-4442-bed2-3316eafbb78c\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-xq48v" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.019685 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4fls\" (UniqueName: \"kubernetes.io/projected/7fb0dcd0-9411-49d6-a997-79d2099b2462-kube-api-access-m4fls\") pod \"nmstate-metrics-9b8c8685d-2snpd\" (UID: \"7fb0dcd0-9411-49d6-a997-79d2099b2462\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-2snpd" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.019717 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj6pt\" (UniqueName: \"kubernetes.io/projected/1b664fad-a0fa-4442-bed2-3316eafbb78c-kube-api-access-lj6pt\") pod \"nmstate-webhook-5f558f5558-xq48v\" (UID: \"1b664fad-a0fa-4442-bed2-3316eafbb78c\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-xq48v" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.034681 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-px2gk"] Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.035454 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-px2gk" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.037524 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.042004 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.042629 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-mpl9k" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.051564 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-px2gk"] Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.120911 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9-ovs-socket\") pod \"nmstate-handler-47rs2\" (UID: \"fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9\") " pod="openshift-nmstate/nmstate-handler-47rs2" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.121020 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b958v\" (UniqueName: \"kubernetes.io/projected/fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9-kube-api-access-b958v\") pod \"nmstate-handler-47rs2\" (UID: \"fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9\") " pod="openshift-nmstate/nmstate-handler-47rs2" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.121065 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/1b664fad-a0fa-4442-bed2-3316eafbb78c-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-xq48v\" (UID: \"1b664fad-a0fa-4442-bed2-3316eafbb78c\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-xq48v" Mar 11 
12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.121083 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4fls\" (UniqueName: \"kubernetes.io/projected/7fb0dcd0-9411-49d6-a997-79d2099b2462-kube-api-access-m4fls\") pod \"nmstate-metrics-9b8c8685d-2snpd\" (UID: \"7fb0dcd0-9411-49d6-a997-79d2099b2462\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-2snpd" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.121108 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9-nmstate-lock\") pod \"nmstate-handler-47rs2\" (UID: \"fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9\") " pod="openshift-nmstate/nmstate-handler-47rs2" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.121126 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9-dbus-socket\") pod \"nmstate-handler-47rs2\" (UID: \"fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9\") " pod="openshift-nmstate/nmstate-handler-47rs2" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.121142 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lj6pt\" (UniqueName: \"kubernetes.io/projected/1b664fad-a0fa-4442-bed2-3316eafbb78c-kube-api-access-lj6pt\") pod \"nmstate-webhook-5f558f5558-xq48v\" (UID: \"1b664fad-a0fa-4442-bed2-3316eafbb78c\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-xq48v" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.127664 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/1b664fad-a0fa-4442-bed2-3316eafbb78c-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-xq48v\" (UID: \"1b664fad-a0fa-4442-bed2-3316eafbb78c\") " 
pod="openshift-nmstate/nmstate-webhook-5f558f5558-xq48v" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.143441 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj6pt\" (UniqueName: \"kubernetes.io/projected/1b664fad-a0fa-4442-bed2-3316eafbb78c-kube-api-access-lj6pt\") pod \"nmstate-webhook-5f558f5558-xq48v\" (UID: \"1b664fad-a0fa-4442-bed2-3316eafbb78c\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-xq48v" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.144129 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4fls\" (UniqueName: \"kubernetes.io/projected/7fb0dcd0-9411-49d6-a997-79d2099b2462-kube-api-access-m4fls\") pod \"nmstate-metrics-9b8c8685d-2snpd\" (UID: \"7fb0dcd0-9411-49d6-a997-79d2099b2462\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-2snpd" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.213128 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-2snpd" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.219770 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7cd5b84cfd-qgdq4"] Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.220826 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.221685 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-xq48v" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.222401 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a822f6ee-e723-4f64-b4f6-c948dc948359-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-px2gk\" (UID: \"a822f6ee-e723-4f64-b4f6-c948dc948359\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-px2gk" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.222494 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a822f6ee-e723-4f64-b4f6-c948dc948359-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-px2gk\" (UID: \"a822f6ee-e723-4f64-b4f6-c948dc948359\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-px2gk" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.222546 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9-nmstate-lock\") pod \"nmstate-handler-47rs2\" (UID: \"fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9\") " pod="openshift-nmstate/nmstate-handler-47rs2" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.222579 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9-dbus-socket\") pod \"nmstate-handler-47rs2\" (UID: \"fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9\") " pod="openshift-nmstate/nmstate-handler-47rs2" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.222633 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vclh8\" (UniqueName: \"kubernetes.io/projected/a822f6ee-e723-4f64-b4f6-c948dc948359-kube-api-access-vclh8\") 
pod \"nmstate-console-plugin-86f58fcf4-px2gk\" (UID: \"a822f6ee-e723-4f64-b4f6-c948dc948359\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-px2gk" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.222686 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9-ovs-socket\") pod \"nmstate-handler-47rs2\" (UID: \"fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9\") " pod="openshift-nmstate/nmstate-handler-47rs2" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.222722 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b958v\" (UniqueName: \"kubernetes.io/projected/fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9-kube-api-access-b958v\") pod \"nmstate-handler-47rs2\" (UID: \"fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9\") " pod="openshift-nmstate/nmstate-handler-47rs2" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.223303 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9-nmstate-lock\") pod \"nmstate-handler-47rs2\" (UID: \"fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9\") " pod="openshift-nmstate/nmstate-handler-47rs2" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.223560 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9-ovs-socket\") pod \"nmstate-handler-47rs2\" (UID: \"fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9\") " pod="openshift-nmstate/nmstate-handler-47rs2" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.223671 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9-dbus-socket\") pod \"nmstate-handler-47rs2\" (UID: \"fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9\") " 
pod="openshift-nmstate/nmstate-handler-47rs2" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.236409 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7cd5b84cfd-qgdq4"] Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.274480 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b958v\" (UniqueName: \"kubernetes.io/projected/fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9-kube-api-access-b958v\") pod \"nmstate-handler-47rs2\" (UID: \"fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9\") " pod="openshift-nmstate/nmstate-handler-47rs2" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.324170 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vclh8\" (UniqueName: \"kubernetes.io/projected/a822f6ee-e723-4f64-b4f6-c948dc948359-kube-api-access-vclh8\") pod \"nmstate-console-plugin-86f58fcf4-px2gk\" (UID: \"a822f6ee-e723-4f64-b4f6-c948dc948359\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-px2gk" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.324773 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2faec9d1-9173-4181-b887-2a375426ff16-trusted-ca-bundle\") pod \"console-7cd5b84cfd-qgdq4\" (UID: \"2faec9d1-9173-4181-b887-2a375426ff16\") " pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.324807 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2faec9d1-9173-4181-b887-2a375426ff16-console-serving-cert\") pod \"console-7cd5b84cfd-qgdq4\" (UID: \"2faec9d1-9173-4181-b887-2a375426ff16\") " pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.324827 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2faec9d1-9173-4181-b887-2a375426ff16-service-ca\") pod \"console-7cd5b84cfd-qgdq4\" (UID: \"2faec9d1-9173-4181-b887-2a375426ff16\") " pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.324914 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2faec9d1-9173-4181-b887-2a375426ff16-console-config\") pod \"console-7cd5b84cfd-qgdq4\" (UID: \"2faec9d1-9173-4181-b887-2a375426ff16\") " pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.324948 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a822f6ee-e723-4f64-b4f6-c948dc948359-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-px2gk\" (UID: \"a822f6ee-e723-4f64-b4f6-c948dc948359\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-px2gk" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.324970 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2faec9d1-9173-4181-b887-2a375426ff16-console-oauth-config\") pod \"console-7cd5b84cfd-qgdq4\" (UID: \"2faec9d1-9173-4181-b887-2a375426ff16\") " pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.325025 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a822f6ee-e723-4f64-b4f6-c948dc948359-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-px2gk\" (UID: \"a822f6ee-e723-4f64-b4f6-c948dc948359\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-px2gk" Mar 11 12:11:39 crc 
kubenswrapper[4816]: I0311 12:11:39.325061 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrbr5\" (UniqueName: \"kubernetes.io/projected/2faec9d1-9173-4181-b887-2a375426ff16-kube-api-access-xrbr5\") pod \"console-7cd5b84cfd-qgdq4\" (UID: \"2faec9d1-9173-4181-b887-2a375426ff16\") " pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.325115 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2faec9d1-9173-4181-b887-2a375426ff16-oauth-serving-cert\") pod \"console-7cd5b84cfd-qgdq4\" (UID: \"2faec9d1-9173-4181-b887-2a375426ff16\") " pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.327273 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a822f6ee-e723-4f64-b4f6-c948dc948359-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-px2gk\" (UID: \"a822f6ee-e723-4f64-b4f6-c948dc948359\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-px2gk" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.336223 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a822f6ee-e723-4f64-b4f6-c948dc948359-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-px2gk\" (UID: \"a822f6ee-e723-4f64-b4f6-c948dc948359\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-px2gk" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.348952 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vclh8\" (UniqueName: \"kubernetes.io/projected/a822f6ee-e723-4f64-b4f6-c948dc948359-kube-api-access-vclh8\") pod \"nmstate-console-plugin-86f58fcf4-px2gk\" (UID: \"a822f6ee-e723-4f64-b4f6-c948dc948359\") " 
pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-px2gk" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.364696 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-px2gk" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.426486 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrbr5\" (UniqueName: \"kubernetes.io/projected/2faec9d1-9173-4181-b887-2a375426ff16-kube-api-access-xrbr5\") pod \"console-7cd5b84cfd-qgdq4\" (UID: \"2faec9d1-9173-4181-b887-2a375426ff16\") " pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.426553 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2faec9d1-9173-4181-b887-2a375426ff16-oauth-serving-cert\") pod \"console-7cd5b84cfd-qgdq4\" (UID: \"2faec9d1-9173-4181-b887-2a375426ff16\") " pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.426591 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2faec9d1-9173-4181-b887-2a375426ff16-trusted-ca-bundle\") pod \"console-7cd5b84cfd-qgdq4\" (UID: \"2faec9d1-9173-4181-b887-2a375426ff16\") " pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.426618 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2faec9d1-9173-4181-b887-2a375426ff16-console-serving-cert\") pod \"console-7cd5b84cfd-qgdq4\" (UID: \"2faec9d1-9173-4181-b887-2a375426ff16\") " pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.426639 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2faec9d1-9173-4181-b887-2a375426ff16-service-ca\") pod \"console-7cd5b84cfd-qgdq4\" (UID: \"2faec9d1-9173-4181-b887-2a375426ff16\") " pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.426676 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2faec9d1-9173-4181-b887-2a375426ff16-console-config\") pod \"console-7cd5b84cfd-qgdq4\" (UID: \"2faec9d1-9173-4181-b887-2a375426ff16\") " pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.426701 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2faec9d1-9173-4181-b887-2a375426ff16-console-oauth-config\") pod \"console-7cd5b84cfd-qgdq4\" (UID: \"2faec9d1-9173-4181-b887-2a375426ff16\") " pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.428864 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2faec9d1-9173-4181-b887-2a375426ff16-service-ca\") pod \"console-7cd5b84cfd-qgdq4\" (UID: \"2faec9d1-9173-4181-b887-2a375426ff16\") " pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.428935 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2faec9d1-9173-4181-b887-2a375426ff16-console-config\") pod \"console-7cd5b84cfd-qgdq4\" (UID: \"2faec9d1-9173-4181-b887-2a375426ff16\") " pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.429522 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/2faec9d1-9173-4181-b887-2a375426ff16-trusted-ca-bundle\") pod \"console-7cd5b84cfd-qgdq4\" (UID: \"2faec9d1-9173-4181-b887-2a375426ff16\") " pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.430857 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2faec9d1-9173-4181-b887-2a375426ff16-oauth-serving-cert\") pod \"console-7cd5b84cfd-qgdq4\" (UID: \"2faec9d1-9173-4181-b887-2a375426ff16\") " pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.431568 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2faec9d1-9173-4181-b887-2a375426ff16-console-oauth-config\") pod \"console-7cd5b84cfd-qgdq4\" (UID: \"2faec9d1-9173-4181-b887-2a375426ff16\") " pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.438702 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2faec9d1-9173-4181-b887-2a375426ff16-console-serving-cert\") pod \"console-7cd5b84cfd-qgdq4\" (UID: \"2faec9d1-9173-4181-b887-2a375426ff16\") " pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.444351 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrbr5\" (UniqueName: \"kubernetes.io/projected/2faec9d1-9173-4181-b887-2a375426ff16-kube-api-access-xrbr5\") pod \"console-7cd5b84cfd-qgdq4\" (UID: \"2faec9d1-9173-4181-b887-2a375426ff16\") " pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.514959 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.515016 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.545804 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-47rs2" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.598022 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.616444 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-xq48v"] Mar 11 12:11:39 crc kubenswrapper[4816]: W0311 12:11:39.628060 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b664fad_a0fa_4442_bed2_3316eafbb78c.slice/crio-08d9d109ad0671f8142b1276a83b73e1da951251757dc5caf16e8cec867fd7ce WatchSource:0}: Error finding container 08d9d109ad0671f8142b1276a83b73e1da951251757dc5caf16e8cec867fd7ce: Status 404 returned error can't find the container with id 08d9d109ad0671f8142b1276a83b73e1da951251757dc5caf16e8cec867fd7ce Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.658083 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-px2gk"] Mar 11 12:11:39 crc kubenswrapper[4816]: W0311 12:11:39.663347 4816 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda822f6ee_e723_4f64_b4f6_c948dc948359.slice/crio-80e76120d350df905ae611a86507796cc0bc43f29afe54356e1793a2471dfaa2 WatchSource:0}: Error finding container 80e76120d350df905ae611a86507796cc0bc43f29afe54356e1793a2471dfaa2: Status 404 returned error can't find the container with id 80e76120d350df905ae611a86507796cc0bc43f29afe54356e1793a2471dfaa2 Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.731977 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-2snpd"] Mar 11 12:11:39 crc kubenswrapper[4816]: W0311 12:11:39.734091 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fb0dcd0_9411_49d6_a997_79d2099b2462.slice/crio-6819cb7b52ed5b74b408ed83c83d8239d6c4295a20a0ad70529fb675bb94dbeb WatchSource:0}: Error finding container 6819cb7b52ed5b74b408ed83c83d8239d6c4295a20a0ad70529fb675bb94dbeb: Status 404 returned error can't find the container with id 6819cb7b52ed5b74b408ed83c83d8239d6c4295a20a0ad70529fb675bb94dbeb Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.741360 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-px2gk" event={"ID":"a822f6ee-e723-4f64-b4f6-c948dc948359","Type":"ContainerStarted","Data":"80e76120d350df905ae611a86507796cc0bc43f29afe54356e1793a2471dfaa2"} Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.742568 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-47rs2" event={"ID":"fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9","Type":"ContainerStarted","Data":"dc13bf644351194920c39fd044bd74e79de536885e38c34e70b730aa1bdb3a6c"} Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.743611 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-xq48v" 
event={"ID":"1b664fad-a0fa-4442-bed2-3316eafbb78c","Type":"ContainerStarted","Data":"08d9d109ad0671f8142b1276a83b73e1da951251757dc5caf16e8cec867fd7ce"} Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.793281 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7cd5b84cfd-qgdq4"] Mar 11 12:11:39 crc kubenswrapper[4816]: W0311 12:11:39.802024 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2faec9d1_9173_4181_b887_2a375426ff16.slice/crio-067e7f4c96c162d9c49f51783a6d70a031538a88a941196258da04be5bd4d5dc WatchSource:0}: Error finding container 067e7f4c96c162d9c49f51783a6d70a031538a88a941196258da04be5bd4d5dc: Status 404 returned error can't find the container with id 067e7f4c96c162d9c49f51783a6d70a031538a88a941196258da04be5bd4d5dc Mar 11 12:11:40 crc kubenswrapper[4816]: I0311 12:11:40.750563 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-2snpd" event={"ID":"7fb0dcd0-9411-49d6-a997-79d2099b2462","Type":"ContainerStarted","Data":"6819cb7b52ed5b74b408ed83c83d8239d6c4295a20a0ad70529fb675bb94dbeb"} Mar 11 12:11:40 crc kubenswrapper[4816]: I0311 12:11:40.753949 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cd5b84cfd-qgdq4" event={"ID":"2faec9d1-9173-4181-b887-2a375426ff16","Type":"ContainerStarted","Data":"ee6d9f77d570cb657bb9b8a6ce31de1bb71f653dfece0faa1822af2c1f6108dc"} Mar 11 12:11:40 crc kubenswrapper[4816]: I0311 12:11:40.754027 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cd5b84cfd-qgdq4" event={"ID":"2faec9d1-9173-4181-b887-2a375426ff16","Type":"ContainerStarted","Data":"067e7f4c96c162d9c49f51783a6d70a031538a88a941196258da04be5bd4d5dc"} Mar 11 12:11:40 crc kubenswrapper[4816]: I0311 12:11:40.775240 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-console/console-7cd5b84cfd-qgdq4" podStartSLOduration=1.775222847 podStartE2EDuration="1.775222847s" podCreationTimestamp="2026-03-11 12:11:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:11:40.774503527 +0000 UTC m=+787.365767494" watchObservedRunningTime="2026-03-11 12:11:40.775222847 +0000 UTC m=+787.366486814" Mar 11 12:11:41 crc kubenswrapper[4816]: I0311 12:11:41.287901 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-c29dr" Mar 11 12:11:41 crc kubenswrapper[4816]: I0311 12:11:41.358796 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-c29dr" Mar 11 12:11:41 crc kubenswrapper[4816]: I0311 12:11:41.519658 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c29dr"] Mar 11 12:11:42 crc kubenswrapper[4816]: I0311 12:11:42.770535 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-c29dr" podUID="461ff7ea-1256-4928-9425-ed9840dc4eda" containerName="registry-server" containerID="cri-o://95dca1c91ddd21a7baf580c3a3e2aeb2e82668e3bf4d222da6549241684ccef9" gracePeriod=2 Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.314421 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c29dr" Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.489311 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/461ff7ea-1256-4928-9425-ed9840dc4eda-utilities\") pod \"461ff7ea-1256-4928-9425-ed9840dc4eda\" (UID: \"461ff7ea-1256-4928-9425-ed9840dc4eda\") " Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.489829 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/461ff7ea-1256-4928-9425-ed9840dc4eda-catalog-content\") pod \"461ff7ea-1256-4928-9425-ed9840dc4eda\" (UID: \"461ff7ea-1256-4928-9425-ed9840dc4eda\") " Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.490019 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqbqq\" (UniqueName: \"kubernetes.io/projected/461ff7ea-1256-4928-9425-ed9840dc4eda-kube-api-access-kqbqq\") pod \"461ff7ea-1256-4928-9425-ed9840dc4eda\" (UID: \"461ff7ea-1256-4928-9425-ed9840dc4eda\") " Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.490366 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/461ff7ea-1256-4928-9425-ed9840dc4eda-utilities" (OuterVolumeSpecName: "utilities") pod "461ff7ea-1256-4928-9425-ed9840dc4eda" (UID: "461ff7ea-1256-4928-9425-ed9840dc4eda"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.496271 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/461ff7ea-1256-4928-9425-ed9840dc4eda-kube-api-access-kqbqq" (OuterVolumeSpecName: "kube-api-access-kqbqq") pod "461ff7ea-1256-4928-9425-ed9840dc4eda" (UID: "461ff7ea-1256-4928-9425-ed9840dc4eda"). InnerVolumeSpecName "kube-api-access-kqbqq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.591368 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqbqq\" (UniqueName: \"kubernetes.io/projected/461ff7ea-1256-4928-9425-ed9840dc4eda-kube-api-access-kqbqq\") on node \"crc\" DevicePath \"\"" Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.592172 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/461ff7ea-1256-4928-9425-ed9840dc4eda-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.612332 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/461ff7ea-1256-4928-9425-ed9840dc4eda-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "461ff7ea-1256-4928-9425-ed9840dc4eda" (UID: "461ff7ea-1256-4928-9425-ed9840dc4eda"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.693980 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/461ff7ea-1256-4928-9425-ed9840dc4eda-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.779137 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-px2gk" event={"ID":"a822f6ee-e723-4f64-b4f6-c948dc948359","Type":"ContainerStarted","Data":"581df34bdea0a6240db949e261a44b234e0c12a9ea1a278bee32fb8c46da765e"} Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.782069 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-2snpd" event={"ID":"7fb0dcd0-9411-49d6-a997-79d2099b2462","Type":"ContainerStarted","Data":"1b2ce29c545c5e80c863bcc3ec847584bc5a65bfc8d54215b0f0739a53d5e3e1"} Mar 11 12:11:43 crc 
kubenswrapper[4816]: I0311 12:11:43.785287 4816 generic.go:334] "Generic (PLEG): container finished" podID="461ff7ea-1256-4928-9425-ed9840dc4eda" containerID="95dca1c91ddd21a7baf580c3a3e2aeb2e82668e3bf4d222da6549241684ccef9" exitCode=0 Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.785382 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c29dr" Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.785516 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c29dr" event={"ID":"461ff7ea-1256-4928-9425-ed9840dc4eda","Type":"ContainerDied","Data":"95dca1c91ddd21a7baf580c3a3e2aeb2e82668e3bf4d222da6549241684ccef9"} Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.785740 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c29dr" event={"ID":"461ff7ea-1256-4928-9425-ed9840dc4eda","Type":"ContainerDied","Data":"f55816d844fceb14f1b2e972a2c499707de36af2a78a8f7ba4306a161a99924b"} Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.785781 4816 scope.go:117] "RemoveContainer" containerID="95dca1c91ddd21a7baf580c3a3e2aeb2e82668e3bf4d222da6549241684ccef9" Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.787548 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-47rs2" event={"ID":"fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9","Type":"ContainerStarted","Data":"ab21c0dd78bbf18d58501846c858efb5e3c52c65baf38bbd2a2a28708397c4cf"} Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.787740 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-47rs2" Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.791764 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-xq48v" 
event={"ID":"1b664fad-a0fa-4442-bed2-3316eafbb78c","Type":"ContainerStarted","Data":"0acffc922382d3b43195460b16c202cccbefd81ad991cb4fa6c1422bdd5412eb"} Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.792587 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-xq48v" Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.802113 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-px2gk" podStartSLOduration=1.334577114 podStartE2EDuration="4.802096226s" podCreationTimestamp="2026-03-11 12:11:39 +0000 UTC" firstStartedPulling="2026-03-11 12:11:39.666070001 +0000 UTC m=+786.257333968" lastFinishedPulling="2026-03-11 12:11:43.133589103 +0000 UTC m=+789.724853080" observedRunningTime="2026-03-11 12:11:43.799962356 +0000 UTC m=+790.391226313" watchObservedRunningTime="2026-03-11 12:11:43.802096226 +0000 UTC m=+790.393360203" Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.819041 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-47rs2" podStartSLOduration=2.219522233 podStartE2EDuration="5.819019358s" podCreationTimestamp="2026-03-11 12:11:38 +0000 UTC" firstStartedPulling="2026-03-11 12:11:39.568597219 +0000 UTC m=+786.159861186" lastFinishedPulling="2026-03-11 12:11:43.168094334 +0000 UTC m=+789.759358311" observedRunningTime="2026-03-11 12:11:43.818487603 +0000 UTC m=+790.409751570" watchObservedRunningTime="2026-03-11 12:11:43.819019358 +0000 UTC m=+790.410283325" Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.824445 4816 scope.go:117] "RemoveContainer" containerID="4f13015aaf50ff3e2d23b9c5a185d266297979cbb0afdf2142cff44ba6662d5a" Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.843863 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-xq48v" 
podStartSLOduration=2.340690259 podStartE2EDuration="5.843844874s" podCreationTimestamp="2026-03-11 12:11:38 +0000 UTC" firstStartedPulling="2026-03-11 12:11:39.631761515 +0000 UTC m=+786.223025482" lastFinishedPulling="2026-03-11 12:11:43.13491613 +0000 UTC m=+789.726180097" observedRunningTime="2026-03-11 12:11:43.83809101 +0000 UTC m=+790.429354977" watchObservedRunningTime="2026-03-11 12:11:43.843844874 +0000 UTC m=+790.435108841" Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.852760 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c29dr"] Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.860033 4816 scope.go:117] "RemoveContainer" containerID="003ac48122b295d5bfa357b7ac23c84d4d289972e7886ac49dd30b9a43ab8fd3" Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.873225 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-c29dr"] Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.881049 4816 scope.go:117] "RemoveContainer" containerID="95dca1c91ddd21a7baf580c3a3e2aeb2e82668e3bf4d222da6549241684ccef9" Mar 11 12:11:43 crc kubenswrapper[4816]: E0311 12:11:43.881624 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95dca1c91ddd21a7baf580c3a3e2aeb2e82668e3bf4d222da6549241684ccef9\": container with ID starting with 95dca1c91ddd21a7baf580c3a3e2aeb2e82668e3bf4d222da6549241684ccef9 not found: ID does not exist" containerID="95dca1c91ddd21a7baf580c3a3e2aeb2e82668e3bf4d222da6549241684ccef9" Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.881663 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95dca1c91ddd21a7baf580c3a3e2aeb2e82668e3bf4d222da6549241684ccef9"} err="failed to get container status \"95dca1c91ddd21a7baf580c3a3e2aeb2e82668e3bf4d222da6549241684ccef9\": rpc error: code = NotFound desc = could not find container 
\"95dca1c91ddd21a7baf580c3a3e2aeb2e82668e3bf4d222da6549241684ccef9\": container with ID starting with 95dca1c91ddd21a7baf580c3a3e2aeb2e82668e3bf4d222da6549241684ccef9 not found: ID does not exist" Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.881694 4816 scope.go:117] "RemoveContainer" containerID="4f13015aaf50ff3e2d23b9c5a185d266297979cbb0afdf2142cff44ba6662d5a" Mar 11 12:11:43 crc kubenswrapper[4816]: E0311 12:11:43.882101 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f13015aaf50ff3e2d23b9c5a185d266297979cbb0afdf2142cff44ba6662d5a\": container with ID starting with 4f13015aaf50ff3e2d23b9c5a185d266297979cbb0afdf2142cff44ba6662d5a not found: ID does not exist" containerID="4f13015aaf50ff3e2d23b9c5a185d266297979cbb0afdf2142cff44ba6662d5a" Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.882148 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f13015aaf50ff3e2d23b9c5a185d266297979cbb0afdf2142cff44ba6662d5a"} err="failed to get container status \"4f13015aaf50ff3e2d23b9c5a185d266297979cbb0afdf2142cff44ba6662d5a\": rpc error: code = NotFound desc = could not find container \"4f13015aaf50ff3e2d23b9c5a185d266297979cbb0afdf2142cff44ba6662d5a\": container with ID starting with 4f13015aaf50ff3e2d23b9c5a185d266297979cbb0afdf2142cff44ba6662d5a not found: ID does not exist" Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.882174 4816 scope.go:117] "RemoveContainer" containerID="003ac48122b295d5bfa357b7ac23c84d4d289972e7886ac49dd30b9a43ab8fd3" Mar 11 12:11:43 crc kubenswrapper[4816]: E0311 12:11:43.882636 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"003ac48122b295d5bfa357b7ac23c84d4d289972e7886ac49dd30b9a43ab8fd3\": container with ID starting with 003ac48122b295d5bfa357b7ac23c84d4d289972e7886ac49dd30b9a43ab8fd3 not found: ID does not exist" 
containerID="003ac48122b295d5bfa357b7ac23c84d4d289972e7886ac49dd30b9a43ab8fd3" Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.882661 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"003ac48122b295d5bfa357b7ac23c84d4d289972e7886ac49dd30b9a43ab8fd3"} err="failed to get container status \"003ac48122b295d5bfa357b7ac23c84d4d289972e7886ac49dd30b9a43ab8fd3\": rpc error: code = NotFound desc = could not find container \"003ac48122b295d5bfa357b7ac23c84d4d289972e7886ac49dd30b9a43ab8fd3\": container with ID starting with 003ac48122b295d5bfa357b7ac23c84d4d289972e7886ac49dd30b9a43ab8fd3 not found: ID does not exist" Mar 11 12:11:44 crc kubenswrapper[4816]: I0311 12:11:44.153060 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="461ff7ea-1256-4928-9425-ed9840dc4eda" path="/var/lib/kubelet/pods/461ff7ea-1256-4928-9425-ed9840dc4eda/volumes" Mar 11 12:11:46 crc kubenswrapper[4816]: I0311 12:11:46.817216 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-2snpd" event={"ID":"7fb0dcd0-9411-49d6-a997-79d2099b2462","Type":"ContainerStarted","Data":"ec87d8bb7f0f902050146f6c79fc3f469dea06afd48547bc35ede11e2e3f4512"} Mar 11 12:11:46 crc kubenswrapper[4816]: I0311 12:11:46.866498 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-2snpd" podStartSLOduration=2.221221 podStartE2EDuration="8.866452081s" podCreationTimestamp="2026-03-11 12:11:38 +0000 UTC" firstStartedPulling="2026-03-11 12:11:39.735388852 +0000 UTC m=+786.326652819" lastFinishedPulling="2026-03-11 12:11:46.380619933 +0000 UTC m=+792.971883900" observedRunningTime="2026-03-11 12:11:46.843270812 +0000 UTC m=+793.434534779" watchObservedRunningTime="2026-03-11 12:11:46.866452081 +0000 UTC m=+793.457716098" Mar 11 12:11:49 crc kubenswrapper[4816]: I0311 12:11:49.582984 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-nmstate/nmstate-handler-47rs2" Mar 11 12:11:49 crc kubenswrapper[4816]: I0311 12:11:49.600636 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:49 crc kubenswrapper[4816]: I0311 12:11:49.600750 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:49 crc kubenswrapper[4816]: I0311 12:11:49.610456 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:49 crc kubenswrapper[4816]: I0311 12:11:49.847889 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:49 crc kubenswrapper[4816]: I0311 12:11:49.925059 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-blgl4"] Mar 11 12:11:59 crc kubenswrapper[4816]: I0311 12:11:59.232631 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-xq48v" Mar 11 12:12:00 crc kubenswrapper[4816]: I0311 12:12:00.127851 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553852-tvtrs"] Mar 11 12:12:00 crc kubenswrapper[4816]: E0311 12:12:00.128110 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="461ff7ea-1256-4928-9425-ed9840dc4eda" containerName="extract-utilities" Mar 11 12:12:00 crc kubenswrapper[4816]: I0311 12:12:00.128123 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="461ff7ea-1256-4928-9425-ed9840dc4eda" containerName="extract-utilities" Mar 11 12:12:00 crc kubenswrapper[4816]: E0311 12:12:00.128140 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="461ff7ea-1256-4928-9425-ed9840dc4eda" containerName="registry-server" Mar 11 12:12:00 crc kubenswrapper[4816]: I0311 
12:12:00.128146 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="461ff7ea-1256-4928-9425-ed9840dc4eda" containerName="registry-server" Mar 11 12:12:00 crc kubenswrapper[4816]: E0311 12:12:00.128154 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="461ff7ea-1256-4928-9425-ed9840dc4eda" containerName="extract-content" Mar 11 12:12:00 crc kubenswrapper[4816]: I0311 12:12:00.128161 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="461ff7ea-1256-4928-9425-ed9840dc4eda" containerName="extract-content" Mar 11 12:12:00 crc kubenswrapper[4816]: I0311 12:12:00.128285 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="461ff7ea-1256-4928-9425-ed9840dc4eda" containerName="registry-server" Mar 11 12:12:00 crc kubenswrapper[4816]: I0311 12:12:00.128683 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553852-tvtrs" Mar 11 12:12:00 crc kubenswrapper[4816]: I0311 12:12:00.131566 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 12:12:00 crc kubenswrapper[4816]: I0311 12:12:00.131739 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 12:12:00 crc kubenswrapper[4816]: I0311 12:12:00.135550 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xbn9\" (UniqueName: \"kubernetes.io/projected/f1d15245-e206-4f60-a05c-9888a45a1aca-kube-api-access-6xbn9\") pod \"auto-csr-approver-29553852-tvtrs\" (UID: \"f1d15245-e206-4f60-a05c-9888a45a1aca\") " pod="openshift-infra/auto-csr-approver-29553852-tvtrs" Mar 11 12:12:00 crc kubenswrapper[4816]: I0311 12:12:00.135593 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 12:12:00 crc kubenswrapper[4816]: I0311 12:12:00.150173 4816 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553852-tvtrs"] Mar 11 12:12:00 crc kubenswrapper[4816]: I0311 12:12:00.237239 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xbn9\" (UniqueName: \"kubernetes.io/projected/f1d15245-e206-4f60-a05c-9888a45a1aca-kube-api-access-6xbn9\") pod \"auto-csr-approver-29553852-tvtrs\" (UID: \"f1d15245-e206-4f60-a05c-9888a45a1aca\") " pod="openshift-infra/auto-csr-approver-29553852-tvtrs" Mar 11 12:12:00 crc kubenswrapper[4816]: I0311 12:12:00.260381 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xbn9\" (UniqueName: \"kubernetes.io/projected/f1d15245-e206-4f60-a05c-9888a45a1aca-kube-api-access-6xbn9\") pod \"auto-csr-approver-29553852-tvtrs\" (UID: \"f1d15245-e206-4f60-a05c-9888a45a1aca\") " pod="openshift-infra/auto-csr-approver-29553852-tvtrs" Mar 11 12:12:00 crc kubenswrapper[4816]: I0311 12:12:00.447901 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553852-tvtrs" Mar 11 12:12:00 crc kubenswrapper[4816]: I0311 12:12:00.655771 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553852-tvtrs"] Mar 11 12:12:00 crc kubenswrapper[4816]: W0311 12:12:00.662079 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1d15245_e206_4f60_a05c_9888a45a1aca.slice/crio-9fd0366cac3cfbf4773d8fb37507e0ae9de9dd8d399a8112b07e635776aac636 WatchSource:0}: Error finding container 9fd0366cac3cfbf4773d8fb37507e0ae9de9dd8d399a8112b07e635776aac636: Status 404 returned error can't find the container with id 9fd0366cac3cfbf4773d8fb37507e0ae9de9dd8d399a8112b07e635776aac636 Mar 11 12:12:00 crc kubenswrapper[4816]: I0311 12:12:00.919136 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553852-tvtrs" event={"ID":"f1d15245-e206-4f60-a05c-9888a45a1aca","Type":"ContainerStarted","Data":"9fd0366cac3cfbf4773d8fb37507e0ae9de9dd8d399a8112b07e635776aac636"} Mar 11 12:12:02 crc kubenswrapper[4816]: I0311 12:12:02.931666 4816 generic.go:334] "Generic (PLEG): container finished" podID="f1d15245-e206-4f60-a05c-9888a45a1aca" containerID="258e8c83fc2dd9e9c165c147a3085d310c4de5d771038f237098b4b3a09178a8" exitCode=0 Mar 11 12:12:02 crc kubenswrapper[4816]: I0311 12:12:02.931814 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553852-tvtrs" event={"ID":"f1d15245-e206-4f60-a05c-9888a45a1aca","Type":"ContainerDied","Data":"258e8c83fc2dd9e9c165c147a3085d310c4de5d771038f237098b4b3a09178a8"} Mar 11 12:12:04 crc kubenswrapper[4816]: I0311 12:12:04.159872 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553852-tvtrs" Mar 11 12:12:04 crc kubenswrapper[4816]: I0311 12:12:04.294184 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xbn9\" (UniqueName: \"kubernetes.io/projected/f1d15245-e206-4f60-a05c-9888a45a1aca-kube-api-access-6xbn9\") pod \"f1d15245-e206-4f60-a05c-9888a45a1aca\" (UID: \"f1d15245-e206-4f60-a05c-9888a45a1aca\") " Mar 11 12:12:04 crc kubenswrapper[4816]: I0311 12:12:04.299585 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1d15245-e206-4f60-a05c-9888a45a1aca-kube-api-access-6xbn9" (OuterVolumeSpecName: "kube-api-access-6xbn9") pod "f1d15245-e206-4f60-a05c-9888a45a1aca" (UID: "f1d15245-e206-4f60-a05c-9888a45a1aca"). InnerVolumeSpecName "kube-api-access-6xbn9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:12:04 crc kubenswrapper[4816]: I0311 12:12:04.396772 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xbn9\" (UniqueName: \"kubernetes.io/projected/f1d15245-e206-4f60-a05c-9888a45a1aca-kube-api-access-6xbn9\") on node \"crc\" DevicePath \"\"" Mar 11 12:12:04 crc kubenswrapper[4816]: I0311 12:12:04.947221 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553852-tvtrs" event={"ID":"f1d15245-e206-4f60-a05c-9888a45a1aca","Type":"ContainerDied","Data":"9fd0366cac3cfbf4773d8fb37507e0ae9de9dd8d399a8112b07e635776aac636"} Mar 11 12:12:04 crc kubenswrapper[4816]: I0311 12:12:04.947315 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9fd0366cac3cfbf4773d8fb37507e0ae9de9dd8d399a8112b07e635776aac636" Mar 11 12:12:04 crc kubenswrapper[4816]: I0311 12:12:04.947381 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553852-tvtrs" Mar 11 12:12:05 crc kubenswrapper[4816]: I0311 12:12:05.234216 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553846-v5vlq"] Mar 11 12:12:05 crc kubenswrapper[4816]: I0311 12:12:05.241299 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553846-v5vlq"] Mar 11 12:12:06 crc kubenswrapper[4816]: I0311 12:12:06.142194 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8860b8d2-719a-4930-9df3-d0bc14d8de19" path="/var/lib/kubelet/pods/8860b8d2-719a-4930-9df3-d0bc14d8de19/volumes" Mar 11 12:12:09 crc kubenswrapper[4816]: I0311 12:12:09.515391 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:12:09 crc kubenswrapper[4816]: I0311 12:12:09.516550 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:12:12 crc kubenswrapper[4816]: I0311 12:12:12.090171 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8"] Mar 11 12:12:12 crc kubenswrapper[4816]: E0311 12:12:12.091678 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1d15245-e206-4f60-a05c-9888a45a1aca" containerName="oc" Mar 11 12:12:12 crc kubenswrapper[4816]: I0311 12:12:12.091698 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1d15245-e206-4f60-a05c-9888a45a1aca" 
containerName="oc" Mar 11 12:12:12 crc kubenswrapper[4816]: I0311 12:12:12.091844 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1d15245-e206-4f60-a05c-9888a45a1aca" containerName="oc" Mar 11 12:12:12 crc kubenswrapper[4816]: I0311 12:12:12.093169 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8" Mar 11 12:12:12 crc kubenswrapper[4816]: I0311 12:12:12.097728 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 11 12:12:12 crc kubenswrapper[4816]: I0311 12:12:12.101018 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8"] Mar 11 12:12:12 crc kubenswrapper[4816]: I0311 12:12:12.111951 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5dff60f3-3acf-4dfb-9098-917736f61c0c-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8\" (UID: \"5dff60f3-3acf-4dfb-9098-917736f61c0c\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8" Mar 11 12:12:12 crc kubenswrapper[4816]: I0311 12:12:12.111997 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dtbn\" (UniqueName: \"kubernetes.io/projected/5dff60f3-3acf-4dfb-9098-917736f61c0c-kube-api-access-4dtbn\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8\" (UID: \"5dff60f3-3acf-4dfb-9098-917736f61c0c\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8" Mar 11 12:12:12 crc kubenswrapper[4816]: I0311 12:12:12.112041 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/5dff60f3-3acf-4dfb-9098-917736f61c0c-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8\" (UID: \"5dff60f3-3acf-4dfb-9098-917736f61c0c\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8" Mar 11 12:12:12 crc kubenswrapper[4816]: I0311 12:12:12.212940 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dtbn\" (UniqueName: \"kubernetes.io/projected/5dff60f3-3acf-4dfb-9098-917736f61c0c-kube-api-access-4dtbn\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8\" (UID: \"5dff60f3-3acf-4dfb-9098-917736f61c0c\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8" Mar 11 12:12:12 crc kubenswrapper[4816]: I0311 12:12:12.213019 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5dff60f3-3acf-4dfb-9098-917736f61c0c-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8\" (UID: \"5dff60f3-3acf-4dfb-9098-917736f61c0c\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8" Mar 11 12:12:12 crc kubenswrapper[4816]: I0311 12:12:12.213145 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5dff60f3-3acf-4dfb-9098-917736f61c0c-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8\" (UID: \"5dff60f3-3acf-4dfb-9098-917736f61c0c\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8" Mar 11 12:12:12 crc kubenswrapper[4816]: I0311 12:12:12.213896 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5dff60f3-3acf-4dfb-9098-917736f61c0c-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8\" (UID: 
\"5dff60f3-3acf-4dfb-9098-917736f61c0c\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8" Mar 11 12:12:12 crc kubenswrapper[4816]: I0311 12:12:12.213917 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5dff60f3-3acf-4dfb-9098-917736f61c0c-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8\" (UID: \"5dff60f3-3acf-4dfb-9098-917736f61c0c\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8" Mar 11 12:12:12 crc kubenswrapper[4816]: I0311 12:12:12.233583 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dtbn\" (UniqueName: \"kubernetes.io/projected/5dff60f3-3acf-4dfb-9098-917736f61c0c-kube-api-access-4dtbn\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8\" (UID: \"5dff60f3-3acf-4dfb-9098-917736f61c0c\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8" Mar 11 12:12:12 crc kubenswrapper[4816]: I0311 12:12:12.415874 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8" Mar 11 12:12:12 crc kubenswrapper[4816]: I0311 12:12:12.649112 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8"] Mar 11 12:12:13 crc kubenswrapper[4816]: I0311 12:12:13.007080 4816 generic.go:334] "Generic (PLEG): container finished" podID="5dff60f3-3acf-4dfb-9098-917736f61c0c" containerID="82302b4cf683ff043cbae0c1b55f06bd328dde371bf2d875c664d24a7167590a" exitCode=0 Mar 11 12:12:13 crc kubenswrapper[4816]: I0311 12:12:13.007146 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8" event={"ID":"5dff60f3-3acf-4dfb-9098-917736f61c0c","Type":"ContainerDied","Data":"82302b4cf683ff043cbae0c1b55f06bd328dde371bf2d875c664d24a7167590a"} Mar 11 12:12:13 crc kubenswrapper[4816]: I0311 12:12:13.007206 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8" event={"ID":"5dff60f3-3acf-4dfb-9098-917736f61c0c","Type":"ContainerStarted","Data":"f829741e1ad8b32117f692ad3039b9990bd12ea53fdc942a98b351dfc5dfabe3"} Mar 11 12:12:14 crc kubenswrapper[4816]: I0311 12:12:14.979401 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-blgl4" podUID="efc988f7-8a1a-4d22-b6bb-b2617c721017" containerName="console" containerID="cri-o://95ff1646ffa7c04a2ecc19c185617a275f308d36bb7c3ee54b57d9bc7db028ec" gracePeriod=15 Mar 11 12:12:15 crc kubenswrapper[4816]: I0311 12:12:15.426855 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-blgl4_efc988f7-8a1a-4d22-b6bb-b2617c721017/console/0.log" Mar 11 12:12:15 crc kubenswrapper[4816]: I0311 12:12:15.427549 4816 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-console/console-f9d7485db-blgl4" Mar 11 12:12:15 crc kubenswrapper[4816]: I0311 12:12:15.471441 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nngrf\" (UniqueName: \"kubernetes.io/projected/efc988f7-8a1a-4d22-b6bb-b2617c721017-kube-api-access-nngrf\") pod \"efc988f7-8a1a-4d22-b6bb-b2617c721017\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " Mar 11 12:12:15 crc kubenswrapper[4816]: I0311 12:12:15.471820 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/efc988f7-8a1a-4d22-b6bb-b2617c721017-oauth-serving-cert\") pod \"efc988f7-8a1a-4d22-b6bb-b2617c721017\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " Mar 11 12:12:15 crc kubenswrapper[4816]: I0311 12:12:15.472002 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/efc988f7-8a1a-4d22-b6bb-b2617c721017-console-oauth-config\") pod \"efc988f7-8a1a-4d22-b6bb-b2617c721017\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " Mar 11 12:12:15 crc kubenswrapper[4816]: I0311 12:12:15.472109 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/efc988f7-8a1a-4d22-b6bb-b2617c721017-service-ca\") pod \"efc988f7-8a1a-4d22-b6bb-b2617c721017\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " Mar 11 12:12:15 crc kubenswrapper[4816]: I0311 12:12:15.472278 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/efc988f7-8a1a-4d22-b6bb-b2617c721017-console-serving-cert\") pod \"efc988f7-8a1a-4d22-b6bb-b2617c721017\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " Mar 11 12:12:15 crc kubenswrapper[4816]: I0311 12:12:15.472379 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/efc988f7-8a1a-4d22-b6bb-b2617c721017-console-config\") pod \"efc988f7-8a1a-4d22-b6bb-b2617c721017\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " Mar 11 12:12:15 crc kubenswrapper[4816]: I0311 12:12:15.472879 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efc988f7-8a1a-4d22-b6bb-b2617c721017-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "efc988f7-8a1a-4d22-b6bb-b2617c721017" (UID: "efc988f7-8a1a-4d22-b6bb-b2617c721017"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:12:15 crc kubenswrapper[4816]: I0311 12:12:15.473113 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efc988f7-8a1a-4d22-b6bb-b2617c721017-console-config" (OuterVolumeSpecName: "console-config") pod "efc988f7-8a1a-4d22-b6bb-b2617c721017" (UID: "efc988f7-8a1a-4d22-b6bb-b2617c721017"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:12:15 crc kubenswrapper[4816]: I0311 12:12:15.473189 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efc988f7-8a1a-4d22-b6bb-b2617c721017-service-ca" (OuterVolumeSpecName: "service-ca") pod "efc988f7-8a1a-4d22-b6bb-b2617c721017" (UID: "efc988f7-8a1a-4d22-b6bb-b2617c721017"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:12:15 crc kubenswrapper[4816]: I0311 12:12:15.473378 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/efc988f7-8a1a-4d22-b6bb-b2617c721017-trusted-ca-bundle\") pod \"efc988f7-8a1a-4d22-b6bb-b2617c721017\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " Mar 11 12:12:15 crc kubenswrapper[4816]: I0311 12:12:15.473858 4816 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/efc988f7-8a1a-4d22-b6bb-b2617c721017-console-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:12:15 crc kubenswrapper[4816]: I0311 12:12:15.473877 4816 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/efc988f7-8a1a-4d22-b6bb-b2617c721017-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 12:12:15 crc kubenswrapper[4816]: I0311 12:12:15.473886 4816 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/efc988f7-8a1a-4d22-b6bb-b2617c721017-service-ca\") on node \"crc\" DevicePath \"\"" Mar 11 12:12:15 crc kubenswrapper[4816]: I0311 12:12:15.473984 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efc988f7-8a1a-4d22-b6bb-b2617c721017-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "efc988f7-8a1a-4d22-b6bb-b2617c721017" (UID: "efc988f7-8a1a-4d22-b6bb-b2617c721017"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:12:15 crc kubenswrapper[4816]: I0311 12:12:15.480147 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efc988f7-8a1a-4d22-b6bb-b2617c721017-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "efc988f7-8a1a-4d22-b6bb-b2617c721017" (UID: "efc988f7-8a1a-4d22-b6bb-b2617c721017"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:12:15 crc kubenswrapper[4816]: I0311 12:12:15.481389 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efc988f7-8a1a-4d22-b6bb-b2617c721017-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "efc988f7-8a1a-4d22-b6bb-b2617c721017" (UID: "efc988f7-8a1a-4d22-b6bb-b2617c721017"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:12:15 crc kubenswrapper[4816]: I0311 12:12:15.481731 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efc988f7-8a1a-4d22-b6bb-b2617c721017-kube-api-access-nngrf" (OuterVolumeSpecName: "kube-api-access-nngrf") pod "efc988f7-8a1a-4d22-b6bb-b2617c721017" (UID: "efc988f7-8a1a-4d22-b6bb-b2617c721017"). InnerVolumeSpecName "kube-api-access-nngrf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:12:15 crc kubenswrapper[4816]: I0311 12:12:15.576212 4816 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/efc988f7-8a1a-4d22-b6bb-b2617c721017-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:12:15 crc kubenswrapper[4816]: I0311 12:12:15.576651 4816 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/efc988f7-8a1a-4d22-b6bb-b2617c721017-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 12:12:15 crc kubenswrapper[4816]: I0311 12:12:15.576784 4816 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/efc988f7-8a1a-4d22-b6bb-b2617c721017-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:12:15 crc kubenswrapper[4816]: I0311 12:12:15.577094 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nngrf\" (UniqueName: \"kubernetes.io/projected/efc988f7-8a1a-4d22-b6bb-b2617c721017-kube-api-access-nngrf\") on node \"crc\" DevicePath \"\"" Mar 11 12:12:16 crc kubenswrapper[4816]: I0311 12:12:16.033462 4816 generic.go:334] "Generic (PLEG): container finished" podID="5dff60f3-3acf-4dfb-9098-917736f61c0c" containerID="31e55dad710745ed804d06e0331068676e568a5f49ffe62b8d7bae286d8e48a8" exitCode=0 Mar 11 12:12:16 crc kubenswrapper[4816]: I0311 12:12:16.033605 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8" event={"ID":"5dff60f3-3acf-4dfb-9098-917736f61c0c","Type":"ContainerDied","Data":"31e55dad710745ed804d06e0331068676e568a5f49ffe62b8d7bae286d8e48a8"} Mar 11 12:12:16 crc kubenswrapper[4816]: I0311 12:12:16.038812 4816 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-f9d7485db-blgl4_efc988f7-8a1a-4d22-b6bb-b2617c721017/console/0.log" Mar 11 12:12:16 crc kubenswrapper[4816]: I0311 12:12:16.039630 4816 generic.go:334] "Generic (PLEG): container finished" podID="efc988f7-8a1a-4d22-b6bb-b2617c721017" containerID="95ff1646ffa7c04a2ecc19c185617a275f308d36bb7c3ee54b57d9bc7db028ec" exitCode=2 Mar 11 12:12:16 crc kubenswrapper[4816]: I0311 12:12:16.039705 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-blgl4" event={"ID":"efc988f7-8a1a-4d22-b6bb-b2617c721017","Type":"ContainerDied","Data":"95ff1646ffa7c04a2ecc19c185617a275f308d36bb7c3ee54b57d9bc7db028ec"} Mar 11 12:12:16 crc kubenswrapper[4816]: I0311 12:12:16.039765 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-blgl4" event={"ID":"efc988f7-8a1a-4d22-b6bb-b2617c721017","Type":"ContainerDied","Data":"59a99708271969fdd60bd64b8768b6f0fa05af801e0f7d034beaae8d3d4be471"} Mar 11 12:12:16 crc kubenswrapper[4816]: I0311 12:12:16.039804 4816 scope.go:117] "RemoveContainer" containerID="95ff1646ffa7c04a2ecc19c185617a275f308d36bb7c3ee54b57d9bc7db028ec" Mar 11 12:12:16 crc kubenswrapper[4816]: I0311 12:12:16.040871 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-blgl4" Mar 11 12:12:16 crc kubenswrapper[4816]: I0311 12:12:16.076630 4816 scope.go:117] "RemoveContainer" containerID="95ff1646ffa7c04a2ecc19c185617a275f308d36bb7c3ee54b57d9bc7db028ec" Mar 11 12:12:16 crc kubenswrapper[4816]: E0311 12:12:16.077628 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95ff1646ffa7c04a2ecc19c185617a275f308d36bb7c3ee54b57d9bc7db028ec\": container with ID starting with 95ff1646ffa7c04a2ecc19c185617a275f308d36bb7c3ee54b57d9bc7db028ec not found: ID does not exist" containerID="95ff1646ffa7c04a2ecc19c185617a275f308d36bb7c3ee54b57d9bc7db028ec" Mar 11 12:12:16 crc kubenswrapper[4816]: I0311 12:12:16.077731 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95ff1646ffa7c04a2ecc19c185617a275f308d36bb7c3ee54b57d9bc7db028ec"} err="failed to get container status \"95ff1646ffa7c04a2ecc19c185617a275f308d36bb7c3ee54b57d9bc7db028ec\": rpc error: code = NotFound desc = could not find container \"95ff1646ffa7c04a2ecc19c185617a275f308d36bb7c3ee54b57d9bc7db028ec\": container with ID starting with 95ff1646ffa7c04a2ecc19c185617a275f308d36bb7c3ee54b57d9bc7db028ec not found: ID does not exist" Mar 11 12:12:16 crc kubenswrapper[4816]: I0311 12:12:16.159846 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-blgl4"] Mar 11 12:12:16 crc kubenswrapper[4816]: I0311 12:12:16.167899 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-blgl4"] Mar 11 12:12:17 crc kubenswrapper[4816]: I0311 12:12:17.050192 4816 generic.go:334] "Generic (PLEG): container finished" podID="5dff60f3-3acf-4dfb-9098-917736f61c0c" containerID="52eac97111b4b75a9861e04ef8afb3cfe3ad6fa5a574b7b896f5fa95a5a9fb59" exitCode=0 Mar 11 12:12:17 crc kubenswrapper[4816]: I0311 12:12:17.050328 4816 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8" event={"ID":"5dff60f3-3acf-4dfb-9098-917736f61c0c","Type":"ContainerDied","Data":"52eac97111b4b75a9861e04ef8afb3cfe3ad6fa5a574b7b896f5fa95a5a9fb59"} Mar 11 12:12:18 crc kubenswrapper[4816]: I0311 12:12:18.139746 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efc988f7-8a1a-4d22-b6bb-b2617c721017" path="/var/lib/kubelet/pods/efc988f7-8a1a-4d22-b6bb-b2617c721017/volumes" Mar 11 12:12:18 crc kubenswrapper[4816]: I0311 12:12:18.348634 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8" Mar 11 12:12:18 crc kubenswrapper[4816]: I0311 12:12:18.421758 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dtbn\" (UniqueName: \"kubernetes.io/projected/5dff60f3-3acf-4dfb-9098-917736f61c0c-kube-api-access-4dtbn\") pod \"5dff60f3-3acf-4dfb-9098-917736f61c0c\" (UID: \"5dff60f3-3acf-4dfb-9098-917736f61c0c\") " Mar 11 12:12:18 crc kubenswrapper[4816]: I0311 12:12:18.422124 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5dff60f3-3acf-4dfb-9098-917736f61c0c-util\") pod \"5dff60f3-3acf-4dfb-9098-917736f61c0c\" (UID: \"5dff60f3-3acf-4dfb-9098-917736f61c0c\") " Mar 11 12:12:18 crc kubenswrapper[4816]: I0311 12:12:18.422274 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5dff60f3-3acf-4dfb-9098-917736f61c0c-bundle\") pod \"5dff60f3-3acf-4dfb-9098-917736f61c0c\" (UID: \"5dff60f3-3acf-4dfb-9098-917736f61c0c\") " Mar 11 12:12:18 crc kubenswrapper[4816]: I0311 12:12:18.423308 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/5dff60f3-3acf-4dfb-9098-917736f61c0c-bundle" (OuterVolumeSpecName: "bundle") pod "5dff60f3-3acf-4dfb-9098-917736f61c0c" (UID: "5dff60f3-3acf-4dfb-9098-917736f61c0c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:12:18 crc kubenswrapper[4816]: I0311 12:12:18.428605 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dff60f3-3acf-4dfb-9098-917736f61c0c-kube-api-access-4dtbn" (OuterVolumeSpecName: "kube-api-access-4dtbn") pod "5dff60f3-3acf-4dfb-9098-917736f61c0c" (UID: "5dff60f3-3acf-4dfb-9098-917736f61c0c"). InnerVolumeSpecName "kube-api-access-4dtbn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:12:18 crc kubenswrapper[4816]: I0311 12:12:18.533500 4816 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5dff60f3-3acf-4dfb-9098-917736f61c0c-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:12:18 crc kubenswrapper[4816]: I0311 12:12:18.533570 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dtbn\" (UniqueName: \"kubernetes.io/projected/5dff60f3-3acf-4dfb-9098-917736f61c0c-kube-api-access-4dtbn\") on node \"crc\" DevicePath \"\"" Mar 11 12:12:18 crc kubenswrapper[4816]: I0311 12:12:18.594515 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dff60f3-3acf-4dfb-9098-917736f61c0c-util" (OuterVolumeSpecName: "util") pod "5dff60f3-3acf-4dfb-9098-917736f61c0c" (UID: "5dff60f3-3acf-4dfb-9098-917736f61c0c"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:12:18 crc kubenswrapper[4816]: I0311 12:12:18.634860 4816 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5dff60f3-3acf-4dfb-9098-917736f61c0c-util\") on node \"crc\" DevicePath \"\"" Mar 11 12:12:19 crc kubenswrapper[4816]: I0311 12:12:19.070618 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8" event={"ID":"5dff60f3-3acf-4dfb-9098-917736f61c0c","Type":"ContainerDied","Data":"f829741e1ad8b32117f692ad3039b9990bd12ea53fdc942a98b351dfc5dfabe3"} Mar 11 12:12:19 crc kubenswrapper[4816]: I0311 12:12:19.070676 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f829741e1ad8b32117f692ad3039b9990bd12ea53fdc942a98b351dfc5dfabe3" Mar 11 12:12:19 crc kubenswrapper[4816]: I0311 12:12:19.070738 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.585148 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-5679b59769-8stwg"] Mar 11 12:12:27 crc kubenswrapper[4816]: E0311 12:12:27.585939 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dff60f3-3acf-4dfb-9098-917736f61c0c" containerName="pull" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.585956 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dff60f3-3acf-4dfb-9098-917736f61c0c" containerName="pull" Mar 11 12:12:27 crc kubenswrapper[4816]: E0311 12:12:27.585972 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efc988f7-8a1a-4d22-b6bb-b2617c721017" containerName="console" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.585980 4816 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="efc988f7-8a1a-4d22-b6bb-b2617c721017" containerName="console" Mar 11 12:12:27 crc kubenswrapper[4816]: E0311 12:12:27.585994 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dff60f3-3acf-4dfb-9098-917736f61c0c" containerName="util" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.586001 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dff60f3-3acf-4dfb-9098-917736f61c0c" containerName="util" Mar 11 12:12:27 crc kubenswrapper[4816]: E0311 12:12:27.586023 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dff60f3-3acf-4dfb-9098-917736f61c0c" containerName="extract" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.586030 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dff60f3-3acf-4dfb-9098-917736f61c0c" containerName="extract" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.586133 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="efc988f7-8a1a-4d22-b6bb-b2617c721017" containerName="console" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.586150 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dff60f3-3acf-4dfb-9098-917736f61c0c" containerName="extract" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.586635 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5679b59769-8stwg" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.589096 4816 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-zs7v5" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.589238 4816 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.589499 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.589765 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.590053 4816 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.642630 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5679b59769-8stwg"] Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.652836 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7f7c9c4d-3a3f-4524-8964-8a99f24c2786-apiservice-cert\") pod \"metallb-operator-controller-manager-5679b59769-8stwg\" (UID: \"7f7c9c4d-3a3f-4524-8964-8a99f24c2786\") " pod="metallb-system/metallb-operator-controller-manager-5679b59769-8stwg" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.652930 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztzjg\" (UniqueName: \"kubernetes.io/projected/7f7c9c4d-3a3f-4524-8964-8a99f24c2786-kube-api-access-ztzjg\") pod 
\"metallb-operator-controller-manager-5679b59769-8stwg\" (UID: \"7f7c9c4d-3a3f-4524-8964-8a99f24c2786\") " pod="metallb-system/metallb-operator-controller-manager-5679b59769-8stwg" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.652975 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7f7c9c4d-3a3f-4524-8964-8a99f24c2786-webhook-cert\") pod \"metallb-operator-controller-manager-5679b59769-8stwg\" (UID: \"7f7c9c4d-3a3f-4524-8964-8a99f24c2786\") " pod="metallb-system/metallb-operator-controller-manager-5679b59769-8stwg" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.754621 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztzjg\" (UniqueName: \"kubernetes.io/projected/7f7c9c4d-3a3f-4524-8964-8a99f24c2786-kube-api-access-ztzjg\") pod \"metallb-operator-controller-manager-5679b59769-8stwg\" (UID: \"7f7c9c4d-3a3f-4524-8964-8a99f24c2786\") " pod="metallb-system/metallb-operator-controller-manager-5679b59769-8stwg" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.754990 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7f7c9c4d-3a3f-4524-8964-8a99f24c2786-webhook-cert\") pod \"metallb-operator-controller-manager-5679b59769-8stwg\" (UID: \"7f7c9c4d-3a3f-4524-8964-8a99f24c2786\") " pod="metallb-system/metallb-operator-controller-manager-5679b59769-8stwg" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.755172 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7f7c9c4d-3a3f-4524-8964-8a99f24c2786-apiservice-cert\") pod \"metallb-operator-controller-manager-5679b59769-8stwg\" (UID: \"7f7c9c4d-3a3f-4524-8964-8a99f24c2786\") " pod="metallb-system/metallb-operator-controller-manager-5679b59769-8stwg" Mar 11 12:12:27 crc 
kubenswrapper[4816]: I0311 12:12:27.761369 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7f7c9c4d-3a3f-4524-8964-8a99f24c2786-webhook-cert\") pod \"metallb-operator-controller-manager-5679b59769-8stwg\" (UID: \"7f7c9c4d-3a3f-4524-8964-8a99f24c2786\") " pod="metallb-system/metallb-operator-controller-manager-5679b59769-8stwg" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.761789 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7f7c9c4d-3a3f-4524-8964-8a99f24c2786-apiservice-cert\") pod \"metallb-operator-controller-manager-5679b59769-8stwg\" (UID: \"7f7c9c4d-3a3f-4524-8964-8a99f24c2786\") " pod="metallb-system/metallb-operator-controller-manager-5679b59769-8stwg" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.774574 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztzjg\" (UniqueName: \"kubernetes.io/projected/7f7c9c4d-3a3f-4524-8964-8a99f24c2786-kube-api-access-ztzjg\") pod \"metallb-operator-controller-manager-5679b59769-8stwg\" (UID: \"7f7c9c4d-3a3f-4524-8964-8a99f24c2786\") " pod="metallb-system/metallb-operator-controller-manager-5679b59769-8stwg" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.828775 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-96bb59846-7z5mz"] Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.829491 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-96bb59846-7z5mz" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.833349 4816 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-wwfq8" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.833545 4816 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.834094 4816 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.848966 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-96bb59846-7z5mz"] Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.856870 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/72342d10-d8c0-4f04-9554-e57c84d77653-webhook-cert\") pod \"metallb-operator-webhook-server-96bb59846-7z5mz\" (UID: \"72342d10-d8c0-4f04-9554-e57c84d77653\") " pod="metallb-system/metallb-operator-webhook-server-96bb59846-7z5mz" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.857227 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qfn8\" (UniqueName: \"kubernetes.io/projected/72342d10-d8c0-4f04-9554-e57c84d77653-kube-api-access-8qfn8\") pod \"metallb-operator-webhook-server-96bb59846-7z5mz\" (UID: \"72342d10-d8c0-4f04-9554-e57c84d77653\") " pod="metallb-system/metallb-operator-webhook-server-96bb59846-7z5mz" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.857372 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/72342d10-d8c0-4f04-9554-e57c84d77653-apiservice-cert\") pod \"metallb-operator-webhook-server-96bb59846-7z5mz\" (UID: \"72342d10-d8c0-4f04-9554-e57c84d77653\") " pod="metallb-system/metallb-operator-webhook-server-96bb59846-7z5mz" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.903121 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5679b59769-8stwg" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.959066 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/72342d10-d8c0-4f04-9554-e57c84d77653-webhook-cert\") pod \"metallb-operator-webhook-server-96bb59846-7z5mz\" (UID: \"72342d10-d8c0-4f04-9554-e57c84d77653\") " pod="metallb-system/metallb-operator-webhook-server-96bb59846-7z5mz" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.959125 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qfn8\" (UniqueName: \"kubernetes.io/projected/72342d10-d8c0-4f04-9554-e57c84d77653-kube-api-access-8qfn8\") pod \"metallb-operator-webhook-server-96bb59846-7z5mz\" (UID: \"72342d10-d8c0-4f04-9554-e57c84d77653\") " pod="metallb-system/metallb-operator-webhook-server-96bb59846-7z5mz" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.959211 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/72342d10-d8c0-4f04-9554-e57c84d77653-apiservice-cert\") pod \"metallb-operator-webhook-server-96bb59846-7z5mz\" (UID: \"72342d10-d8c0-4f04-9554-e57c84d77653\") " pod="metallb-system/metallb-operator-webhook-server-96bb59846-7z5mz" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.963701 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/72342d10-d8c0-4f04-9554-e57c84d77653-webhook-cert\") pod \"metallb-operator-webhook-server-96bb59846-7z5mz\" (UID: \"72342d10-d8c0-4f04-9554-e57c84d77653\") " pod="metallb-system/metallb-operator-webhook-server-96bb59846-7z5mz" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.963744 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/72342d10-d8c0-4f04-9554-e57c84d77653-apiservice-cert\") pod \"metallb-operator-webhook-server-96bb59846-7z5mz\" (UID: \"72342d10-d8c0-4f04-9554-e57c84d77653\") " pod="metallb-system/metallb-operator-webhook-server-96bb59846-7z5mz" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.985241 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qfn8\" (UniqueName: \"kubernetes.io/projected/72342d10-d8c0-4f04-9554-e57c84d77653-kube-api-access-8qfn8\") pod \"metallb-operator-webhook-server-96bb59846-7z5mz\" (UID: \"72342d10-d8c0-4f04-9554-e57c84d77653\") " pod="metallb-system/metallb-operator-webhook-server-96bb59846-7z5mz" Mar 11 12:12:28 crc kubenswrapper[4816]: I0311 12:12:28.147516 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-96bb59846-7z5mz" Mar 11 12:12:28 crc kubenswrapper[4816]: I0311 12:12:28.386731 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5679b59769-8stwg"] Mar 11 12:12:28 crc kubenswrapper[4816]: I0311 12:12:28.632976 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-96bb59846-7z5mz"] Mar 11 12:12:28 crc kubenswrapper[4816]: W0311 12:12:28.643696 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72342d10_d8c0_4f04_9554_e57c84d77653.slice/crio-08441a46d180046d808092e32b22e396e5fce174c3ba4335a9fa5019f5267b82 WatchSource:0}: Error finding container 08441a46d180046d808092e32b22e396e5fce174c3ba4335a9fa5019f5267b82: Status 404 returned error can't find the container with id 08441a46d180046d808092e32b22e396e5fce174c3ba4335a9fa5019f5267b82 Mar 11 12:12:29 crc kubenswrapper[4816]: I0311 12:12:29.136519 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-96bb59846-7z5mz" event={"ID":"72342d10-d8c0-4f04-9554-e57c84d77653","Type":"ContainerStarted","Data":"08441a46d180046d808092e32b22e396e5fce174c3ba4335a9fa5019f5267b82"} Mar 11 12:12:29 crc kubenswrapper[4816]: I0311 12:12:29.137633 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5679b59769-8stwg" event={"ID":"7f7c9c4d-3a3f-4524-8964-8a99f24c2786","Type":"ContainerStarted","Data":"a11cf9aaf1ecddcd77fb2049cfeb363445447d59990dab89091820bec520e515"} Mar 11 12:12:34 crc kubenswrapper[4816]: I0311 12:12:34.194896 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-96bb59846-7z5mz" 
event={"ID":"72342d10-d8c0-4f04-9554-e57c84d77653","Type":"ContainerStarted","Data":"89891833c721f256f8d2cd36bd0263ece32c7b4d3a6c7b3f104aae41659ecc08"} Mar 11 12:12:34 crc kubenswrapper[4816]: I0311 12:12:34.196426 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-96bb59846-7z5mz" Mar 11 12:12:34 crc kubenswrapper[4816]: I0311 12:12:34.199539 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5679b59769-8stwg" event={"ID":"7f7c9c4d-3a3f-4524-8964-8a99f24c2786","Type":"ContainerStarted","Data":"ae00879262f70582385d8c1328565352a267e8e8c14456ddb3b3a385b4befb00"} Mar 11 12:12:34 crc kubenswrapper[4816]: I0311 12:12:34.200294 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5679b59769-8stwg" Mar 11 12:12:34 crc kubenswrapper[4816]: I0311 12:12:34.226357 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-96bb59846-7z5mz" podStartSLOduration=2.203498951 podStartE2EDuration="7.226338289s" podCreationTimestamp="2026-03-11 12:12:27 +0000 UTC" firstStartedPulling="2026-03-11 12:12:28.645729607 +0000 UTC m=+835.236993564" lastFinishedPulling="2026-03-11 12:12:33.668568935 +0000 UTC m=+840.259832902" observedRunningTime="2026-03-11 12:12:34.223661012 +0000 UTC m=+840.814924979" watchObservedRunningTime="2026-03-11 12:12:34.226338289 +0000 UTC m=+840.817602266" Mar 11 12:12:34 crc kubenswrapper[4816]: I0311 12:12:34.261397 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-5679b59769-8stwg" podStartSLOduration=2.00094873 podStartE2EDuration="7.261365815s" podCreationTimestamp="2026-03-11 12:12:27 +0000 UTC" firstStartedPulling="2026-03-11 12:12:28.406585995 +0000 UTC m=+834.997849962" lastFinishedPulling="2026-03-11 
12:12:33.66700307 +0000 UTC m=+840.258267047" observedRunningTime="2026-03-11 12:12:34.260706986 +0000 UTC m=+840.851970953" watchObservedRunningTime="2026-03-11 12:12:34.261365815 +0000 UTC m=+840.852629782" Mar 11 12:12:34 crc kubenswrapper[4816]: I0311 12:12:34.583239 4816 scope.go:117] "RemoveContainer" containerID="587884e89c7feed672cb66139bc979bafcbc72560d3687797e97ca922f238ebb" Mar 11 12:12:39 crc kubenswrapper[4816]: I0311 12:12:39.515638 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:12:39 crc kubenswrapper[4816]: I0311 12:12:39.516627 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:12:39 crc kubenswrapper[4816]: I0311 12:12:39.516708 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:12:39 crc kubenswrapper[4816]: I0311 12:12:39.517602 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"45ccbed932001dc629a77de7e08e04a9cce25a78ac1e00aed407f7f4e1fa93a3"} pod="openshift-machine-config-operator/machine-config-daemon-b4v82" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 12:12:39 crc kubenswrapper[4816]: I0311 12:12:39.517666 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" 
podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" containerID="cri-o://45ccbed932001dc629a77de7e08e04a9cce25a78ac1e00aed407f7f4e1fa93a3" gracePeriod=600 Mar 11 12:12:40 crc kubenswrapper[4816]: I0311 12:12:40.255268 4816 generic.go:334] "Generic (PLEG): container finished" podID="7fdff21c-644f-4443-a268-f98c91ea120a" containerID="45ccbed932001dc629a77de7e08e04a9cce25a78ac1e00aed407f7f4e1fa93a3" exitCode=0 Mar 11 12:12:40 crc kubenswrapper[4816]: I0311 12:12:40.255357 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerDied","Data":"45ccbed932001dc629a77de7e08e04a9cce25a78ac1e00aed407f7f4e1fa93a3"} Mar 11 12:12:40 crc kubenswrapper[4816]: I0311 12:12:40.255588 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerStarted","Data":"13e7eed3f44dcb7bba59d21f6a1bb4bc9f4b869b7a25106a79ff8ceef1b9e507"} Mar 11 12:12:40 crc kubenswrapper[4816]: I0311 12:12:40.255620 4816 scope.go:117] "RemoveContainer" containerID="233fffa5de6ee1e762a8824b32dec71fe3b7403332cc2d914d3770d768c1fbca" Mar 11 12:12:48 crc kubenswrapper[4816]: I0311 12:12:48.160633 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-96bb59846-7z5mz" Mar 11 12:13:07 crc kubenswrapper[4816]: I0311 12:13:07.906981 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5679b59769-8stwg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.622625 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-h8scg"] Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.623438 4816 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-h8scg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.625259 4816 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.625617 4816 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-q7hhh" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.633551 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-bjfwg"] Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.636528 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-bjfwg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.637382 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-h8scg"] Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.639372 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.639638 4816 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.672201 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/00616041-f382-4b2a-a7ef-b75a14621ce1-metrics-certs\") pod \"frr-k8s-bjfwg\" (UID: \"00616041-f382-4b2a-a7ef-b75a14621ce1\") " pod="metallb-system/frr-k8s-bjfwg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.672625 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/00616041-f382-4b2a-a7ef-b75a14621ce1-reloader\") pod \"frr-k8s-bjfwg\" (UID: \"00616041-f382-4b2a-a7ef-b75a14621ce1\") 
" pod="metallb-system/frr-k8s-bjfwg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.672660 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6512814f-09cf-4b97-a1d6-ec99bcbf1525-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-h8scg\" (UID: \"6512814f-09cf-4b97-a1d6-ec99bcbf1525\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-h8scg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.672690 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/00616041-f382-4b2a-a7ef-b75a14621ce1-frr-startup\") pod \"frr-k8s-bjfwg\" (UID: \"00616041-f382-4b2a-a7ef-b75a14621ce1\") " pod="metallb-system/frr-k8s-bjfwg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.672719 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt4b7\" (UniqueName: \"kubernetes.io/projected/6512814f-09cf-4b97-a1d6-ec99bcbf1525-kube-api-access-vt4b7\") pod \"frr-k8s-webhook-server-bcc4b6f68-h8scg\" (UID: \"6512814f-09cf-4b97-a1d6-ec99bcbf1525\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-h8scg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.672752 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/00616041-f382-4b2a-a7ef-b75a14621ce1-frr-conf\") pod \"frr-k8s-bjfwg\" (UID: \"00616041-f382-4b2a-a7ef-b75a14621ce1\") " pod="metallb-system/frr-k8s-bjfwg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.672787 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djhgf\" (UniqueName: \"kubernetes.io/projected/00616041-f382-4b2a-a7ef-b75a14621ce1-kube-api-access-djhgf\") pod \"frr-k8s-bjfwg\" (UID: 
\"00616041-f382-4b2a-a7ef-b75a14621ce1\") " pod="metallb-system/frr-k8s-bjfwg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.672808 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/00616041-f382-4b2a-a7ef-b75a14621ce1-metrics\") pod \"frr-k8s-bjfwg\" (UID: \"00616041-f382-4b2a-a7ef-b75a14621ce1\") " pod="metallb-system/frr-k8s-bjfwg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.672826 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/00616041-f382-4b2a-a7ef-b75a14621ce1-frr-sockets\") pod \"frr-k8s-bjfwg\" (UID: \"00616041-f382-4b2a-a7ef-b75a14621ce1\") " pod="metallb-system/frr-k8s-bjfwg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.714783 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-wqwrt"] Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.716042 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-wqwrt" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.718044 4816 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-lzmwq" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.718279 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.718611 4816 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.719355 4816 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.751420 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-srnjf"] Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.752367 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-srnjf" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.755260 4816 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.765556 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-srnjf"] Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.773848 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/00616041-f382-4b2a-a7ef-b75a14621ce1-frr-conf\") pod \"frr-k8s-bjfwg\" (UID: \"00616041-f382-4b2a-a7ef-b75a14621ce1\") " pod="metallb-system/frr-k8s-bjfwg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.773903 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djhgf\" (UniqueName: \"kubernetes.io/projected/00616041-f382-4b2a-a7ef-b75a14621ce1-kube-api-access-djhgf\") pod \"frr-k8s-bjfwg\" (UID: \"00616041-f382-4b2a-a7ef-b75a14621ce1\") " pod="metallb-system/frr-k8s-bjfwg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.773929 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/43ec0f0d-8425-4dc4-9aa2-f1f85a26548c-metrics-certs\") pod \"speaker-wqwrt\" (UID: \"43ec0f0d-8425-4dc4-9aa2-f1f85a26548c\") " pod="metallb-system/speaker-wqwrt" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.773946 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2af0656a-169d-42fe-8efb-5258bc56af56-cert\") pod \"controller-7bb4cc7c98-srnjf\" (UID: \"2af0656a-169d-42fe-8efb-5258bc56af56\") " pod="metallb-system/controller-7bb4cc7c98-srnjf" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.773973 4816 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/00616041-f382-4b2a-a7ef-b75a14621ce1-metrics\") pod \"frr-k8s-bjfwg\" (UID: \"00616041-f382-4b2a-a7ef-b75a14621ce1\") " pod="metallb-system/frr-k8s-bjfwg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.773991 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/00616041-f382-4b2a-a7ef-b75a14621ce1-frr-sockets\") pod \"frr-k8s-bjfwg\" (UID: \"00616041-f382-4b2a-a7ef-b75a14621ce1\") " pod="metallb-system/frr-k8s-bjfwg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.774012 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llnj7\" (UniqueName: \"kubernetes.io/projected/2af0656a-169d-42fe-8efb-5258bc56af56-kube-api-access-llnj7\") pod \"controller-7bb4cc7c98-srnjf\" (UID: \"2af0656a-169d-42fe-8efb-5258bc56af56\") " pod="metallb-system/controller-7bb4cc7c98-srnjf" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.774045 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/00616041-f382-4b2a-a7ef-b75a14621ce1-metrics-certs\") pod \"frr-k8s-bjfwg\" (UID: \"00616041-f382-4b2a-a7ef-b75a14621ce1\") " pod="metallb-system/frr-k8s-bjfwg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.774064 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/00616041-f382-4b2a-a7ef-b75a14621ce1-reloader\") pod \"frr-k8s-bjfwg\" (UID: \"00616041-f382-4b2a-a7ef-b75a14621ce1\") " pod="metallb-system/frr-k8s-bjfwg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.774091 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: 
\"kubernetes.io/secret/43ec0f0d-8425-4dc4-9aa2-f1f85a26548c-memberlist\") pod \"speaker-wqwrt\" (UID: \"43ec0f0d-8425-4dc4-9aa2-f1f85a26548c\") " pod="metallb-system/speaker-wqwrt" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.774113 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6512814f-09cf-4b97-a1d6-ec99bcbf1525-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-h8scg\" (UID: \"6512814f-09cf-4b97-a1d6-ec99bcbf1525\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-h8scg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.774131 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/00616041-f382-4b2a-a7ef-b75a14621ce1-frr-startup\") pod \"frr-k8s-bjfwg\" (UID: \"00616041-f382-4b2a-a7ef-b75a14621ce1\") " pod="metallb-system/frr-k8s-bjfwg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.774152 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/43ec0f0d-8425-4dc4-9aa2-f1f85a26548c-metallb-excludel2\") pod \"speaker-wqwrt\" (UID: \"43ec0f0d-8425-4dc4-9aa2-f1f85a26548c\") " pod="metallb-system/speaker-wqwrt" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.774174 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vt4b7\" (UniqueName: \"kubernetes.io/projected/6512814f-09cf-4b97-a1d6-ec99bcbf1525-kube-api-access-vt4b7\") pod \"frr-k8s-webhook-server-bcc4b6f68-h8scg\" (UID: \"6512814f-09cf-4b97-a1d6-ec99bcbf1525\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-h8scg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.774195 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/2af0656a-169d-42fe-8efb-5258bc56af56-metrics-certs\") pod \"controller-7bb4cc7c98-srnjf\" (UID: \"2af0656a-169d-42fe-8efb-5258bc56af56\") " pod="metallb-system/controller-7bb4cc7c98-srnjf" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.774214 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwcv9\" (UniqueName: \"kubernetes.io/projected/43ec0f0d-8425-4dc4-9aa2-f1f85a26548c-kube-api-access-cwcv9\") pod \"speaker-wqwrt\" (UID: \"43ec0f0d-8425-4dc4-9aa2-f1f85a26548c\") " pod="metallb-system/speaker-wqwrt" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.774681 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/00616041-f382-4b2a-a7ef-b75a14621ce1-frr-conf\") pod \"frr-k8s-bjfwg\" (UID: \"00616041-f382-4b2a-a7ef-b75a14621ce1\") " pod="metallb-system/frr-k8s-bjfwg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.775129 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/00616041-f382-4b2a-a7ef-b75a14621ce1-metrics\") pod \"frr-k8s-bjfwg\" (UID: \"00616041-f382-4b2a-a7ef-b75a14621ce1\") " pod="metallb-system/frr-k8s-bjfwg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.775343 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/00616041-f382-4b2a-a7ef-b75a14621ce1-frr-sockets\") pod \"frr-k8s-bjfwg\" (UID: \"00616041-f382-4b2a-a7ef-b75a14621ce1\") " pod="metallb-system/frr-k8s-bjfwg" Mar 11 12:13:08 crc kubenswrapper[4816]: E0311 12:13:08.775426 4816 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Mar 11 12:13:08 crc kubenswrapper[4816]: E0311 12:13:08.775472 4816 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/00616041-f382-4b2a-a7ef-b75a14621ce1-metrics-certs podName:00616041-f382-4b2a-a7ef-b75a14621ce1 nodeName:}" failed. No retries permitted until 2026-03-11 12:13:09.275455058 +0000 UTC m=+875.866719025 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/00616041-f382-4b2a-a7ef-b75a14621ce1-metrics-certs") pod "frr-k8s-bjfwg" (UID: "00616041-f382-4b2a-a7ef-b75a14621ce1") : secret "frr-k8s-certs-secret" not found Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.775650 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/00616041-f382-4b2a-a7ef-b75a14621ce1-reloader\") pod \"frr-k8s-bjfwg\" (UID: \"00616041-f382-4b2a-a7ef-b75a14621ce1\") " pod="metallb-system/frr-k8s-bjfwg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.782499 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/00616041-f382-4b2a-a7ef-b75a14621ce1-frr-startup\") pod \"frr-k8s-bjfwg\" (UID: \"00616041-f382-4b2a-a7ef-b75a14621ce1\") " pod="metallb-system/frr-k8s-bjfwg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.797168 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6512814f-09cf-4b97-a1d6-ec99bcbf1525-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-h8scg\" (UID: \"6512814f-09cf-4b97-a1d6-ec99bcbf1525\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-h8scg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.800415 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt4b7\" (UniqueName: \"kubernetes.io/projected/6512814f-09cf-4b97-a1d6-ec99bcbf1525-kube-api-access-vt4b7\") pod \"frr-k8s-webhook-server-bcc4b6f68-h8scg\" (UID: \"6512814f-09cf-4b97-a1d6-ec99bcbf1525\") " 
pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-h8scg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.804008 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djhgf\" (UniqueName: \"kubernetes.io/projected/00616041-f382-4b2a-a7ef-b75a14621ce1-kube-api-access-djhgf\") pod \"frr-k8s-bjfwg\" (UID: \"00616041-f382-4b2a-a7ef-b75a14621ce1\") " pod="metallb-system/frr-k8s-bjfwg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.875547 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/43ec0f0d-8425-4dc4-9aa2-f1f85a26548c-metrics-certs\") pod \"speaker-wqwrt\" (UID: \"43ec0f0d-8425-4dc4-9aa2-f1f85a26548c\") " pod="metallb-system/speaker-wqwrt" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.875594 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2af0656a-169d-42fe-8efb-5258bc56af56-cert\") pod \"controller-7bb4cc7c98-srnjf\" (UID: \"2af0656a-169d-42fe-8efb-5258bc56af56\") " pod="metallb-system/controller-7bb4cc7c98-srnjf" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.875630 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llnj7\" (UniqueName: \"kubernetes.io/projected/2af0656a-169d-42fe-8efb-5258bc56af56-kube-api-access-llnj7\") pod \"controller-7bb4cc7c98-srnjf\" (UID: \"2af0656a-169d-42fe-8efb-5258bc56af56\") " pod="metallb-system/controller-7bb4cc7c98-srnjf" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.875719 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/43ec0f0d-8425-4dc4-9aa2-f1f85a26548c-memberlist\") pod \"speaker-wqwrt\" (UID: \"43ec0f0d-8425-4dc4-9aa2-f1f85a26548c\") " pod="metallb-system/speaker-wqwrt" Mar 11 12:13:08 crc kubenswrapper[4816]: E0311 12:13:08.875752 4816 secret.go:188] 
Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Mar 11 12:13:08 crc kubenswrapper[4816]: E0311 12:13:08.875840 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43ec0f0d-8425-4dc4-9aa2-f1f85a26548c-metrics-certs podName:43ec0f0d-8425-4dc4-9aa2-f1f85a26548c nodeName:}" failed. No retries permitted until 2026-03-11 12:13:09.375817551 +0000 UTC m=+875.967081518 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/43ec0f0d-8425-4dc4-9aa2-f1f85a26548c-metrics-certs") pod "speaker-wqwrt" (UID: "43ec0f0d-8425-4dc4-9aa2-f1f85a26548c") : secret "speaker-certs-secret" not found Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.875995 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/43ec0f0d-8425-4dc4-9aa2-f1f85a26548c-metallb-excludel2\") pod \"speaker-wqwrt\" (UID: \"43ec0f0d-8425-4dc4-9aa2-f1f85a26548c\") " pod="metallb-system/speaker-wqwrt" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.876029 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwcv9\" (UniqueName: \"kubernetes.io/projected/43ec0f0d-8425-4dc4-9aa2-f1f85a26548c-kube-api-access-cwcv9\") pod \"speaker-wqwrt\" (UID: \"43ec0f0d-8425-4dc4-9aa2-f1f85a26548c\") " pod="metallb-system/speaker-wqwrt" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.876046 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2af0656a-169d-42fe-8efb-5258bc56af56-metrics-certs\") pod \"controller-7bb4cc7c98-srnjf\" (UID: \"2af0656a-169d-42fe-8efb-5258bc56af56\") " pod="metallb-system/controller-7bb4cc7c98-srnjf" Mar 11 12:13:08 crc kubenswrapper[4816]: E0311 12:13:08.876060 4816 secret.go:188] Couldn't get secret 
metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 11 12:13:08 crc kubenswrapper[4816]: E0311 12:13:08.876182 4816 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Mar 11 12:13:08 crc kubenswrapper[4816]: E0311 12:13:08.876188 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43ec0f0d-8425-4dc4-9aa2-f1f85a26548c-memberlist podName:43ec0f0d-8425-4dc4-9aa2-f1f85a26548c nodeName:}" failed. No retries permitted until 2026-03-11 12:13:09.376154061 +0000 UTC m=+875.967418028 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/43ec0f0d-8425-4dc4-9aa2-f1f85a26548c-memberlist") pod "speaker-wqwrt" (UID: "43ec0f0d-8425-4dc4-9aa2-f1f85a26548c") : secret "metallb-memberlist" not found Mar 11 12:13:08 crc kubenswrapper[4816]: E0311 12:13:08.876292 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2af0656a-169d-42fe-8efb-5258bc56af56-metrics-certs podName:2af0656a-169d-42fe-8efb-5258bc56af56 nodeName:}" failed. No retries permitted until 2026-03-11 12:13:09.376276604 +0000 UTC m=+875.967540571 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2af0656a-169d-42fe-8efb-5258bc56af56-metrics-certs") pod "controller-7bb4cc7c98-srnjf" (UID: "2af0656a-169d-42fe-8efb-5258bc56af56") : secret "controller-certs-secret" not found Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.876895 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/43ec0f0d-8425-4dc4-9aa2-f1f85a26548c-metallb-excludel2\") pod \"speaker-wqwrt\" (UID: \"43ec0f0d-8425-4dc4-9aa2-f1f85a26548c\") " pod="metallb-system/speaker-wqwrt" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.878554 4816 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.890685 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2af0656a-169d-42fe-8efb-5258bc56af56-cert\") pod \"controller-7bb4cc7c98-srnjf\" (UID: \"2af0656a-169d-42fe-8efb-5258bc56af56\") " pod="metallb-system/controller-7bb4cc7c98-srnjf" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.896676 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwcv9\" (UniqueName: \"kubernetes.io/projected/43ec0f0d-8425-4dc4-9aa2-f1f85a26548c-kube-api-access-cwcv9\") pod \"speaker-wqwrt\" (UID: \"43ec0f0d-8425-4dc4-9aa2-f1f85a26548c\") " pod="metallb-system/speaker-wqwrt" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.908918 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llnj7\" (UniqueName: \"kubernetes.io/projected/2af0656a-169d-42fe-8efb-5258bc56af56-kube-api-access-llnj7\") pod \"controller-7bb4cc7c98-srnjf\" (UID: \"2af0656a-169d-42fe-8efb-5258bc56af56\") " pod="metallb-system/controller-7bb4cc7c98-srnjf" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.959384 
4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-h8scg" Mar 11 12:13:09 crc kubenswrapper[4816]: I0311 12:13:09.221654 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-h8scg"] Mar 11 12:13:09 crc kubenswrapper[4816]: I0311 12:13:09.283737 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/00616041-f382-4b2a-a7ef-b75a14621ce1-metrics-certs\") pod \"frr-k8s-bjfwg\" (UID: \"00616041-f382-4b2a-a7ef-b75a14621ce1\") " pod="metallb-system/frr-k8s-bjfwg" Mar 11 12:13:09 crc kubenswrapper[4816]: I0311 12:13:09.288631 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/00616041-f382-4b2a-a7ef-b75a14621ce1-metrics-certs\") pod \"frr-k8s-bjfwg\" (UID: \"00616041-f382-4b2a-a7ef-b75a14621ce1\") " pod="metallb-system/frr-k8s-bjfwg" Mar 11 12:13:09 crc kubenswrapper[4816]: I0311 12:13:09.384906 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/43ec0f0d-8425-4dc4-9aa2-f1f85a26548c-metrics-certs\") pod \"speaker-wqwrt\" (UID: \"43ec0f0d-8425-4dc4-9aa2-f1f85a26548c\") " pod="metallb-system/speaker-wqwrt" Mar 11 12:13:09 crc kubenswrapper[4816]: I0311 12:13:09.384983 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/43ec0f0d-8425-4dc4-9aa2-f1f85a26548c-memberlist\") pod \"speaker-wqwrt\" (UID: \"43ec0f0d-8425-4dc4-9aa2-f1f85a26548c\") " pod="metallb-system/speaker-wqwrt" Mar 11 12:13:09 crc kubenswrapper[4816]: I0311 12:13:09.385011 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2af0656a-169d-42fe-8efb-5258bc56af56-metrics-certs\") pod 
\"controller-7bb4cc7c98-srnjf\" (UID: \"2af0656a-169d-42fe-8efb-5258bc56af56\") " pod="metallb-system/controller-7bb4cc7c98-srnjf" Mar 11 12:13:09 crc kubenswrapper[4816]: E0311 12:13:09.385580 4816 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 11 12:13:09 crc kubenswrapper[4816]: E0311 12:13:09.385694 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43ec0f0d-8425-4dc4-9aa2-f1f85a26548c-memberlist podName:43ec0f0d-8425-4dc4-9aa2-f1f85a26548c nodeName:}" failed. No retries permitted until 2026-03-11 12:13:10.385671141 +0000 UTC m=+876.976935118 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/43ec0f0d-8425-4dc4-9aa2-f1f85a26548c-memberlist") pod "speaker-wqwrt" (UID: "43ec0f0d-8425-4dc4-9aa2-f1f85a26548c") : secret "metallb-memberlist" not found Mar 11 12:13:09 crc kubenswrapper[4816]: I0311 12:13:09.388808 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2af0656a-169d-42fe-8efb-5258bc56af56-metrics-certs\") pod \"controller-7bb4cc7c98-srnjf\" (UID: \"2af0656a-169d-42fe-8efb-5258bc56af56\") " pod="metallb-system/controller-7bb4cc7c98-srnjf" Mar 11 12:13:09 crc kubenswrapper[4816]: I0311 12:13:09.389145 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/43ec0f0d-8425-4dc4-9aa2-f1f85a26548c-metrics-certs\") pod \"speaker-wqwrt\" (UID: \"43ec0f0d-8425-4dc4-9aa2-f1f85a26548c\") " pod="metallb-system/speaker-wqwrt" Mar 11 12:13:09 crc kubenswrapper[4816]: I0311 12:13:09.462746 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-h8scg" event={"ID":"6512814f-09cf-4b97-a1d6-ec99bcbf1525","Type":"ContainerStarted","Data":"17f73d8496f7bee3e2dcc587cb93e188b62354803564e1f3f60b89434f9e3441"} Mar 11 
12:13:09 crc kubenswrapper[4816]: I0311 12:13:09.568944 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-bjfwg" Mar 11 12:13:09 crc kubenswrapper[4816]: I0311 12:13:09.667400 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-srnjf" Mar 11 12:13:09 crc kubenswrapper[4816]: I0311 12:13:09.937743 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-srnjf"] Mar 11 12:13:09 crc kubenswrapper[4816]: W0311 12:13:09.950630 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2af0656a_169d_42fe_8efb_5258bc56af56.slice/crio-0afe1870f3777aa43016754bf2643fbfc2fb505de918741892f71c54c899edaf WatchSource:0}: Error finding container 0afe1870f3777aa43016754bf2643fbfc2fb505de918741892f71c54c899edaf: Status 404 returned error can't find the container with id 0afe1870f3777aa43016754bf2643fbfc2fb505de918741892f71c54c899edaf Mar 11 12:13:10 crc kubenswrapper[4816]: I0311 12:13:10.405757 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/43ec0f0d-8425-4dc4-9aa2-f1f85a26548c-memberlist\") pod \"speaker-wqwrt\" (UID: \"43ec0f0d-8425-4dc4-9aa2-f1f85a26548c\") " pod="metallb-system/speaker-wqwrt" Mar 11 12:13:10 crc kubenswrapper[4816]: I0311 12:13:10.425675 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/43ec0f0d-8425-4dc4-9aa2-f1f85a26548c-memberlist\") pod \"speaker-wqwrt\" (UID: \"43ec0f0d-8425-4dc4-9aa2-f1f85a26548c\") " pod="metallb-system/speaker-wqwrt" Mar 11 12:13:10 crc kubenswrapper[4816]: I0311 12:13:10.472336 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-srnjf" 
event={"ID":"2af0656a-169d-42fe-8efb-5258bc56af56","Type":"ContainerStarted","Data":"26f61a4f99d2a72d14730c0d17560bfffb50a7eea0e31c5b4dea48188c53e6fb"} Mar 11 12:13:10 crc kubenswrapper[4816]: I0311 12:13:10.472382 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-srnjf" event={"ID":"2af0656a-169d-42fe-8efb-5258bc56af56","Type":"ContainerStarted","Data":"656ef519ad1181d162b9fe8ea90de48e7d48355099403e9d619b445009dd139c"} Mar 11 12:13:10 crc kubenswrapper[4816]: I0311 12:13:10.472393 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-srnjf" event={"ID":"2af0656a-169d-42fe-8efb-5258bc56af56","Type":"ContainerStarted","Data":"0afe1870f3777aa43016754bf2643fbfc2fb505de918741892f71c54c899edaf"} Mar 11 12:13:10 crc kubenswrapper[4816]: I0311 12:13:10.473129 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-srnjf" Mar 11 12:13:10 crc kubenswrapper[4816]: I0311 12:13:10.474260 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bjfwg" event={"ID":"00616041-f382-4b2a-a7ef-b75a14621ce1","Type":"ContainerStarted","Data":"5a9dba4926deba2e8b891f2db18e7ce4c68880e9a96831827ae9c38bbe5f2c82"} Mar 11 12:13:10 crc kubenswrapper[4816]: I0311 12:13:10.490237 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-srnjf" podStartSLOduration=2.490210548 podStartE2EDuration="2.490210548s" podCreationTimestamp="2026-03-11 12:13:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:13:10.486280293 +0000 UTC m=+877.077544270" watchObservedRunningTime="2026-03-11 12:13:10.490210548 +0000 UTC m=+877.081474525" Mar 11 12:13:10 crc kubenswrapper[4816]: I0311 12:13:10.530238 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-wqwrt" Mar 11 12:13:10 crc kubenswrapper[4816]: W0311 12:13:10.550384 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43ec0f0d_8425_4dc4_9aa2_f1f85a26548c.slice/crio-33eb6899821d126ac4e3419b0387e50f85c3ebaca59cfbc9c9f2233bc4cc33f1 WatchSource:0}: Error finding container 33eb6899821d126ac4e3419b0387e50f85c3ebaca59cfbc9c9f2233bc4cc33f1: Status 404 returned error can't find the container with id 33eb6899821d126ac4e3419b0387e50f85c3ebaca59cfbc9c9f2233bc4cc33f1 Mar 11 12:13:11 crc kubenswrapper[4816]: I0311 12:13:11.484997 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-wqwrt" event={"ID":"43ec0f0d-8425-4dc4-9aa2-f1f85a26548c","Type":"ContainerStarted","Data":"56aeba0abf7b5e7e1d66f34e132c4143f9384145ef2cab676af43e201f2dc56d"} Mar 11 12:13:11 crc kubenswrapper[4816]: I0311 12:13:11.485282 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-wqwrt" event={"ID":"43ec0f0d-8425-4dc4-9aa2-f1f85a26548c","Type":"ContainerStarted","Data":"f0c9fce879c4599b760a54e8ac88a1933ff3d5cfce1c9f9e04addf1fbefaa8b9"} Mar 11 12:13:11 crc kubenswrapper[4816]: I0311 12:13:11.485292 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-wqwrt" event={"ID":"43ec0f0d-8425-4dc4-9aa2-f1f85a26548c","Type":"ContainerStarted","Data":"33eb6899821d126ac4e3419b0387e50f85c3ebaca59cfbc9c9f2233bc4cc33f1"} Mar 11 12:13:11 crc kubenswrapper[4816]: I0311 12:13:11.485376 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-wqwrt" Mar 11 12:13:14 crc kubenswrapper[4816]: I0311 12:13:14.154648 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-wqwrt" podStartSLOduration=6.154629544 podStartE2EDuration="6.154629544s" podCreationTimestamp="2026-03-11 12:13:08 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:13:11.507910939 +0000 UTC m=+878.099174906" watchObservedRunningTime="2026-03-11 12:13:14.154629544 +0000 UTC m=+880.745893531" Mar 11 12:13:18 crc kubenswrapper[4816]: I0311 12:13:18.568467 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-h8scg" event={"ID":"6512814f-09cf-4b97-a1d6-ec99bcbf1525","Type":"ContainerStarted","Data":"99c19e17cdcff645fffab67a9ef4bb0de3f6f0cf791dc49b158ed9a3fdf7dd6f"} Mar 11 12:13:18 crc kubenswrapper[4816]: I0311 12:13:18.569220 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-h8scg" Mar 11 12:13:18 crc kubenswrapper[4816]: I0311 12:13:18.570728 4816 generic.go:334] "Generic (PLEG): container finished" podID="00616041-f382-4b2a-a7ef-b75a14621ce1" containerID="80d63f9330ab5cde4f6b1a4cc8204f25a2e19e4772b7b59ea3c468ba8092dd4c" exitCode=0 Mar 11 12:13:18 crc kubenswrapper[4816]: I0311 12:13:18.570785 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bjfwg" event={"ID":"00616041-f382-4b2a-a7ef-b75a14621ce1","Type":"ContainerDied","Data":"80d63f9330ab5cde4f6b1a4cc8204f25a2e19e4772b7b59ea3c468ba8092dd4c"} Mar 11 12:13:18 crc kubenswrapper[4816]: I0311 12:13:18.616701 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-h8scg" podStartSLOduration=2.213666838 podStartE2EDuration="10.616683049s" podCreationTimestamp="2026-03-11 12:13:08 +0000 UTC" firstStartedPulling="2026-03-11 12:13:09.230551228 +0000 UTC m=+875.821815195" lastFinishedPulling="2026-03-11 12:13:17.633567419 +0000 UTC m=+884.224831406" observedRunningTime="2026-03-11 12:13:18.591237695 +0000 UTC m=+885.182501662" watchObservedRunningTime="2026-03-11 12:13:18.616683049 +0000 UTC m=+885.207947016" Mar 11 
12:13:19 crc kubenswrapper[4816]: I0311 12:13:19.580336 4816 generic.go:334] "Generic (PLEG): container finished" podID="00616041-f382-4b2a-a7ef-b75a14621ce1" containerID="af43bf7bbfdb9660cc95118ff47395fdf22dc547009038f832f159d9c187fa86" exitCode=0 Mar 11 12:13:19 crc kubenswrapper[4816]: I0311 12:13:19.580436 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bjfwg" event={"ID":"00616041-f382-4b2a-a7ef-b75a14621ce1","Type":"ContainerDied","Data":"af43bf7bbfdb9660cc95118ff47395fdf22dc547009038f832f159d9c187fa86"} Mar 11 12:13:20 crc kubenswrapper[4816]: I0311 12:13:20.534986 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-wqwrt" Mar 11 12:13:20 crc kubenswrapper[4816]: I0311 12:13:20.589854 4816 generic.go:334] "Generic (PLEG): container finished" podID="00616041-f382-4b2a-a7ef-b75a14621ce1" containerID="3f07e740a2d2dea998fafa7f845ed22af997ceb1958d238a27da269158256704" exitCode=0 Mar 11 12:13:20 crc kubenswrapper[4816]: I0311 12:13:20.589911 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bjfwg" event={"ID":"00616041-f382-4b2a-a7ef-b75a14621ce1","Type":"ContainerDied","Data":"3f07e740a2d2dea998fafa7f845ed22af997ceb1958d238a27da269158256704"} Mar 11 12:13:21 crc kubenswrapper[4816]: I0311 12:13:21.600766 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bjfwg" event={"ID":"00616041-f382-4b2a-a7ef-b75a14621ce1","Type":"ContainerStarted","Data":"fd538c5d6fee4cdac18f05446f745df698fa9cfbd7711c66de0900f134715ab0"} Mar 11 12:13:21 crc kubenswrapper[4816]: I0311 12:13:21.601099 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-bjfwg" Mar 11 12:13:21 crc kubenswrapper[4816]: I0311 12:13:21.601112 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bjfwg" 
event={"ID":"00616041-f382-4b2a-a7ef-b75a14621ce1","Type":"ContainerStarted","Data":"b31aaec34f323c4075e4c70efe09df3a55e9f1a929b02ae47f26f89ad0ae1288"} Mar 11 12:13:21 crc kubenswrapper[4816]: I0311 12:13:21.601150 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bjfwg" event={"ID":"00616041-f382-4b2a-a7ef-b75a14621ce1","Type":"ContainerStarted","Data":"6e3f03dce05c9643d39eaa480e95f0e4f42021822a1ae2d8ba30928026a1c9f2"} Mar 11 12:13:21 crc kubenswrapper[4816]: I0311 12:13:21.601160 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bjfwg" event={"ID":"00616041-f382-4b2a-a7ef-b75a14621ce1","Type":"ContainerStarted","Data":"3966ade3f260067a8fd8341fb5bc7c77d0da7251038b72a02ba38dd87423d2f7"} Mar 11 12:13:21 crc kubenswrapper[4816]: I0311 12:13:21.601171 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bjfwg" event={"ID":"00616041-f382-4b2a-a7ef-b75a14621ce1","Type":"ContainerStarted","Data":"ce18729fc1f9a1876ee7d2356e9474d2b58fa559af4903db1d556fd615d2c70c"} Mar 11 12:13:21 crc kubenswrapper[4816]: I0311 12:13:21.601179 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bjfwg" event={"ID":"00616041-f382-4b2a-a7ef-b75a14621ce1","Type":"ContainerStarted","Data":"f849a45ec28d27f080b810fcc5a49d7f4ccfb3711e1dca572cf9b3908c3ad3e7"} Mar 11 12:13:21 crc kubenswrapper[4816]: I0311 12:13:21.620702 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-bjfwg" podStartSLOduration=5.667356175 podStartE2EDuration="13.620682355s" podCreationTimestamp="2026-03-11 12:13:08 +0000 UTC" firstStartedPulling="2026-03-11 12:13:09.697950237 +0000 UTC m=+876.289214234" lastFinishedPulling="2026-03-11 12:13:17.651276447 +0000 UTC m=+884.242540414" observedRunningTime="2026-03-11 12:13:21.618748039 +0000 UTC m=+888.210012006" watchObservedRunningTime="2026-03-11 12:13:21.620682355 +0000 UTC m=+888.211946322" Mar 11 
12:13:21 crc kubenswrapper[4816]: I0311 12:13:21.969422 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2"] Mar 11 12:13:21 crc kubenswrapper[4816]: I0311 12:13:21.970837 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2" Mar 11 12:13:21 crc kubenswrapper[4816]: I0311 12:13:21.975548 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 11 12:13:21 crc kubenswrapper[4816]: I0311 12:13:21.989413 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2"] Mar 11 12:13:22 crc kubenswrapper[4816]: I0311 12:13:22.109475 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56ksr\" (UniqueName: \"kubernetes.io/projected/f9c6bbc7-62af-4c3a-ac05-1897b9f00080-kube-api-access-56ksr\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2\" (UID: \"f9c6bbc7-62af-4c3a-ac05-1897b9f00080\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2" Mar 11 12:13:22 crc kubenswrapper[4816]: I0311 12:13:22.109604 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f9c6bbc7-62af-4c3a-ac05-1897b9f00080-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2\" (UID: \"f9c6bbc7-62af-4c3a-ac05-1897b9f00080\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2" Mar 11 12:13:22 crc kubenswrapper[4816]: I0311 12:13:22.109718 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/f9c6bbc7-62af-4c3a-ac05-1897b9f00080-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2\" (UID: \"f9c6bbc7-62af-4c3a-ac05-1897b9f00080\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2" Mar 11 12:13:22 crc kubenswrapper[4816]: I0311 12:13:22.211117 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f9c6bbc7-62af-4c3a-ac05-1897b9f00080-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2\" (UID: \"f9c6bbc7-62af-4c3a-ac05-1897b9f00080\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2" Mar 11 12:13:22 crc kubenswrapper[4816]: I0311 12:13:22.211561 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f9c6bbc7-62af-4c3a-ac05-1897b9f00080-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2\" (UID: \"f9c6bbc7-62af-4c3a-ac05-1897b9f00080\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2" Mar 11 12:13:22 crc kubenswrapper[4816]: I0311 12:13:22.211683 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f9c6bbc7-62af-4c3a-ac05-1897b9f00080-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2\" (UID: \"f9c6bbc7-62af-4c3a-ac05-1897b9f00080\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2" Mar 11 12:13:22 crc kubenswrapper[4816]: I0311 12:13:22.211937 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f9c6bbc7-62af-4c3a-ac05-1897b9f00080-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2\" (UID: \"f9c6bbc7-62af-4c3a-ac05-1897b9f00080\") " 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2" Mar 11 12:13:22 crc kubenswrapper[4816]: I0311 12:13:22.211989 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56ksr\" (UniqueName: \"kubernetes.io/projected/f9c6bbc7-62af-4c3a-ac05-1897b9f00080-kube-api-access-56ksr\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2\" (UID: \"f9c6bbc7-62af-4c3a-ac05-1897b9f00080\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2" Mar 11 12:13:22 crc kubenswrapper[4816]: I0311 12:13:22.231287 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56ksr\" (UniqueName: \"kubernetes.io/projected/f9c6bbc7-62af-4c3a-ac05-1897b9f00080-kube-api-access-56ksr\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2\" (UID: \"f9c6bbc7-62af-4c3a-ac05-1897b9f00080\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2" Mar 11 12:13:22 crc kubenswrapper[4816]: I0311 12:13:22.293831 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2" Mar 11 12:13:22 crc kubenswrapper[4816]: I0311 12:13:22.774549 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2"] Mar 11 12:13:22 crc kubenswrapper[4816]: W0311 12:13:22.784382 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9c6bbc7_62af_4c3a_ac05_1897b9f00080.slice/crio-5a50909b238048d7291670196a1a9dcbfff784f61d6b5145b42b735b88bf62cd WatchSource:0}: Error finding container 5a50909b238048d7291670196a1a9dcbfff784f61d6b5145b42b735b88bf62cd: Status 404 returned error can't find the container with id 5a50909b238048d7291670196a1a9dcbfff784f61d6b5145b42b735b88bf62cd Mar 11 12:13:23 crc kubenswrapper[4816]: I0311 12:13:23.616869 4816 generic.go:334] "Generic (PLEG): container finished" podID="f9c6bbc7-62af-4c3a-ac05-1897b9f00080" containerID="5700c7771a4c284d0e3dc33cb363063ce713c4c0827d2527d184d380ea4d0beb" exitCode=0 Mar 11 12:13:23 crc kubenswrapper[4816]: I0311 12:13:23.617019 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2" event={"ID":"f9c6bbc7-62af-4c3a-ac05-1897b9f00080","Type":"ContainerDied","Data":"5700c7771a4c284d0e3dc33cb363063ce713c4c0827d2527d184d380ea4d0beb"} Mar 11 12:13:23 crc kubenswrapper[4816]: I0311 12:13:23.617297 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2" event={"ID":"f9c6bbc7-62af-4c3a-ac05-1897b9f00080","Type":"ContainerStarted","Data":"5a50909b238048d7291670196a1a9dcbfff784f61d6b5145b42b735b88bf62cd"} Mar 11 12:13:24 crc kubenswrapper[4816]: I0311 12:13:24.569190 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="metallb-system/frr-k8s-bjfwg" Mar 11 12:13:24 crc kubenswrapper[4816]: I0311 12:13:24.639824 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-bjfwg" Mar 11 12:13:27 crc kubenswrapper[4816]: I0311 12:13:27.672716 4816 generic.go:334] "Generic (PLEG): container finished" podID="f9c6bbc7-62af-4c3a-ac05-1897b9f00080" containerID="c309b14826419376450e34ef7e71c711ef13f1aa8e7a532324a580d7febe6e32" exitCode=0 Mar 11 12:13:27 crc kubenswrapper[4816]: I0311 12:13:27.672851 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2" event={"ID":"f9c6bbc7-62af-4c3a-ac05-1897b9f00080","Type":"ContainerDied","Data":"c309b14826419376450e34ef7e71c711ef13f1aa8e7a532324a580d7febe6e32"} Mar 11 12:13:28 crc kubenswrapper[4816]: I0311 12:13:28.684951 4816 generic.go:334] "Generic (PLEG): container finished" podID="f9c6bbc7-62af-4c3a-ac05-1897b9f00080" containerID="5f6bb57f799d21785887f4df7ec76a35a9787bef2a1f8eece1bb78c306d6fc80" exitCode=0 Mar 11 12:13:28 crc kubenswrapper[4816]: I0311 12:13:28.685083 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2" event={"ID":"f9c6bbc7-62af-4c3a-ac05-1897b9f00080","Type":"ContainerDied","Data":"5f6bb57f799d21785887f4df7ec76a35a9787bef2a1f8eece1bb78c306d6fc80"} Mar 11 12:13:28 crc kubenswrapper[4816]: I0311 12:13:28.966352 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-h8scg" Mar 11 12:13:29 crc kubenswrapper[4816]: I0311 12:13:29.673170 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-srnjf" Mar 11 12:13:30 crc kubenswrapper[4816]: I0311 12:13:30.038268 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2" Mar 11 12:13:30 crc kubenswrapper[4816]: I0311 12:13:30.137877 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f9c6bbc7-62af-4c3a-ac05-1897b9f00080-util\") pod \"f9c6bbc7-62af-4c3a-ac05-1897b9f00080\" (UID: \"f9c6bbc7-62af-4c3a-ac05-1897b9f00080\") " Mar 11 12:13:30 crc kubenswrapper[4816]: I0311 12:13:30.138002 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f9c6bbc7-62af-4c3a-ac05-1897b9f00080-bundle\") pod \"f9c6bbc7-62af-4c3a-ac05-1897b9f00080\" (UID: \"f9c6bbc7-62af-4c3a-ac05-1897b9f00080\") " Mar 11 12:13:30 crc kubenswrapper[4816]: I0311 12:13:30.138091 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56ksr\" (UniqueName: \"kubernetes.io/projected/f9c6bbc7-62af-4c3a-ac05-1897b9f00080-kube-api-access-56ksr\") pod \"f9c6bbc7-62af-4c3a-ac05-1897b9f00080\" (UID: \"f9c6bbc7-62af-4c3a-ac05-1897b9f00080\") " Mar 11 12:13:30 crc kubenswrapper[4816]: I0311 12:13:30.138769 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9c6bbc7-62af-4c3a-ac05-1897b9f00080-bundle" (OuterVolumeSpecName: "bundle") pod "f9c6bbc7-62af-4c3a-ac05-1897b9f00080" (UID: "f9c6bbc7-62af-4c3a-ac05-1897b9f00080"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:13:30 crc kubenswrapper[4816]: I0311 12:13:30.147548 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9c6bbc7-62af-4c3a-ac05-1897b9f00080-util" (OuterVolumeSpecName: "util") pod "f9c6bbc7-62af-4c3a-ac05-1897b9f00080" (UID: "f9c6bbc7-62af-4c3a-ac05-1897b9f00080"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:13:30 crc kubenswrapper[4816]: I0311 12:13:30.153864 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9c6bbc7-62af-4c3a-ac05-1897b9f00080-kube-api-access-56ksr" (OuterVolumeSpecName: "kube-api-access-56ksr") pod "f9c6bbc7-62af-4c3a-ac05-1897b9f00080" (UID: "f9c6bbc7-62af-4c3a-ac05-1897b9f00080"). InnerVolumeSpecName "kube-api-access-56ksr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:13:30 crc kubenswrapper[4816]: I0311 12:13:30.240620 4816 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f9c6bbc7-62af-4c3a-ac05-1897b9f00080-util\") on node \"crc\" DevicePath \"\"" Mar 11 12:13:30 crc kubenswrapper[4816]: I0311 12:13:30.240834 4816 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f9c6bbc7-62af-4c3a-ac05-1897b9f00080-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:13:30 crc kubenswrapper[4816]: I0311 12:13:30.240897 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56ksr\" (UniqueName: \"kubernetes.io/projected/f9c6bbc7-62af-4c3a-ac05-1897b9f00080-kube-api-access-56ksr\") on node \"crc\" DevicePath \"\"" Mar 11 12:13:30 crc kubenswrapper[4816]: I0311 12:13:30.705765 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2" event={"ID":"f9c6bbc7-62af-4c3a-ac05-1897b9f00080","Type":"ContainerDied","Data":"5a50909b238048d7291670196a1a9dcbfff784f61d6b5145b42b735b88bf62cd"} Mar 11 12:13:30 crc kubenswrapper[4816]: I0311 12:13:30.705824 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a50909b238048d7291670196a1a9dcbfff784f61d6b5145b42b735b88bf62cd" Mar 11 12:13:30 crc kubenswrapper[4816]: I0311 12:13:30.705875 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2" Mar 11 12:13:35 crc kubenswrapper[4816]: I0311 12:13:35.101573 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-bpwb5"] Mar 11 12:13:35 crc kubenswrapper[4816]: E0311 12:13:35.102720 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9c6bbc7-62af-4c3a-ac05-1897b9f00080" containerName="util" Mar 11 12:13:35 crc kubenswrapper[4816]: I0311 12:13:35.102755 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9c6bbc7-62af-4c3a-ac05-1897b9f00080" containerName="util" Mar 11 12:13:35 crc kubenswrapper[4816]: E0311 12:13:35.102799 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9c6bbc7-62af-4c3a-ac05-1897b9f00080" containerName="extract" Mar 11 12:13:35 crc kubenswrapper[4816]: I0311 12:13:35.102818 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9c6bbc7-62af-4c3a-ac05-1897b9f00080" containerName="extract" Mar 11 12:13:35 crc kubenswrapper[4816]: E0311 12:13:35.102841 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9c6bbc7-62af-4c3a-ac05-1897b9f00080" containerName="pull" Mar 11 12:13:35 crc kubenswrapper[4816]: I0311 12:13:35.102855 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9c6bbc7-62af-4c3a-ac05-1897b9f00080" containerName="pull" Mar 11 12:13:35 crc kubenswrapper[4816]: I0311 12:13:35.103071 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9c6bbc7-62af-4c3a-ac05-1897b9f00080" containerName="extract" Mar 11 12:13:35 crc kubenswrapper[4816]: I0311 12:13:35.104018 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-bpwb5" Mar 11 12:13:35 crc kubenswrapper[4816]: I0311 12:13:35.105870 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Mar 11 12:13:35 crc kubenswrapper[4816]: I0311 12:13:35.106059 4816 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-8czc4" Mar 11 12:13:35 crc kubenswrapper[4816]: I0311 12:13:35.108037 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Mar 11 12:13:35 crc kubenswrapper[4816]: I0311 12:13:35.139847 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-bpwb5"] Mar 11 12:13:35 crc kubenswrapper[4816]: I0311 12:13:35.217302 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/950f7daa-b6eb-488f-877d-774c73576ed0-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-bpwb5\" (UID: \"950f7daa-b6eb-488f-877d-774c73576ed0\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-bpwb5" Mar 11 12:13:35 crc kubenswrapper[4816]: I0311 12:13:35.217513 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xvq2\" (UniqueName: \"kubernetes.io/projected/950f7daa-b6eb-488f-877d-774c73576ed0-kube-api-access-2xvq2\") pod \"cert-manager-operator-controller-manager-66c8bdd694-bpwb5\" (UID: \"950f7daa-b6eb-488f-877d-774c73576ed0\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-bpwb5" Mar 11 12:13:35 crc kubenswrapper[4816]: I0311 12:13:35.319120 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" 
(UniqueName: \"kubernetes.io/empty-dir/950f7daa-b6eb-488f-877d-774c73576ed0-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-bpwb5\" (UID: \"950f7daa-b6eb-488f-877d-774c73576ed0\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-bpwb5" Mar 11 12:13:35 crc kubenswrapper[4816]: I0311 12:13:35.319221 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xvq2\" (UniqueName: \"kubernetes.io/projected/950f7daa-b6eb-488f-877d-774c73576ed0-kube-api-access-2xvq2\") pod \"cert-manager-operator-controller-manager-66c8bdd694-bpwb5\" (UID: \"950f7daa-b6eb-488f-877d-774c73576ed0\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-bpwb5" Mar 11 12:13:35 crc kubenswrapper[4816]: I0311 12:13:35.320156 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/950f7daa-b6eb-488f-877d-774c73576ed0-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-bpwb5\" (UID: \"950f7daa-b6eb-488f-877d-774c73576ed0\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-bpwb5" Mar 11 12:13:35 crc kubenswrapper[4816]: I0311 12:13:35.344733 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xvq2\" (UniqueName: \"kubernetes.io/projected/950f7daa-b6eb-488f-877d-774c73576ed0-kube-api-access-2xvq2\") pod \"cert-manager-operator-controller-manager-66c8bdd694-bpwb5\" (UID: \"950f7daa-b6eb-488f-877d-774c73576ed0\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-bpwb5" Mar 11 12:13:35 crc kubenswrapper[4816]: I0311 12:13:35.449606 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-bpwb5" Mar 11 12:13:35 crc kubenswrapper[4816]: I0311 12:13:35.769951 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-bpwb5"] Mar 11 12:13:36 crc kubenswrapper[4816]: I0311 12:13:36.743653 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-bpwb5" event={"ID":"950f7daa-b6eb-488f-877d-774c73576ed0","Type":"ContainerStarted","Data":"c21eec66beaf6443bc73563e51c1f973bf98c031492c7dbd502a2d0bcca29bcf"} Mar 11 12:13:39 crc kubenswrapper[4816]: I0311 12:13:39.574718 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-bjfwg" Mar 11 12:13:39 crc kubenswrapper[4816]: I0311 12:13:39.769894 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-bpwb5" event={"ID":"950f7daa-b6eb-488f-877d-774c73576ed0","Type":"ContainerStarted","Data":"896fe8d71362e8ef9f1d3c398422eedc31ba0fe47a90af6c276e204e8ec911fb"} Mar 11 12:13:44 crc kubenswrapper[4816]: I0311 12:13:44.148751 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-bpwb5" podStartSLOduration=5.484942669 podStartE2EDuration="9.148728416s" podCreationTimestamp="2026-03-11 12:13:35 +0000 UTC" firstStartedPulling="2026-03-11 12:13:35.783388053 +0000 UTC m=+902.374652030" lastFinishedPulling="2026-03-11 12:13:39.44717381 +0000 UTC m=+906.038437777" observedRunningTime="2026-03-11 12:13:39.827887876 +0000 UTC m=+906.419151843" watchObservedRunningTime="2026-03-11 12:13:44.148728416 +0000 UTC m=+910.739992383" Mar 11 12:13:44 crc kubenswrapper[4816]: I0311 12:13:44.151587 4816 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["cert-manager/cert-manager-webhook-6888856db4-2jk7k"] Mar 11 12:13:44 crc kubenswrapper[4816]: I0311 12:13:44.153193 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-2jk7k" Mar 11 12:13:44 crc kubenswrapper[4816]: I0311 12:13:44.157152 4816 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-n6ssm" Mar 11 12:13:44 crc kubenswrapper[4816]: I0311 12:13:44.157161 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 11 12:13:44 crc kubenswrapper[4816]: I0311 12:13:44.157622 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 11 12:13:44 crc kubenswrapper[4816]: I0311 12:13:44.166997 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-2jk7k"] Mar 11 12:13:44 crc kubenswrapper[4816]: I0311 12:13:44.260402 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n7vv\" (UniqueName: \"kubernetes.io/projected/0a41e6b9-3b80-4eed-a8db-65aa010f449d-kube-api-access-7n7vv\") pod \"cert-manager-webhook-6888856db4-2jk7k\" (UID: \"0a41e6b9-3b80-4eed-a8db-65aa010f449d\") " pod="cert-manager/cert-manager-webhook-6888856db4-2jk7k" Mar 11 12:13:44 crc kubenswrapper[4816]: I0311 12:13:44.260487 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a41e6b9-3b80-4eed-a8db-65aa010f449d-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-2jk7k\" (UID: \"0a41e6b9-3b80-4eed-a8db-65aa010f449d\") " pod="cert-manager/cert-manager-webhook-6888856db4-2jk7k" Mar 11 12:13:44 crc kubenswrapper[4816]: I0311 12:13:44.362338 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/0a41e6b9-3b80-4eed-a8db-65aa010f449d-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-2jk7k\" (UID: \"0a41e6b9-3b80-4eed-a8db-65aa010f449d\") " pod="cert-manager/cert-manager-webhook-6888856db4-2jk7k" Mar 11 12:13:44 crc kubenswrapper[4816]: I0311 12:13:44.362454 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7n7vv\" (UniqueName: \"kubernetes.io/projected/0a41e6b9-3b80-4eed-a8db-65aa010f449d-kube-api-access-7n7vv\") pod \"cert-manager-webhook-6888856db4-2jk7k\" (UID: \"0a41e6b9-3b80-4eed-a8db-65aa010f449d\") " pod="cert-manager/cert-manager-webhook-6888856db4-2jk7k" Mar 11 12:13:44 crc kubenswrapper[4816]: I0311 12:13:44.381392 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7n7vv\" (UniqueName: \"kubernetes.io/projected/0a41e6b9-3b80-4eed-a8db-65aa010f449d-kube-api-access-7n7vv\") pod \"cert-manager-webhook-6888856db4-2jk7k\" (UID: \"0a41e6b9-3b80-4eed-a8db-65aa010f449d\") " pod="cert-manager/cert-manager-webhook-6888856db4-2jk7k" Mar 11 12:13:44 crc kubenswrapper[4816]: I0311 12:13:44.384364 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a41e6b9-3b80-4eed-a8db-65aa010f449d-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-2jk7k\" (UID: \"0a41e6b9-3b80-4eed-a8db-65aa010f449d\") " pod="cert-manager/cert-manager-webhook-6888856db4-2jk7k" Mar 11 12:13:44 crc kubenswrapper[4816]: I0311 12:13:44.474569 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-2jk7k" Mar 11 12:13:44 crc kubenswrapper[4816]: I0311 12:13:44.929641 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-2jk7k"] Mar 11 12:13:45 crc kubenswrapper[4816]: I0311 12:13:45.814005 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-2jk7k" event={"ID":"0a41e6b9-3b80-4eed-a8db-65aa010f449d","Type":"ContainerStarted","Data":"76f7aff87bcdf58aca3226d7bd33c4c416fb8440f06999d8f9216cbb0e9fe1b5"} Mar 11 12:13:48 crc kubenswrapper[4816]: I0311 12:13:48.387802 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-z6224"] Mar 11 12:13:48 crc kubenswrapper[4816]: I0311 12:13:48.389346 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-z6224" Mar 11 12:13:48 crc kubenswrapper[4816]: I0311 12:13:48.408390 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-z6224"] Mar 11 12:13:48 crc kubenswrapper[4816]: I0311 12:13:48.524263 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb816e33-9a16-4326-8058-c328fabcab45-catalog-content\") pod \"community-operators-z6224\" (UID: \"bb816e33-9a16-4326-8058-c328fabcab45\") " pod="openshift-marketplace/community-operators-z6224" Mar 11 12:13:48 crc kubenswrapper[4816]: I0311 12:13:48.524298 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwbh4\" (UniqueName: \"kubernetes.io/projected/bb816e33-9a16-4326-8058-c328fabcab45-kube-api-access-fwbh4\") pod \"community-operators-z6224\" (UID: \"bb816e33-9a16-4326-8058-c328fabcab45\") " pod="openshift-marketplace/community-operators-z6224" Mar 11 12:13:48 crc 
kubenswrapper[4816]: I0311 12:13:48.524323 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb816e33-9a16-4326-8058-c328fabcab45-utilities\") pod \"community-operators-z6224\" (UID: \"bb816e33-9a16-4326-8058-c328fabcab45\") " pod="openshift-marketplace/community-operators-z6224" Mar 11 12:13:48 crc kubenswrapper[4816]: I0311 12:13:48.625699 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwbh4\" (UniqueName: \"kubernetes.io/projected/bb816e33-9a16-4326-8058-c328fabcab45-kube-api-access-fwbh4\") pod \"community-operators-z6224\" (UID: \"bb816e33-9a16-4326-8058-c328fabcab45\") " pod="openshift-marketplace/community-operators-z6224" Mar 11 12:13:48 crc kubenswrapper[4816]: I0311 12:13:48.625744 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb816e33-9a16-4326-8058-c328fabcab45-catalog-content\") pod \"community-operators-z6224\" (UID: \"bb816e33-9a16-4326-8058-c328fabcab45\") " pod="openshift-marketplace/community-operators-z6224" Mar 11 12:13:48 crc kubenswrapper[4816]: I0311 12:13:48.625774 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb816e33-9a16-4326-8058-c328fabcab45-utilities\") pod \"community-operators-z6224\" (UID: \"bb816e33-9a16-4326-8058-c328fabcab45\") " pod="openshift-marketplace/community-operators-z6224" Mar 11 12:13:48 crc kubenswrapper[4816]: I0311 12:13:48.626391 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb816e33-9a16-4326-8058-c328fabcab45-utilities\") pod \"community-operators-z6224\" (UID: \"bb816e33-9a16-4326-8058-c328fabcab45\") " pod="openshift-marketplace/community-operators-z6224" Mar 11 12:13:48 crc kubenswrapper[4816]: I0311 
12:13:48.626565 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb816e33-9a16-4326-8058-c328fabcab45-catalog-content\") pod \"community-operators-z6224\" (UID: \"bb816e33-9a16-4326-8058-c328fabcab45\") " pod="openshift-marketplace/community-operators-z6224" Mar 11 12:13:48 crc kubenswrapper[4816]: I0311 12:13:48.649945 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwbh4\" (UniqueName: \"kubernetes.io/projected/bb816e33-9a16-4326-8058-c328fabcab45-kube-api-access-fwbh4\") pod \"community-operators-z6224\" (UID: \"bb816e33-9a16-4326-8058-c328fabcab45\") " pod="openshift-marketplace/community-operators-z6224" Mar 11 12:13:48 crc kubenswrapper[4816]: I0311 12:13:48.710862 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-z6224" Mar 11 12:13:49 crc kubenswrapper[4816]: I0311 12:13:49.256913 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-fgzw7"] Mar 11 12:13:49 crc kubenswrapper[4816]: I0311 12:13:49.257830 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-fgzw7" Mar 11 12:13:49 crc kubenswrapper[4816]: I0311 12:13:49.261575 4816 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-z54x9" Mar 11 12:13:49 crc kubenswrapper[4816]: I0311 12:13:49.268169 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-fgzw7"] Mar 11 12:13:49 crc kubenswrapper[4816]: I0311 12:13:49.353143 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f7a5fee8-e8c0-47ae-b730-cf5c1d7133c8-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-fgzw7\" (UID: \"f7a5fee8-e8c0-47ae-b730-cf5c1d7133c8\") " pod="cert-manager/cert-manager-cainjector-5545bd876-fgzw7" Mar 11 12:13:49 crc kubenswrapper[4816]: I0311 12:13:49.353200 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szzdz\" (UniqueName: \"kubernetes.io/projected/f7a5fee8-e8c0-47ae-b730-cf5c1d7133c8-kube-api-access-szzdz\") pod \"cert-manager-cainjector-5545bd876-fgzw7\" (UID: \"f7a5fee8-e8c0-47ae-b730-cf5c1d7133c8\") " pod="cert-manager/cert-manager-cainjector-5545bd876-fgzw7" Mar 11 12:13:49 crc kubenswrapper[4816]: I0311 12:13:49.454110 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f7a5fee8-e8c0-47ae-b730-cf5c1d7133c8-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-fgzw7\" (UID: \"f7a5fee8-e8c0-47ae-b730-cf5c1d7133c8\") " pod="cert-manager/cert-manager-cainjector-5545bd876-fgzw7" Mar 11 12:13:49 crc kubenswrapper[4816]: I0311 12:13:49.454170 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szzdz\" (UniqueName: 
\"kubernetes.io/projected/f7a5fee8-e8c0-47ae-b730-cf5c1d7133c8-kube-api-access-szzdz\") pod \"cert-manager-cainjector-5545bd876-fgzw7\" (UID: \"f7a5fee8-e8c0-47ae-b730-cf5c1d7133c8\") " pod="cert-manager/cert-manager-cainjector-5545bd876-fgzw7" Mar 11 12:13:49 crc kubenswrapper[4816]: I0311 12:13:49.472954 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f7a5fee8-e8c0-47ae-b730-cf5c1d7133c8-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-fgzw7\" (UID: \"f7a5fee8-e8c0-47ae-b730-cf5c1d7133c8\") " pod="cert-manager/cert-manager-cainjector-5545bd876-fgzw7" Mar 11 12:13:49 crc kubenswrapper[4816]: I0311 12:13:49.485439 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szzdz\" (UniqueName: \"kubernetes.io/projected/f7a5fee8-e8c0-47ae-b730-cf5c1d7133c8-kube-api-access-szzdz\") pod \"cert-manager-cainjector-5545bd876-fgzw7\" (UID: \"f7a5fee8-e8c0-47ae-b730-cf5c1d7133c8\") " pod="cert-manager/cert-manager-cainjector-5545bd876-fgzw7" Mar 11 12:13:49 crc kubenswrapper[4816]: I0311 12:13:49.578595 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-fgzw7" Mar 11 12:13:50 crc kubenswrapper[4816]: I0311 12:13:50.795404 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sn8n6"] Mar 11 12:13:50 crc kubenswrapper[4816]: I0311 12:13:50.797722 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sn8n6" Mar 11 12:13:50 crc kubenswrapper[4816]: I0311 12:13:50.815977 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sn8n6"] Mar 11 12:13:50 crc kubenswrapper[4816]: I0311 12:13:50.885187 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff0a6696-3d0e-4173-9c1c-260bfdd757fb-catalog-content\") pod \"certified-operators-sn8n6\" (UID: \"ff0a6696-3d0e-4173-9c1c-260bfdd757fb\") " pod="openshift-marketplace/certified-operators-sn8n6" Mar 11 12:13:50 crc kubenswrapper[4816]: I0311 12:13:50.885336 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff0a6696-3d0e-4173-9c1c-260bfdd757fb-utilities\") pod \"certified-operators-sn8n6\" (UID: \"ff0a6696-3d0e-4173-9c1c-260bfdd757fb\") " pod="openshift-marketplace/certified-operators-sn8n6" Mar 11 12:13:50 crc kubenswrapper[4816]: I0311 12:13:50.885400 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdjct\" (UniqueName: \"kubernetes.io/projected/ff0a6696-3d0e-4173-9c1c-260bfdd757fb-kube-api-access-zdjct\") pod \"certified-operators-sn8n6\" (UID: \"ff0a6696-3d0e-4173-9c1c-260bfdd757fb\") " pod="openshift-marketplace/certified-operators-sn8n6" Mar 11 12:13:50 crc kubenswrapper[4816]: I0311 12:13:50.987504 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff0a6696-3d0e-4173-9c1c-260bfdd757fb-catalog-content\") pod \"certified-operators-sn8n6\" (UID: \"ff0a6696-3d0e-4173-9c1c-260bfdd757fb\") " pod="openshift-marketplace/certified-operators-sn8n6" Mar 11 12:13:50 crc kubenswrapper[4816]: I0311 12:13:50.987613 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff0a6696-3d0e-4173-9c1c-260bfdd757fb-utilities\") pod \"certified-operators-sn8n6\" (UID: \"ff0a6696-3d0e-4173-9c1c-260bfdd757fb\") " pod="openshift-marketplace/certified-operators-sn8n6" Mar 11 12:13:50 crc kubenswrapper[4816]: I0311 12:13:50.987651 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdjct\" (UniqueName: \"kubernetes.io/projected/ff0a6696-3d0e-4173-9c1c-260bfdd757fb-kube-api-access-zdjct\") pod \"certified-operators-sn8n6\" (UID: \"ff0a6696-3d0e-4173-9c1c-260bfdd757fb\") " pod="openshift-marketplace/certified-operators-sn8n6" Mar 11 12:13:50 crc kubenswrapper[4816]: I0311 12:13:50.989649 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff0a6696-3d0e-4173-9c1c-260bfdd757fb-catalog-content\") pod \"certified-operators-sn8n6\" (UID: \"ff0a6696-3d0e-4173-9c1c-260bfdd757fb\") " pod="openshift-marketplace/certified-operators-sn8n6" Mar 11 12:13:50 crc kubenswrapper[4816]: I0311 12:13:50.990193 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff0a6696-3d0e-4173-9c1c-260bfdd757fb-utilities\") pod \"certified-operators-sn8n6\" (UID: \"ff0a6696-3d0e-4173-9c1c-260bfdd757fb\") " pod="openshift-marketplace/certified-operators-sn8n6" Mar 11 12:13:50 crc kubenswrapper[4816]: I0311 12:13:50.991614 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-z6224"] Mar 11 12:13:51 crc kubenswrapper[4816]: W0311 12:13:51.003449 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb816e33_9a16_4326_8058_c328fabcab45.slice/crio-7dbded14f0f4fb04bf34588fd808017260ed068974b08e77201c2fe4778827be WatchSource:0}: Error finding container 
7dbded14f0f4fb04bf34588fd808017260ed068974b08e77201c2fe4778827be: Status 404 returned error can't find the container with id 7dbded14f0f4fb04bf34588fd808017260ed068974b08e77201c2fe4778827be Mar 11 12:13:51 crc kubenswrapper[4816]: I0311 12:13:51.044872 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdjct\" (UniqueName: \"kubernetes.io/projected/ff0a6696-3d0e-4173-9c1c-260bfdd757fb-kube-api-access-zdjct\") pod \"certified-operators-sn8n6\" (UID: \"ff0a6696-3d0e-4173-9c1c-260bfdd757fb\") " pod="openshift-marketplace/certified-operators-sn8n6" Mar 11 12:13:51 crc kubenswrapper[4816]: I0311 12:13:51.124831 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-fgzw7"] Mar 11 12:13:51 crc kubenswrapper[4816]: I0311 12:13:51.170290 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sn8n6" Mar 11 12:13:51 crc kubenswrapper[4816]: I0311 12:13:51.431864 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sn8n6"] Mar 11 12:13:51 crc kubenswrapper[4816]: W0311 12:13:51.437182 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff0a6696_3d0e_4173_9c1c_260bfdd757fb.slice/crio-1126d3ebcef013ee6f9236b25810177dab2d52145cc78104f25351c76ce07c7c WatchSource:0}: Error finding container 1126d3ebcef013ee6f9236b25810177dab2d52145cc78104f25351c76ce07c7c: Status 404 returned error can't find the container with id 1126d3ebcef013ee6f9236b25810177dab2d52145cc78104f25351c76ce07c7c Mar 11 12:13:51 crc kubenswrapper[4816]: I0311 12:13:51.890306 4816 generic.go:334] "Generic (PLEG): container finished" podID="ff0a6696-3d0e-4173-9c1c-260bfdd757fb" containerID="39208e7933d3e745276296861bf7019ae82dfe513403abeb752230d2b7d5f71e" exitCode=0 Mar 11 12:13:51 crc kubenswrapper[4816]: I0311 12:13:51.890399 4816 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sn8n6" event={"ID":"ff0a6696-3d0e-4173-9c1c-260bfdd757fb","Type":"ContainerDied","Data":"39208e7933d3e745276296861bf7019ae82dfe513403abeb752230d2b7d5f71e"} Mar 11 12:13:51 crc kubenswrapper[4816]: I0311 12:13:51.890449 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sn8n6" event={"ID":"ff0a6696-3d0e-4173-9c1c-260bfdd757fb","Type":"ContainerStarted","Data":"1126d3ebcef013ee6f9236b25810177dab2d52145cc78104f25351c76ce07c7c"} Mar 11 12:13:51 crc kubenswrapper[4816]: I0311 12:13:51.893697 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-2jk7k" event={"ID":"0a41e6b9-3b80-4eed-a8db-65aa010f449d","Type":"ContainerStarted","Data":"977d30cdf14676945adae36064bcac3a2355a3f11d66a1ec67fe0d29af62ae53"} Mar 11 12:13:51 crc kubenswrapper[4816]: I0311 12:13:51.894097 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-2jk7k" Mar 11 12:13:51 crc kubenswrapper[4816]: I0311 12:13:51.896672 4816 generic.go:334] "Generic (PLEG): container finished" podID="bb816e33-9a16-4326-8058-c328fabcab45" containerID="b22b4212fc4320b38df10fecd8bd4695175cbd5d32439913ffcd7f6485b6d8c2" exitCode=0 Mar 11 12:13:51 crc kubenswrapper[4816]: I0311 12:13:51.896725 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z6224" event={"ID":"bb816e33-9a16-4326-8058-c328fabcab45","Type":"ContainerDied","Data":"b22b4212fc4320b38df10fecd8bd4695175cbd5d32439913ffcd7f6485b6d8c2"} Mar 11 12:13:51 crc kubenswrapper[4816]: I0311 12:13:51.896750 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z6224" event={"ID":"bb816e33-9a16-4326-8058-c328fabcab45","Type":"ContainerStarted","Data":"7dbded14f0f4fb04bf34588fd808017260ed068974b08e77201c2fe4778827be"} Mar 
11 12:13:51 crc kubenswrapper[4816]: I0311 12:13:51.898400 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-fgzw7" event={"ID":"f7a5fee8-e8c0-47ae-b730-cf5c1d7133c8","Type":"ContainerStarted","Data":"1d11c941cff6d506aff5ffb7382bcdc6782f901e85211c15cefc00dcdfda4296"} Mar 11 12:13:51 crc kubenswrapper[4816]: I0311 12:13:51.898431 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-fgzw7" event={"ID":"f7a5fee8-e8c0-47ae-b730-cf5c1d7133c8","Type":"ContainerStarted","Data":"1b46c040faba5a5471994d87bab1a61c54b7186223efa24040c1e0f27e8be2b0"} Mar 11 12:13:51 crc kubenswrapper[4816]: I0311 12:13:51.950671 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-2jk7k" podStartSLOduration=2.080280661 podStartE2EDuration="7.950650471s" podCreationTimestamp="2026-03-11 12:13:44 +0000 UTC" firstStartedPulling="2026-03-11 12:13:44.937522773 +0000 UTC m=+911.528786730" lastFinishedPulling="2026-03-11 12:13:50.807892583 +0000 UTC m=+917.399156540" observedRunningTime="2026-03-11 12:13:51.94553552 +0000 UTC m=+918.536799487" watchObservedRunningTime="2026-03-11 12:13:51.950650471 +0000 UTC m=+918.541914428" Mar 11 12:13:52 crc kubenswrapper[4816]: I0311 12:13:52.007297 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-fgzw7" podStartSLOduration=3.007276316 podStartE2EDuration="3.007276316s" podCreationTimestamp="2026-03-11 12:13:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:13:51.999005953 +0000 UTC m=+918.590269920" watchObservedRunningTime="2026-03-11 12:13:52.007276316 +0000 UTC m=+918.598540283" Mar 11 12:13:52 crc kubenswrapper[4816]: I0311 12:13:52.905028 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-sn8n6" event={"ID":"ff0a6696-3d0e-4173-9c1c-260bfdd757fb","Type":"ContainerStarted","Data":"60d6fa46e4d2deba8d1be2490841b0784c3be0ee06e9d85e187cb81fe38f5f2c"} Mar 11 12:13:52 crc kubenswrapper[4816]: I0311 12:13:52.907272 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z6224" event={"ID":"bb816e33-9a16-4326-8058-c328fabcab45","Type":"ContainerStarted","Data":"50dd4694b1375b720ecd5bd8dd4bc0d5000424b020c48dc7352e366a56193c89"} Mar 11 12:13:53 crc kubenswrapper[4816]: I0311 12:13:53.914830 4816 generic.go:334] "Generic (PLEG): container finished" podID="bb816e33-9a16-4326-8058-c328fabcab45" containerID="50dd4694b1375b720ecd5bd8dd4bc0d5000424b020c48dc7352e366a56193c89" exitCode=0 Mar 11 12:13:53 crc kubenswrapper[4816]: I0311 12:13:53.914910 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z6224" event={"ID":"bb816e33-9a16-4326-8058-c328fabcab45","Type":"ContainerDied","Data":"50dd4694b1375b720ecd5bd8dd4bc0d5000424b020c48dc7352e366a56193c89"} Mar 11 12:13:53 crc kubenswrapper[4816]: I0311 12:13:53.916938 4816 generic.go:334] "Generic (PLEG): container finished" podID="ff0a6696-3d0e-4173-9c1c-260bfdd757fb" containerID="60d6fa46e4d2deba8d1be2490841b0784c3be0ee06e9d85e187cb81fe38f5f2c" exitCode=0 Mar 11 12:13:53 crc kubenswrapper[4816]: I0311 12:13:53.916986 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sn8n6" event={"ID":"ff0a6696-3d0e-4173-9c1c-260bfdd757fb","Type":"ContainerDied","Data":"60d6fa46e4d2deba8d1be2490841b0784c3be0ee06e9d85e187cb81fe38f5f2c"} Mar 11 12:13:54 crc kubenswrapper[4816]: I0311 12:13:54.927229 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z6224" 
event={"ID":"bb816e33-9a16-4326-8058-c328fabcab45","Type":"ContainerStarted","Data":"d48b543c419d15ade4d9d40ea1598ce35d14a31ad917766b4e5d4d50431d25bf"} Mar 11 12:13:54 crc kubenswrapper[4816]: I0311 12:13:54.932000 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sn8n6" event={"ID":"ff0a6696-3d0e-4173-9c1c-260bfdd757fb","Type":"ContainerStarted","Data":"81cdb893a840fa2be39a4c7d019e4bfb168456690721e9768fa1711d4887d3df"} Mar 11 12:13:54 crc kubenswrapper[4816]: I0311 12:13:54.951999 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-z6224" podStartSLOduration=4.370797718 podStartE2EDuration="6.951982556s" podCreationTimestamp="2026-03-11 12:13:48 +0000 UTC" firstStartedPulling="2026-03-11 12:13:51.898342673 +0000 UTC m=+918.489606640" lastFinishedPulling="2026-03-11 12:13:54.479527511 +0000 UTC m=+921.070791478" observedRunningTime="2026-03-11 12:13:54.949939256 +0000 UTC m=+921.541203223" watchObservedRunningTime="2026-03-11 12:13:54.951982556 +0000 UTC m=+921.543246523" Mar 11 12:13:54 crc kubenswrapper[4816]: I0311 12:13:54.969805 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sn8n6" podStartSLOduration=2.408874757 podStartE2EDuration="4.969782869s" podCreationTimestamp="2026-03-11 12:13:50 +0000 UTC" firstStartedPulling="2026-03-11 12:13:51.89247009 +0000 UTC m=+918.483734067" lastFinishedPulling="2026-03-11 12:13:54.453378212 +0000 UTC m=+921.044642179" observedRunningTime="2026-03-11 12:13:54.966652227 +0000 UTC m=+921.557916194" watchObservedRunningTime="2026-03-11 12:13:54.969782869 +0000 UTC m=+921.561046836" Mar 11 12:13:58 crc kubenswrapper[4816]: I0311 12:13:58.711975 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-z6224" Mar 11 12:13:58 crc kubenswrapper[4816]: I0311 12:13:58.712856 4816 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-z6224" Mar 11 12:13:58 crc kubenswrapper[4816]: I0311 12:13:58.762538 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-z6224" Mar 11 12:13:59 crc kubenswrapper[4816]: I0311 12:13:59.477349 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-2jk7k" Mar 11 12:14:00 crc kubenswrapper[4816]: I0311 12:14:00.143155 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553854-hbf96"] Mar 11 12:14:00 crc kubenswrapper[4816]: I0311 12:14:00.146682 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553854-hbf96"] Mar 11 12:14:00 crc kubenswrapper[4816]: I0311 12:14:00.146897 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553854-hbf96" Mar 11 12:14:00 crc kubenswrapper[4816]: I0311 12:14:00.155642 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 12:14:00 crc kubenswrapper[4816]: I0311 12:14:00.155882 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 12:14:00 crc kubenswrapper[4816]: I0311 12:14:00.156080 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 12:14:00 crc kubenswrapper[4816]: I0311 12:14:00.233130 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5b7s4\" (UniqueName: \"kubernetes.io/projected/af8a107b-6295-42d4-b64b-7841171f67f3-kube-api-access-5b7s4\") pod \"auto-csr-approver-29553854-hbf96\" (UID: \"af8a107b-6295-42d4-b64b-7841171f67f3\") " 
pod="openshift-infra/auto-csr-approver-29553854-hbf96" Mar 11 12:14:00 crc kubenswrapper[4816]: I0311 12:14:00.335548 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5b7s4\" (UniqueName: \"kubernetes.io/projected/af8a107b-6295-42d4-b64b-7841171f67f3-kube-api-access-5b7s4\") pod \"auto-csr-approver-29553854-hbf96\" (UID: \"af8a107b-6295-42d4-b64b-7841171f67f3\") " pod="openshift-infra/auto-csr-approver-29553854-hbf96" Mar 11 12:14:00 crc kubenswrapper[4816]: I0311 12:14:00.354965 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5b7s4\" (UniqueName: \"kubernetes.io/projected/af8a107b-6295-42d4-b64b-7841171f67f3-kube-api-access-5b7s4\") pod \"auto-csr-approver-29553854-hbf96\" (UID: \"af8a107b-6295-42d4-b64b-7841171f67f3\") " pod="openshift-infra/auto-csr-approver-29553854-hbf96" Mar 11 12:14:00 crc kubenswrapper[4816]: I0311 12:14:00.483487 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553854-hbf96" Mar 11 12:14:00 crc kubenswrapper[4816]: I0311 12:14:00.758919 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553854-hbf96"] Mar 11 12:14:00 crc kubenswrapper[4816]: I0311 12:14:00.981060 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553854-hbf96" event={"ID":"af8a107b-6295-42d4-b64b-7841171f67f3","Type":"ContainerStarted","Data":"dfea93cb7aee576a6b4033dd21e2e2174a0e4a253adbe8914d90cdec35819e27"} Mar 11 12:14:01 crc kubenswrapper[4816]: I0311 12:14:01.171415 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sn8n6" Mar 11 12:14:01 crc kubenswrapper[4816]: I0311 12:14:01.171471 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sn8n6" Mar 11 12:14:01 crc kubenswrapper[4816]: I0311 
12:14:01.222658 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sn8n6" Mar 11 12:14:02 crc kubenswrapper[4816]: I0311 12:14:02.040802 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sn8n6" Mar 11 12:14:02 crc kubenswrapper[4816]: I0311 12:14:02.089462 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sn8n6"] Mar 11 12:14:02 crc kubenswrapper[4816]: I0311 12:14:02.567789 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-62cp5"] Mar 11 12:14:02 crc kubenswrapper[4816]: I0311 12:14:02.571062 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-62cp5" Mar 11 12:14:02 crc kubenswrapper[4816]: I0311 12:14:02.577580 4816 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-vh6t5" Mar 11 12:14:02 crc kubenswrapper[4816]: I0311 12:14:02.594388 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-62cp5"] Mar 11 12:14:02 crc kubenswrapper[4816]: I0311 12:14:02.668716 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e50b3f6b-4679-4337-a9cf-478aa2fb5800-bound-sa-token\") pod \"cert-manager-545d4d4674-62cp5\" (UID: \"e50b3f6b-4679-4337-a9cf-478aa2fb5800\") " pod="cert-manager/cert-manager-545d4d4674-62cp5" Mar 11 12:14:02 crc kubenswrapper[4816]: I0311 12:14:02.669447 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szk84\" (UniqueName: \"kubernetes.io/projected/e50b3f6b-4679-4337-a9cf-478aa2fb5800-kube-api-access-szk84\") pod \"cert-manager-545d4d4674-62cp5\" (UID: \"e50b3f6b-4679-4337-a9cf-478aa2fb5800\") " 
pod="cert-manager/cert-manager-545d4d4674-62cp5" Mar 11 12:14:02 crc kubenswrapper[4816]: I0311 12:14:02.771541 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e50b3f6b-4679-4337-a9cf-478aa2fb5800-bound-sa-token\") pod \"cert-manager-545d4d4674-62cp5\" (UID: \"e50b3f6b-4679-4337-a9cf-478aa2fb5800\") " pod="cert-manager/cert-manager-545d4d4674-62cp5" Mar 11 12:14:02 crc kubenswrapper[4816]: I0311 12:14:02.771675 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szk84\" (UniqueName: \"kubernetes.io/projected/e50b3f6b-4679-4337-a9cf-478aa2fb5800-kube-api-access-szk84\") pod \"cert-manager-545d4d4674-62cp5\" (UID: \"e50b3f6b-4679-4337-a9cf-478aa2fb5800\") " pod="cert-manager/cert-manager-545d4d4674-62cp5" Mar 11 12:14:02 crc kubenswrapper[4816]: I0311 12:14:02.801634 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szk84\" (UniqueName: \"kubernetes.io/projected/e50b3f6b-4679-4337-a9cf-478aa2fb5800-kube-api-access-szk84\") pod \"cert-manager-545d4d4674-62cp5\" (UID: \"e50b3f6b-4679-4337-a9cf-478aa2fb5800\") " pod="cert-manager/cert-manager-545d4d4674-62cp5" Mar 11 12:14:02 crc kubenswrapper[4816]: I0311 12:14:02.803871 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e50b3f6b-4679-4337-a9cf-478aa2fb5800-bound-sa-token\") pod \"cert-manager-545d4d4674-62cp5\" (UID: \"e50b3f6b-4679-4337-a9cf-478aa2fb5800\") " pod="cert-manager/cert-manager-545d4d4674-62cp5" Mar 11 12:14:02 crc kubenswrapper[4816]: I0311 12:14:02.895693 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-62cp5" Mar 11 12:14:03 crc kubenswrapper[4816]: I0311 12:14:03.210311 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-62cp5"] Mar 11 12:14:03 crc kubenswrapper[4816]: W0311 12:14:03.213630 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode50b3f6b_4679_4337_a9cf_478aa2fb5800.slice/crio-8f6b1cee2306078cb8905b6b956094561ff8d4a532879cffb9364772e92333cd WatchSource:0}: Error finding container 8f6b1cee2306078cb8905b6b956094561ff8d4a532879cffb9364772e92333cd: Status 404 returned error can't find the container with id 8f6b1cee2306078cb8905b6b956094561ff8d4a532879cffb9364772e92333cd Mar 11 12:14:03 crc kubenswrapper[4816]: I0311 12:14:03.866469 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hl4tp"] Mar 11 12:14:03 crc kubenswrapper[4816]: I0311 12:14:03.868514 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hl4tp" Mar 11 12:14:03 crc kubenswrapper[4816]: I0311 12:14:03.880017 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hl4tp"] Mar 11 12:14:03 crc kubenswrapper[4816]: I0311 12:14:03.992730 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60a4785b-2b65-4a82-984d-611750e6e161-catalog-content\") pod \"redhat-marketplace-hl4tp\" (UID: \"60a4785b-2b65-4a82-984d-611750e6e161\") " pod="openshift-marketplace/redhat-marketplace-hl4tp" Mar 11 12:14:03 crc kubenswrapper[4816]: I0311 12:14:03.993155 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60a4785b-2b65-4a82-984d-611750e6e161-utilities\") pod \"redhat-marketplace-hl4tp\" (UID: \"60a4785b-2b65-4a82-984d-611750e6e161\") " pod="openshift-marketplace/redhat-marketplace-hl4tp" Mar 11 12:14:03 crc kubenswrapper[4816]: I0311 12:14:03.993394 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dg7l\" (UniqueName: \"kubernetes.io/projected/60a4785b-2b65-4a82-984d-611750e6e161-kube-api-access-2dg7l\") pod \"redhat-marketplace-hl4tp\" (UID: \"60a4785b-2b65-4a82-984d-611750e6e161\") " pod="openshift-marketplace/redhat-marketplace-hl4tp" Mar 11 12:14:04 crc kubenswrapper[4816]: I0311 12:14:04.005912 4816 generic.go:334] "Generic (PLEG): container finished" podID="af8a107b-6295-42d4-b64b-7841171f67f3" containerID="37568547e2b255f52263c2130857ff28c18773cdb28a0d8fb13178ff2dc5ab7f" exitCode=0 Mar 11 12:14:04 crc kubenswrapper[4816]: I0311 12:14:04.006015 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553854-hbf96" 
event={"ID":"af8a107b-6295-42d4-b64b-7841171f67f3","Type":"ContainerDied","Data":"37568547e2b255f52263c2130857ff28c18773cdb28a0d8fb13178ff2dc5ab7f"} Mar 11 12:14:04 crc kubenswrapper[4816]: I0311 12:14:04.007582 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sn8n6" podUID="ff0a6696-3d0e-4173-9c1c-260bfdd757fb" containerName="registry-server" containerID="cri-o://81cdb893a840fa2be39a4c7d019e4bfb168456690721e9768fa1711d4887d3df" gracePeriod=2 Mar 11 12:14:04 crc kubenswrapper[4816]: I0311 12:14:04.008773 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-62cp5" event={"ID":"e50b3f6b-4679-4337-a9cf-478aa2fb5800","Type":"ContainerStarted","Data":"fd2724e590cc7a2a85d7400f164e79ed204e7298dae962307ec0591a74334974"} Mar 11 12:14:04 crc kubenswrapper[4816]: I0311 12:14:04.008799 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-62cp5" event={"ID":"e50b3f6b-4679-4337-a9cf-478aa2fb5800","Type":"ContainerStarted","Data":"8f6b1cee2306078cb8905b6b956094561ff8d4a532879cffb9364772e92333cd"} Mar 11 12:14:04 crc kubenswrapper[4816]: I0311 12:14:04.078695 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-62cp5" podStartSLOduration=2.07866855 podStartE2EDuration="2.07866855s" podCreationTimestamp="2026-03-11 12:14:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:14:04.073525119 +0000 UTC m=+930.664789076" watchObservedRunningTime="2026-03-11 12:14:04.07866855 +0000 UTC m=+930.669932517" Mar 11 12:14:04 crc kubenswrapper[4816]: I0311 12:14:04.095179 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60a4785b-2b65-4a82-984d-611750e6e161-utilities\") pod \"redhat-marketplace-hl4tp\" 
(UID: \"60a4785b-2b65-4a82-984d-611750e6e161\") " pod="openshift-marketplace/redhat-marketplace-hl4tp" Mar 11 12:14:04 crc kubenswrapper[4816]: I0311 12:14:04.095238 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60a4785b-2b65-4a82-984d-611750e6e161-utilities\") pod \"redhat-marketplace-hl4tp\" (UID: \"60a4785b-2b65-4a82-984d-611750e6e161\") " pod="openshift-marketplace/redhat-marketplace-hl4tp" Mar 11 12:14:04 crc kubenswrapper[4816]: I0311 12:14:04.095315 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dg7l\" (UniqueName: \"kubernetes.io/projected/60a4785b-2b65-4a82-984d-611750e6e161-kube-api-access-2dg7l\") pod \"redhat-marketplace-hl4tp\" (UID: \"60a4785b-2b65-4a82-984d-611750e6e161\") " pod="openshift-marketplace/redhat-marketplace-hl4tp" Mar 11 12:14:04 crc kubenswrapper[4816]: I0311 12:14:04.095342 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60a4785b-2b65-4a82-984d-611750e6e161-catalog-content\") pod \"redhat-marketplace-hl4tp\" (UID: \"60a4785b-2b65-4a82-984d-611750e6e161\") " pod="openshift-marketplace/redhat-marketplace-hl4tp" Mar 11 12:14:04 crc kubenswrapper[4816]: I0311 12:14:04.103670 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60a4785b-2b65-4a82-984d-611750e6e161-catalog-content\") pod \"redhat-marketplace-hl4tp\" (UID: \"60a4785b-2b65-4a82-984d-611750e6e161\") " pod="openshift-marketplace/redhat-marketplace-hl4tp" Mar 11 12:14:04 crc kubenswrapper[4816]: I0311 12:14:04.129417 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dg7l\" (UniqueName: \"kubernetes.io/projected/60a4785b-2b65-4a82-984d-611750e6e161-kube-api-access-2dg7l\") pod \"redhat-marketplace-hl4tp\" (UID: 
\"60a4785b-2b65-4a82-984d-611750e6e161\") " pod="openshift-marketplace/redhat-marketplace-hl4tp" Mar 11 12:14:04 crc kubenswrapper[4816]: I0311 12:14:04.201967 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hl4tp" Mar 11 12:14:04 crc kubenswrapper[4816]: I0311 12:14:04.510711 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hl4tp"] Mar 11 12:14:04 crc kubenswrapper[4816]: W0311 12:14:04.524027 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60a4785b_2b65_4a82_984d_611750e6e161.slice/crio-f45dd7f636b52fe2c1b2fb545c6a5ddc989520a45cd358b29620164b0db18c6d WatchSource:0}: Error finding container f45dd7f636b52fe2c1b2fb545c6a5ddc989520a45cd358b29620164b0db18c6d: Status 404 returned error can't find the container with id f45dd7f636b52fe2c1b2fb545c6a5ddc989520a45cd358b29620164b0db18c6d Mar 11 12:14:05 crc kubenswrapper[4816]: I0311 12:14:05.015276 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hl4tp" event={"ID":"60a4785b-2b65-4a82-984d-611750e6e161","Type":"ContainerStarted","Data":"f45dd7f636b52fe2c1b2fb545c6a5ddc989520a45cd358b29620164b0db18c6d"} Mar 11 12:14:05 crc kubenswrapper[4816]: I0311 12:14:05.323848 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553854-hbf96" Mar 11 12:14:05 crc kubenswrapper[4816]: I0311 12:14:05.415754 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5b7s4\" (UniqueName: \"kubernetes.io/projected/af8a107b-6295-42d4-b64b-7841171f67f3-kube-api-access-5b7s4\") pod \"af8a107b-6295-42d4-b64b-7841171f67f3\" (UID: \"af8a107b-6295-42d4-b64b-7841171f67f3\") " Mar 11 12:14:05 crc kubenswrapper[4816]: I0311 12:14:05.425488 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af8a107b-6295-42d4-b64b-7841171f67f3-kube-api-access-5b7s4" (OuterVolumeSpecName: "kube-api-access-5b7s4") pod "af8a107b-6295-42d4-b64b-7841171f67f3" (UID: "af8a107b-6295-42d4-b64b-7841171f67f3"). InnerVolumeSpecName "kube-api-access-5b7s4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:14:05 crc kubenswrapper[4816]: I0311 12:14:05.517989 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5b7s4\" (UniqueName: \"kubernetes.io/projected/af8a107b-6295-42d4-b64b-7841171f67f3-kube-api-access-5b7s4\") on node \"crc\" DevicePath \"\"" Mar 11 12:14:06 crc kubenswrapper[4816]: I0311 12:14:06.026450 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553854-hbf96" event={"ID":"af8a107b-6295-42d4-b64b-7841171f67f3","Type":"ContainerDied","Data":"dfea93cb7aee576a6b4033dd21e2e2174a0e4a253adbe8914d90cdec35819e27"} Mar 11 12:14:06 crc kubenswrapper[4816]: I0311 12:14:06.026490 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553854-hbf96" Mar 11 12:14:06 crc kubenswrapper[4816]: I0311 12:14:06.026510 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfea93cb7aee576a6b4033dd21e2e2174a0e4a253adbe8914d90cdec35819e27" Mar 11 12:14:06 crc kubenswrapper[4816]: I0311 12:14:06.043567 4816 generic.go:334] "Generic (PLEG): container finished" podID="ff0a6696-3d0e-4173-9c1c-260bfdd757fb" containerID="81cdb893a840fa2be39a4c7d019e4bfb168456690721e9768fa1711d4887d3df" exitCode=0 Mar 11 12:14:06 crc kubenswrapper[4816]: I0311 12:14:06.043795 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sn8n6" event={"ID":"ff0a6696-3d0e-4173-9c1c-260bfdd757fb","Type":"ContainerDied","Data":"81cdb893a840fa2be39a4c7d019e4bfb168456690721e9768fa1711d4887d3df"} Mar 11 12:14:06 crc kubenswrapper[4816]: I0311 12:14:06.050903 4816 generic.go:334] "Generic (PLEG): container finished" podID="60a4785b-2b65-4a82-984d-611750e6e161" containerID="39b076f96813251238b95cb5e0b4a9dd99b2aa58def497eb7a54a009755a64b3" exitCode=0 Mar 11 12:14:06 crc kubenswrapper[4816]: I0311 12:14:06.050971 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hl4tp" event={"ID":"60a4785b-2b65-4a82-984d-611750e6e161","Type":"ContainerDied","Data":"39b076f96813251238b95cb5e0b4a9dd99b2aa58def497eb7a54a009755a64b3"} Mar 11 12:14:06 crc kubenswrapper[4816]: I0311 12:14:06.236364 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sn8n6" Mar 11 12:14:06 crc kubenswrapper[4816]: I0311 12:14:06.335580 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdjct\" (UniqueName: \"kubernetes.io/projected/ff0a6696-3d0e-4173-9c1c-260bfdd757fb-kube-api-access-zdjct\") pod \"ff0a6696-3d0e-4173-9c1c-260bfdd757fb\" (UID: \"ff0a6696-3d0e-4173-9c1c-260bfdd757fb\") " Mar 11 12:14:06 crc kubenswrapper[4816]: I0311 12:14:06.335810 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff0a6696-3d0e-4173-9c1c-260bfdd757fb-utilities\") pod \"ff0a6696-3d0e-4173-9c1c-260bfdd757fb\" (UID: \"ff0a6696-3d0e-4173-9c1c-260bfdd757fb\") " Mar 11 12:14:06 crc kubenswrapper[4816]: I0311 12:14:06.335939 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff0a6696-3d0e-4173-9c1c-260bfdd757fb-catalog-content\") pod \"ff0a6696-3d0e-4173-9c1c-260bfdd757fb\" (UID: \"ff0a6696-3d0e-4173-9c1c-260bfdd757fb\") " Mar 11 12:14:06 crc kubenswrapper[4816]: I0311 12:14:06.336802 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff0a6696-3d0e-4173-9c1c-260bfdd757fb-utilities" (OuterVolumeSpecName: "utilities") pod "ff0a6696-3d0e-4173-9c1c-260bfdd757fb" (UID: "ff0a6696-3d0e-4173-9c1c-260bfdd757fb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:14:06 crc kubenswrapper[4816]: I0311 12:14:06.354569 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff0a6696-3d0e-4173-9c1c-260bfdd757fb-kube-api-access-zdjct" (OuterVolumeSpecName: "kube-api-access-zdjct") pod "ff0a6696-3d0e-4173-9c1c-260bfdd757fb" (UID: "ff0a6696-3d0e-4173-9c1c-260bfdd757fb"). InnerVolumeSpecName "kube-api-access-zdjct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:14:06 crc kubenswrapper[4816]: I0311 12:14:06.411292 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff0a6696-3d0e-4173-9c1c-260bfdd757fb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ff0a6696-3d0e-4173-9c1c-260bfdd757fb" (UID: "ff0a6696-3d0e-4173-9c1c-260bfdd757fb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:14:06 crc kubenswrapper[4816]: I0311 12:14:06.426464 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553848-blbgg"] Mar 11 12:14:06 crc kubenswrapper[4816]: I0311 12:14:06.435768 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553848-blbgg"] Mar 11 12:14:06 crc kubenswrapper[4816]: I0311 12:14:06.438227 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff0a6696-3d0e-4173-9c1c-260bfdd757fb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 12:14:06 crc kubenswrapper[4816]: I0311 12:14:06.438296 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdjct\" (UniqueName: \"kubernetes.io/projected/ff0a6696-3d0e-4173-9c1c-260bfdd757fb-kube-api-access-zdjct\") on node \"crc\" DevicePath \"\"" Mar 11 12:14:06 crc kubenswrapper[4816]: I0311 12:14:06.438312 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff0a6696-3d0e-4173-9c1c-260bfdd757fb-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 12:14:07 crc kubenswrapper[4816]: I0311 12:14:07.059301 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sn8n6" event={"ID":"ff0a6696-3d0e-4173-9c1c-260bfdd757fb","Type":"ContainerDied","Data":"1126d3ebcef013ee6f9236b25810177dab2d52145cc78104f25351c76ce07c7c"} Mar 11 12:14:07 crc 
kubenswrapper[4816]: I0311 12:14:07.059350 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sn8n6" Mar 11 12:14:07 crc kubenswrapper[4816]: I0311 12:14:07.059373 4816 scope.go:117] "RemoveContainer" containerID="81cdb893a840fa2be39a4c7d019e4bfb168456690721e9768fa1711d4887d3df" Mar 11 12:14:07 crc kubenswrapper[4816]: I0311 12:14:07.078841 4816 scope.go:117] "RemoveContainer" containerID="60d6fa46e4d2deba8d1be2490841b0784c3be0ee06e9d85e187cb81fe38f5f2c" Mar 11 12:14:07 crc kubenswrapper[4816]: I0311 12:14:07.094680 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sn8n6"] Mar 11 12:14:07 crc kubenswrapper[4816]: I0311 12:14:07.098877 4816 scope.go:117] "RemoveContainer" containerID="39208e7933d3e745276296861bf7019ae82dfe513403abeb752230d2b7d5f71e" Mar 11 12:14:07 crc kubenswrapper[4816]: I0311 12:14:07.099612 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sn8n6"] Mar 11 12:14:08 crc kubenswrapper[4816]: I0311 12:14:08.072605 4816 generic.go:334] "Generic (PLEG): container finished" podID="60a4785b-2b65-4a82-984d-611750e6e161" containerID="e61cfd9a2b01983446a43651931726fee3673190ae6f2dbf6a8cbb0dc1f07d0a" exitCode=0 Mar 11 12:14:08 crc kubenswrapper[4816]: I0311 12:14:08.072686 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hl4tp" event={"ID":"60a4785b-2b65-4a82-984d-611750e6e161","Type":"ContainerDied","Data":"e61cfd9a2b01983446a43651931726fee3673190ae6f2dbf6a8cbb0dc1f07d0a"} Mar 11 12:14:08 crc kubenswrapper[4816]: I0311 12:14:08.143648 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf7a354c-a3ec-44fe-8e27-028abd12d7d9" path="/var/lib/kubelet/pods/cf7a354c-a3ec-44fe-8e27-028abd12d7d9/volumes" Mar 11 12:14:08 crc kubenswrapper[4816]: I0311 12:14:08.144932 4816 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="ff0a6696-3d0e-4173-9c1c-260bfdd757fb" path="/var/lib/kubelet/pods/ff0a6696-3d0e-4173-9c1c-260bfdd757fb/volumes" Mar 11 12:14:08 crc kubenswrapper[4816]: I0311 12:14:08.769129 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-z6224" Mar 11 12:14:09 crc kubenswrapper[4816]: I0311 12:14:09.084671 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hl4tp" event={"ID":"60a4785b-2b65-4a82-984d-611750e6e161","Type":"ContainerStarted","Data":"8ff09a236d35220a7c1625025b22e14e4738aad432e4cd1765a46150c683df11"} Mar 11 12:14:09 crc kubenswrapper[4816]: I0311 12:14:09.105301 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hl4tp" podStartSLOduration=3.644991573 podStartE2EDuration="6.105279977s" podCreationTimestamp="2026-03-11 12:14:03 +0000 UTC" firstStartedPulling="2026-03-11 12:14:06.054015743 +0000 UTC m=+932.645279710" lastFinishedPulling="2026-03-11 12:14:08.514304147 +0000 UTC m=+935.105568114" observedRunningTime="2026-03-11 12:14:09.101919708 +0000 UTC m=+935.693183675" watchObservedRunningTime="2026-03-11 12:14:09.105279977 +0000 UTC m=+935.696543954" Mar 11 12:14:11 crc kubenswrapper[4816]: I0311 12:14:11.657880 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-z6224"] Mar 11 12:14:11 crc kubenswrapper[4816]: I0311 12:14:11.659100 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-z6224" podUID="bb816e33-9a16-4326-8058-c328fabcab45" containerName="registry-server" containerID="cri-o://d48b543c419d15ade4d9d40ea1598ce35d14a31ad917766b4e5d4d50431d25bf" gracePeriod=2 Mar 11 12:14:12 crc kubenswrapper[4816]: I0311 12:14:12.051018 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-z6224" Mar 11 12:14:12 crc kubenswrapper[4816]: I0311 12:14:12.108672 4816 generic.go:334] "Generic (PLEG): container finished" podID="bb816e33-9a16-4326-8058-c328fabcab45" containerID="d48b543c419d15ade4d9d40ea1598ce35d14a31ad917766b4e5d4d50431d25bf" exitCode=0 Mar 11 12:14:12 crc kubenswrapper[4816]: I0311 12:14:12.108716 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z6224" event={"ID":"bb816e33-9a16-4326-8058-c328fabcab45","Type":"ContainerDied","Data":"d48b543c419d15ade4d9d40ea1598ce35d14a31ad917766b4e5d4d50431d25bf"} Mar 11 12:14:12 crc kubenswrapper[4816]: I0311 12:14:12.108744 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z6224" event={"ID":"bb816e33-9a16-4326-8058-c328fabcab45","Type":"ContainerDied","Data":"7dbded14f0f4fb04bf34588fd808017260ed068974b08e77201c2fe4778827be"} Mar 11 12:14:12 crc kubenswrapper[4816]: I0311 12:14:12.108763 4816 scope.go:117] "RemoveContainer" containerID="d48b543c419d15ade4d9d40ea1598ce35d14a31ad917766b4e5d4d50431d25bf" Mar 11 12:14:12 crc kubenswrapper[4816]: I0311 12:14:12.108875 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-z6224" Mar 11 12:14:12 crc kubenswrapper[4816]: I0311 12:14:12.129585 4816 scope.go:117] "RemoveContainer" containerID="50dd4694b1375b720ecd5bd8dd4bc0d5000424b020c48dc7352e366a56193c89" Mar 11 12:14:12 crc kubenswrapper[4816]: I0311 12:14:12.144460 4816 scope.go:117] "RemoveContainer" containerID="b22b4212fc4320b38df10fecd8bd4695175cbd5d32439913ffcd7f6485b6d8c2" Mar 11 12:14:12 crc kubenswrapper[4816]: I0311 12:14:12.145232 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb816e33-9a16-4326-8058-c328fabcab45-catalog-content\") pod \"bb816e33-9a16-4326-8058-c328fabcab45\" (UID: \"bb816e33-9a16-4326-8058-c328fabcab45\") " Mar 11 12:14:12 crc kubenswrapper[4816]: I0311 12:14:12.145370 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwbh4\" (UniqueName: \"kubernetes.io/projected/bb816e33-9a16-4326-8058-c328fabcab45-kube-api-access-fwbh4\") pod \"bb816e33-9a16-4326-8058-c328fabcab45\" (UID: \"bb816e33-9a16-4326-8058-c328fabcab45\") " Mar 11 12:14:12 crc kubenswrapper[4816]: I0311 12:14:12.145396 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb816e33-9a16-4326-8058-c328fabcab45-utilities\") pod \"bb816e33-9a16-4326-8058-c328fabcab45\" (UID: \"bb816e33-9a16-4326-8058-c328fabcab45\") " Mar 11 12:14:12 crc kubenswrapper[4816]: I0311 12:14:12.146360 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb816e33-9a16-4326-8058-c328fabcab45-utilities" (OuterVolumeSpecName: "utilities") pod "bb816e33-9a16-4326-8058-c328fabcab45" (UID: "bb816e33-9a16-4326-8058-c328fabcab45"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:14:12 crc kubenswrapper[4816]: I0311 12:14:12.152032 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb816e33-9a16-4326-8058-c328fabcab45-kube-api-access-fwbh4" (OuterVolumeSpecName: "kube-api-access-fwbh4") pod "bb816e33-9a16-4326-8058-c328fabcab45" (UID: "bb816e33-9a16-4326-8058-c328fabcab45"). InnerVolumeSpecName "kube-api-access-fwbh4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:14:12 crc kubenswrapper[4816]: I0311 12:14:12.185453 4816 scope.go:117] "RemoveContainer" containerID="d48b543c419d15ade4d9d40ea1598ce35d14a31ad917766b4e5d4d50431d25bf" Mar 11 12:14:12 crc kubenswrapper[4816]: E0311 12:14:12.185811 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d48b543c419d15ade4d9d40ea1598ce35d14a31ad917766b4e5d4d50431d25bf\": container with ID starting with d48b543c419d15ade4d9d40ea1598ce35d14a31ad917766b4e5d4d50431d25bf not found: ID does not exist" containerID="d48b543c419d15ade4d9d40ea1598ce35d14a31ad917766b4e5d4d50431d25bf" Mar 11 12:14:12 crc kubenswrapper[4816]: I0311 12:14:12.185848 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d48b543c419d15ade4d9d40ea1598ce35d14a31ad917766b4e5d4d50431d25bf"} err="failed to get container status \"d48b543c419d15ade4d9d40ea1598ce35d14a31ad917766b4e5d4d50431d25bf\": rpc error: code = NotFound desc = could not find container \"d48b543c419d15ade4d9d40ea1598ce35d14a31ad917766b4e5d4d50431d25bf\": container with ID starting with d48b543c419d15ade4d9d40ea1598ce35d14a31ad917766b4e5d4d50431d25bf not found: ID does not exist" Mar 11 12:14:12 crc kubenswrapper[4816]: I0311 12:14:12.185877 4816 scope.go:117] "RemoveContainer" containerID="50dd4694b1375b720ecd5bd8dd4bc0d5000424b020c48dc7352e366a56193c89" Mar 11 12:14:12 crc kubenswrapper[4816]: E0311 12:14:12.186102 
4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50dd4694b1375b720ecd5bd8dd4bc0d5000424b020c48dc7352e366a56193c89\": container with ID starting with 50dd4694b1375b720ecd5bd8dd4bc0d5000424b020c48dc7352e366a56193c89 not found: ID does not exist" containerID="50dd4694b1375b720ecd5bd8dd4bc0d5000424b020c48dc7352e366a56193c89" Mar 11 12:14:12 crc kubenswrapper[4816]: I0311 12:14:12.186123 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50dd4694b1375b720ecd5bd8dd4bc0d5000424b020c48dc7352e366a56193c89"} err="failed to get container status \"50dd4694b1375b720ecd5bd8dd4bc0d5000424b020c48dc7352e366a56193c89\": rpc error: code = NotFound desc = could not find container \"50dd4694b1375b720ecd5bd8dd4bc0d5000424b020c48dc7352e366a56193c89\": container with ID starting with 50dd4694b1375b720ecd5bd8dd4bc0d5000424b020c48dc7352e366a56193c89 not found: ID does not exist" Mar 11 12:14:12 crc kubenswrapper[4816]: I0311 12:14:12.186138 4816 scope.go:117] "RemoveContainer" containerID="b22b4212fc4320b38df10fecd8bd4695175cbd5d32439913ffcd7f6485b6d8c2" Mar 11 12:14:12 crc kubenswrapper[4816]: E0311 12:14:12.186372 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b22b4212fc4320b38df10fecd8bd4695175cbd5d32439913ffcd7f6485b6d8c2\": container with ID starting with b22b4212fc4320b38df10fecd8bd4695175cbd5d32439913ffcd7f6485b6d8c2 not found: ID does not exist" containerID="b22b4212fc4320b38df10fecd8bd4695175cbd5d32439913ffcd7f6485b6d8c2" Mar 11 12:14:12 crc kubenswrapper[4816]: I0311 12:14:12.186392 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b22b4212fc4320b38df10fecd8bd4695175cbd5d32439913ffcd7f6485b6d8c2"} err="failed to get container status \"b22b4212fc4320b38df10fecd8bd4695175cbd5d32439913ffcd7f6485b6d8c2\": rpc error: code = 
NotFound desc = could not find container \"b22b4212fc4320b38df10fecd8bd4695175cbd5d32439913ffcd7f6485b6d8c2\": container with ID starting with b22b4212fc4320b38df10fecd8bd4695175cbd5d32439913ffcd7f6485b6d8c2 not found: ID does not exist" Mar 11 12:14:12 crc kubenswrapper[4816]: I0311 12:14:12.204176 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb816e33-9a16-4326-8058-c328fabcab45-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bb816e33-9a16-4326-8058-c328fabcab45" (UID: "bb816e33-9a16-4326-8058-c328fabcab45"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:14:12 crc kubenswrapper[4816]: I0311 12:14:12.246669 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb816e33-9a16-4326-8058-c328fabcab45-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 12:14:12 crc kubenswrapper[4816]: I0311 12:14:12.246706 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwbh4\" (UniqueName: \"kubernetes.io/projected/bb816e33-9a16-4326-8058-c328fabcab45-kube-api-access-fwbh4\") on node \"crc\" DevicePath \"\"" Mar 11 12:14:12 crc kubenswrapper[4816]: I0311 12:14:12.246730 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb816e33-9a16-4326-8058-c328fabcab45-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 12:14:12 crc kubenswrapper[4816]: I0311 12:14:12.439178 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-z6224"] Mar 11 12:14:12 crc kubenswrapper[4816]: I0311 12:14:12.442947 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-z6224"] Mar 11 12:14:14 crc kubenswrapper[4816]: I0311 12:14:14.138645 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="bb816e33-9a16-4326-8058-c328fabcab45" path="/var/lib/kubelet/pods/bb816e33-9a16-4326-8058-c328fabcab45/volumes" Mar 11 12:14:14 crc kubenswrapper[4816]: I0311 12:14:14.203223 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hl4tp" Mar 11 12:14:14 crc kubenswrapper[4816]: I0311 12:14:14.203322 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hl4tp" Mar 11 12:14:14 crc kubenswrapper[4816]: I0311 12:14:14.244900 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hl4tp" Mar 11 12:14:15 crc kubenswrapper[4816]: I0311 12:14:15.166582 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hl4tp" Mar 11 12:14:16 crc kubenswrapper[4816]: I0311 12:14:16.472208 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-zsrdm"] Mar 11 12:14:16 crc kubenswrapper[4816]: E0311 12:14:16.473301 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff0a6696-3d0e-4173-9c1c-260bfdd757fb" containerName="extract-utilities" Mar 11 12:14:16 crc kubenswrapper[4816]: I0311 12:14:16.473515 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff0a6696-3d0e-4173-9c1c-260bfdd757fb" containerName="extract-utilities" Mar 11 12:14:16 crc kubenswrapper[4816]: E0311 12:14:16.473613 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff0a6696-3d0e-4173-9c1c-260bfdd757fb" containerName="registry-server" Mar 11 12:14:16 crc kubenswrapper[4816]: I0311 12:14:16.473635 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff0a6696-3d0e-4173-9c1c-260bfdd757fb" containerName="registry-server" Mar 11 12:14:16 crc kubenswrapper[4816]: E0311 12:14:16.473718 4816 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="af8a107b-6295-42d4-b64b-7841171f67f3" containerName="oc" Mar 11 12:14:16 crc kubenswrapper[4816]: I0311 12:14:16.473784 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="af8a107b-6295-42d4-b64b-7841171f67f3" containerName="oc" Mar 11 12:14:16 crc kubenswrapper[4816]: E0311 12:14:16.473815 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb816e33-9a16-4326-8058-c328fabcab45" containerName="extract-utilities" Mar 11 12:14:16 crc kubenswrapper[4816]: I0311 12:14:16.473835 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb816e33-9a16-4326-8058-c328fabcab45" containerName="extract-utilities" Mar 11 12:14:16 crc kubenswrapper[4816]: E0311 12:14:16.473912 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff0a6696-3d0e-4173-9c1c-260bfdd757fb" containerName="extract-content" Mar 11 12:14:16 crc kubenswrapper[4816]: I0311 12:14:16.473982 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff0a6696-3d0e-4173-9c1c-260bfdd757fb" containerName="extract-content" Mar 11 12:14:16 crc kubenswrapper[4816]: E0311 12:14:16.474021 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb816e33-9a16-4326-8058-c328fabcab45" containerName="registry-server" Mar 11 12:14:16 crc kubenswrapper[4816]: I0311 12:14:16.474083 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb816e33-9a16-4326-8058-c328fabcab45" containerName="registry-server" Mar 11 12:14:16 crc kubenswrapper[4816]: E0311 12:14:16.474173 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb816e33-9a16-4326-8058-c328fabcab45" containerName="extract-content" Mar 11 12:14:16 crc kubenswrapper[4816]: I0311 12:14:16.474201 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb816e33-9a16-4326-8058-c328fabcab45" containerName="extract-content" Mar 11 12:14:16 crc kubenswrapper[4816]: I0311 12:14:16.474688 4816 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="af8a107b-6295-42d4-b64b-7841171f67f3" containerName="oc" Mar 11 12:14:16 crc kubenswrapper[4816]: I0311 12:14:16.474739 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff0a6696-3d0e-4173-9c1c-260bfdd757fb" containerName="registry-server" Mar 11 12:14:16 crc kubenswrapper[4816]: I0311 12:14:16.474772 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb816e33-9a16-4326-8058-c328fabcab45" containerName="registry-server" Mar 11 12:14:16 crc kubenswrapper[4816]: I0311 12:14:16.475768 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-zsrdm" Mar 11 12:14:16 crc kubenswrapper[4816]: I0311 12:14:16.478120 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 11 12:14:16 crc kubenswrapper[4816]: I0311 12:14:16.478244 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 11 12:14:16 crc kubenswrapper[4816]: I0311 12:14:16.478490 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-9g5kf" Mar 11 12:14:16 crc kubenswrapper[4816]: I0311 12:14:16.482854 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-zsrdm"] Mar 11 12:14:16 crc kubenswrapper[4816]: I0311 12:14:16.605532 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzx44\" (UniqueName: \"kubernetes.io/projected/4ed28d20-6f1f-4bb8-853d-284003a6b922-kube-api-access-gzx44\") pod \"openstack-operator-index-zsrdm\" (UID: \"4ed28d20-6f1f-4bb8-853d-284003a6b922\") " pod="openstack-operators/openstack-operator-index-zsrdm" Mar 11 12:14:16 crc kubenswrapper[4816]: I0311 12:14:16.707066 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzx44\" 
(UniqueName: \"kubernetes.io/projected/4ed28d20-6f1f-4bb8-853d-284003a6b922-kube-api-access-gzx44\") pod \"openstack-operator-index-zsrdm\" (UID: \"4ed28d20-6f1f-4bb8-853d-284003a6b922\") " pod="openstack-operators/openstack-operator-index-zsrdm" Mar 11 12:14:16 crc kubenswrapper[4816]: I0311 12:14:16.735629 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzx44\" (UniqueName: \"kubernetes.io/projected/4ed28d20-6f1f-4bb8-853d-284003a6b922-kube-api-access-gzx44\") pod \"openstack-operator-index-zsrdm\" (UID: \"4ed28d20-6f1f-4bb8-853d-284003a6b922\") " pod="openstack-operators/openstack-operator-index-zsrdm" Mar 11 12:14:16 crc kubenswrapper[4816]: I0311 12:14:16.804625 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-zsrdm" Mar 11 12:14:17 crc kubenswrapper[4816]: I0311 12:14:17.193767 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-zsrdm"] Mar 11 12:14:18 crc kubenswrapper[4816]: I0311 12:14:18.147096 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-zsrdm" event={"ID":"4ed28d20-6f1f-4bb8-853d-284003a6b922","Type":"ContainerStarted","Data":"bc47e16ddb2554c3a948cfe55fffa01c25b39ecc19919abe73f2cd6c26a10c85"} Mar 11 12:14:20 crc kubenswrapper[4816]: I0311 12:14:20.164931 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-zsrdm" event={"ID":"4ed28d20-6f1f-4bb8-853d-284003a6b922","Type":"ContainerStarted","Data":"06547a216a99fb53c5fd48aab18860c55dca1b06c356f47e1deb51a7bcdcc0d7"} Mar 11 12:14:20 crc kubenswrapper[4816]: I0311 12:14:20.181094 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-zsrdm" podStartSLOduration=2.340480781 podStartE2EDuration="4.18107326s" podCreationTimestamp="2026-03-11 12:14:16 +0000 UTC" 
firstStartedPulling="2026-03-11 12:14:17.202768582 +0000 UTC m=+943.794032549" lastFinishedPulling="2026-03-11 12:14:19.043361061 +0000 UTC m=+945.634625028" observedRunningTime="2026-03-11 12:14:20.177050421 +0000 UTC m=+946.768314408" watchObservedRunningTime="2026-03-11 12:14:20.18107326 +0000 UTC m=+946.772337237" Mar 11 12:14:21 crc kubenswrapper[4816]: I0311 12:14:21.461879 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hl4tp"] Mar 11 12:14:21 crc kubenswrapper[4816]: I0311 12:14:21.462607 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hl4tp" podUID="60a4785b-2b65-4a82-984d-611750e6e161" containerName="registry-server" containerID="cri-o://8ff09a236d35220a7c1625025b22e14e4738aad432e4cd1765a46150c683df11" gracePeriod=2 Mar 11 12:14:21 crc kubenswrapper[4816]: I0311 12:14:21.835618 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hl4tp" Mar 11 12:14:21 crc kubenswrapper[4816]: I0311 12:14:21.883371 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dg7l\" (UniqueName: \"kubernetes.io/projected/60a4785b-2b65-4a82-984d-611750e6e161-kube-api-access-2dg7l\") pod \"60a4785b-2b65-4a82-984d-611750e6e161\" (UID: \"60a4785b-2b65-4a82-984d-611750e6e161\") " Mar 11 12:14:21 crc kubenswrapper[4816]: I0311 12:14:21.883756 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60a4785b-2b65-4a82-984d-611750e6e161-catalog-content\") pod \"60a4785b-2b65-4a82-984d-611750e6e161\" (UID: \"60a4785b-2b65-4a82-984d-611750e6e161\") " Mar 11 12:14:21 crc kubenswrapper[4816]: I0311 12:14:21.883875 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/60a4785b-2b65-4a82-984d-611750e6e161-utilities\") pod \"60a4785b-2b65-4a82-984d-611750e6e161\" (UID: \"60a4785b-2b65-4a82-984d-611750e6e161\") " Mar 11 12:14:21 crc kubenswrapper[4816]: I0311 12:14:21.884869 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60a4785b-2b65-4a82-984d-611750e6e161-utilities" (OuterVolumeSpecName: "utilities") pod "60a4785b-2b65-4a82-984d-611750e6e161" (UID: "60a4785b-2b65-4a82-984d-611750e6e161"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:14:21 crc kubenswrapper[4816]: I0311 12:14:21.885583 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60a4785b-2b65-4a82-984d-611750e6e161-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 12:14:21 crc kubenswrapper[4816]: I0311 12:14:21.892624 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60a4785b-2b65-4a82-984d-611750e6e161-kube-api-access-2dg7l" (OuterVolumeSpecName: "kube-api-access-2dg7l") pod "60a4785b-2b65-4a82-984d-611750e6e161" (UID: "60a4785b-2b65-4a82-984d-611750e6e161"). InnerVolumeSpecName "kube-api-access-2dg7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:14:21 crc kubenswrapper[4816]: I0311 12:14:21.909835 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60a4785b-2b65-4a82-984d-611750e6e161-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "60a4785b-2b65-4a82-984d-611750e6e161" (UID: "60a4785b-2b65-4a82-984d-611750e6e161"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:14:21 crc kubenswrapper[4816]: I0311 12:14:21.986624 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60a4785b-2b65-4a82-984d-611750e6e161-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 12:14:21 crc kubenswrapper[4816]: I0311 12:14:21.986660 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dg7l\" (UniqueName: \"kubernetes.io/projected/60a4785b-2b65-4a82-984d-611750e6e161-kube-api-access-2dg7l\") on node \"crc\" DevicePath \"\"" Mar 11 12:14:22 crc kubenswrapper[4816]: I0311 12:14:22.180582 4816 generic.go:334] "Generic (PLEG): container finished" podID="60a4785b-2b65-4a82-984d-611750e6e161" containerID="8ff09a236d35220a7c1625025b22e14e4738aad432e4cd1765a46150c683df11" exitCode=0 Mar 11 12:14:22 crc kubenswrapper[4816]: I0311 12:14:22.180630 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hl4tp" event={"ID":"60a4785b-2b65-4a82-984d-611750e6e161","Type":"ContainerDied","Data":"8ff09a236d35220a7c1625025b22e14e4738aad432e4cd1765a46150c683df11"} Mar 11 12:14:22 crc kubenswrapper[4816]: I0311 12:14:22.180663 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hl4tp" event={"ID":"60a4785b-2b65-4a82-984d-611750e6e161","Type":"ContainerDied","Data":"f45dd7f636b52fe2c1b2fb545c6a5ddc989520a45cd358b29620164b0db18c6d"} Mar 11 12:14:22 crc kubenswrapper[4816]: I0311 12:14:22.180684 4816 scope.go:117] "RemoveContainer" containerID="8ff09a236d35220a7c1625025b22e14e4738aad432e4cd1765a46150c683df11" Mar 11 12:14:22 crc kubenswrapper[4816]: I0311 12:14:22.180816 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hl4tp" Mar 11 12:14:22 crc kubenswrapper[4816]: I0311 12:14:22.207618 4816 scope.go:117] "RemoveContainer" containerID="e61cfd9a2b01983446a43651931726fee3673190ae6f2dbf6a8cbb0dc1f07d0a" Mar 11 12:14:22 crc kubenswrapper[4816]: I0311 12:14:22.215198 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hl4tp"] Mar 11 12:14:22 crc kubenswrapper[4816]: I0311 12:14:22.220316 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hl4tp"] Mar 11 12:14:22 crc kubenswrapper[4816]: I0311 12:14:22.645137 4816 scope.go:117] "RemoveContainer" containerID="39b076f96813251238b95cb5e0b4a9dd99b2aa58def497eb7a54a009755a64b3" Mar 11 12:14:22 crc kubenswrapper[4816]: I0311 12:14:22.678151 4816 scope.go:117] "RemoveContainer" containerID="8ff09a236d35220a7c1625025b22e14e4738aad432e4cd1765a46150c683df11" Mar 11 12:14:22 crc kubenswrapper[4816]: E0311 12:14:22.679241 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ff09a236d35220a7c1625025b22e14e4738aad432e4cd1765a46150c683df11\": container with ID starting with 8ff09a236d35220a7c1625025b22e14e4738aad432e4cd1765a46150c683df11 not found: ID does not exist" containerID="8ff09a236d35220a7c1625025b22e14e4738aad432e4cd1765a46150c683df11" Mar 11 12:14:22 crc kubenswrapper[4816]: I0311 12:14:22.679307 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ff09a236d35220a7c1625025b22e14e4738aad432e4cd1765a46150c683df11"} err="failed to get container status \"8ff09a236d35220a7c1625025b22e14e4738aad432e4cd1765a46150c683df11\": rpc error: code = NotFound desc = could not find container \"8ff09a236d35220a7c1625025b22e14e4738aad432e4cd1765a46150c683df11\": container with ID starting with 8ff09a236d35220a7c1625025b22e14e4738aad432e4cd1765a46150c683df11 not found: 
ID does not exist" Mar 11 12:14:22 crc kubenswrapper[4816]: I0311 12:14:22.679339 4816 scope.go:117] "RemoveContainer" containerID="e61cfd9a2b01983446a43651931726fee3673190ae6f2dbf6a8cbb0dc1f07d0a" Mar 11 12:14:22 crc kubenswrapper[4816]: E0311 12:14:22.679689 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e61cfd9a2b01983446a43651931726fee3673190ae6f2dbf6a8cbb0dc1f07d0a\": container with ID starting with e61cfd9a2b01983446a43651931726fee3673190ae6f2dbf6a8cbb0dc1f07d0a not found: ID does not exist" containerID="e61cfd9a2b01983446a43651931726fee3673190ae6f2dbf6a8cbb0dc1f07d0a" Mar 11 12:14:22 crc kubenswrapper[4816]: I0311 12:14:22.679731 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e61cfd9a2b01983446a43651931726fee3673190ae6f2dbf6a8cbb0dc1f07d0a"} err="failed to get container status \"e61cfd9a2b01983446a43651931726fee3673190ae6f2dbf6a8cbb0dc1f07d0a\": rpc error: code = NotFound desc = could not find container \"e61cfd9a2b01983446a43651931726fee3673190ae6f2dbf6a8cbb0dc1f07d0a\": container with ID starting with e61cfd9a2b01983446a43651931726fee3673190ae6f2dbf6a8cbb0dc1f07d0a not found: ID does not exist" Mar 11 12:14:22 crc kubenswrapper[4816]: I0311 12:14:22.679761 4816 scope.go:117] "RemoveContainer" containerID="39b076f96813251238b95cb5e0b4a9dd99b2aa58def497eb7a54a009755a64b3" Mar 11 12:14:22 crc kubenswrapper[4816]: E0311 12:14:22.680170 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39b076f96813251238b95cb5e0b4a9dd99b2aa58def497eb7a54a009755a64b3\": container with ID starting with 39b076f96813251238b95cb5e0b4a9dd99b2aa58def497eb7a54a009755a64b3 not found: ID does not exist" containerID="39b076f96813251238b95cb5e0b4a9dd99b2aa58def497eb7a54a009755a64b3" Mar 11 12:14:22 crc kubenswrapper[4816]: I0311 12:14:22.680210 4816 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39b076f96813251238b95cb5e0b4a9dd99b2aa58def497eb7a54a009755a64b3"} err="failed to get container status \"39b076f96813251238b95cb5e0b4a9dd99b2aa58def497eb7a54a009755a64b3\": rpc error: code = NotFound desc = could not find container \"39b076f96813251238b95cb5e0b4a9dd99b2aa58def497eb7a54a009755a64b3\": container with ID starting with 39b076f96813251238b95cb5e0b4a9dd99b2aa58def497eb7a54a009755a64b3 not found: ID does not exist" Mar 11 12:14:24 crc kubenswrapper[4816]: I0311 12:14:24.142883 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60a4785b-2b65-4a82-984d-611750e6e161" path="/var/lib/kubelet/pods/60a4785b-2b65-4a82-984d-611750e6e161/volumes" Mar 11 12:14:26 crc kubenswrapper[4816]: I0311 12:14:26.805036 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-zsrdm" Mar 11 12:14:26 crc kubenswrapper[4816]: I0311 12:14:26.805308 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-zsrdm" Mar 11 12:14:26 crc kubenswrapper[4816]: I0311 12:14:26.856692 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-zsrdm" Mar 11 12:14:27 crc kubenswrapper[4816]: I0311 12:14:27.263544 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-zsrdm" Mar 11 12:14:30 crc kubenswrapper[4816]: I0311 12:14:30.115562 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp"] Mar 11 12:14:30 crc kubenswrapper[4816]: E0311 12:14:30.116183 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60a4785b-2b65-4a82-984d-611750e6e161" containerName="extract-content" Mar 11 12:14:30 crc kubenswrapper[4816]: I0311 12:14:30.116196 4816 
state_mem.go:107] "Deleted CPUSet assignment" podUID="60a4785b-2b65-4a82-984d-611750e6e161" containerName="extract-content" Mar 11 12:14:30 crc kubenswrapper[4816]: E0311 12:14:30.116218 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60a4785b-2b65-4a82-984d-611750e6e161" containerName="registry-server" Mar 11 12:14:30 crc kubenswrapper[4816]: I0311 12:14:30.116224 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="60a4785b-2b65-4a82-984d-611750e6e161" containerName="registry-server" Mar 11 12:14:30 crc kubenswrapper[4816]: E0311 12:14:30.116240 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60a4785b-2b65-4a82-984d-611750e6e161" containerName="extract-utilities" Mar 11 12:14:30 crc kubenswrapper[4816]: I0311 12:14:30.116280 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="60a4785b-2b65-4a82-984d-611750e6e161" containerName="extract-utilities" Mar 11 12:14:30 crc kubenswrapper[4816]: I0311 12:14:30.116408 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="60a4785b-2b65-4a82-984d-611750e6e161" containerName="registry-server" Mar 11 12:14:30 crc kubenswrapper[4816]: I0311 12:14:30.117236 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp" Mar 11 12:14:30 crc kubenswrapper[4816]: I0311 12:14:30.120959 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-szh9h" Mar 11 12:14:30 crc kubenswrapper[4816]: I0311 12:14:30.141628 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp"] Mar 11 12:14:30 crc kubenswrapper[4816]: I0311 12:14:30.199562 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b89bbc79-4a51-434d-916c-bf02869be9cb-util\") pod \"f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp\" (UID: \"b89bbc79-4a51-434d-916c-bf02869be9cb\") " pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp" Mar 11 12:14:30 crc kubenswrapper[4816]: I0311 12:14:30.199620 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b89bbc79-4a51-434d-916c-bf02869be9cb-bundle\") pod \"f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp\" (UID: \"b89bbc79-4a51-434d-916c-bf02869be9cb\") " pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp" Mar 11 12:14:30 crc kubenswrapper[4816]: I0311 12:14:30.199668 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc647\" (UniqueName: \"kubernetes.io/projected/b89bbc79-4a51-434d-916c-bf02869be9cb-kube-api-access-zc647\") pod \"f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp\" (UID: \"b89bbc79-4a51-434d-916c-bf02869be9cb\") " pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp" Mar 11 12:14:30 crc kubenswrapper[4816]: I0311 
12:14:30.301236 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b89bbc79-4a51-434d-916c-bf02869be9cb-bundle\") pod \"f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp\" (UID: \"b89bbc79-4a51-434d-916c-bf02869be9cb\") " pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp" Mar 11 12:14:30 crc kubenswrapper[4816]: I0311 12:14:30.301350 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zc647\" (UniqueName: \"kubernetes.io/projected/b89bbc79-4a51-434d-916c-bf02869be9cb-kube-api-access-zc647\") pod \"f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp\" (UID: \"b89bbc79-4a51-434d-916c-bf02869be9cb\") " pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp" Mar 11 12:14:30 crc kubenswrapper[4816]: I0311 12:14:30.301462 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b89bbc79-4a51-434d-916c-bf02869be9cb-util\") pod \"f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp\" (UID: \"b89bbc79-4a51-434d-916c-bf02869be9cb\") " pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp" Mar 11 12:14:30 crc kubenswrapper[4816]: I0311 12:14:30.301727 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b89bbc79-4a51-434d-916c-bf02869be9cb-bundle\") pod \"f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp\" (UID: \"b89bbc79-4a51-434d-916c-bf02869be9cb\") " pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp" Mar 11 12:14:30 crc kubenswrapper[4816]: I0311 12:14:30.302051 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/b89bbc79-4a51-434d-916c-bf02869be9cb-util\") pod \"f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp\" (UID: \"b89bbc79-4a51-434d-916c-bf02869be9cb\") " pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp" Mar 11 12:14:30 crc kubenswrapper[4816]: I0311 12:14:30.321734 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc647\" (UniqueName: \"kubernetes.io/projected/b89bbc79-4a51-434d-916c-bf02869be9cb-kube-api-access-zc647\") pod \"f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp\" (UID: \"b89bbc79-4a51-434d-916c-bf02869be9cb\") " pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp" Mar 11 12:14:30 crc kubenswrapper[4816]: I0311 12:14:30.436922 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp" Mar 11 12:14:30 crc kubenswrapper[4816]: I0311 12:14:30.762686 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp"] Mar 11 12:14:31 crc kubenswrapper[4816]: I0311 12:14:31.256617 4816 generic.go:334] "Generic (PLEG): container finished" podID="b89bbc79-4a51-434d-916c-bf02869be9cb" containerID="7181d533daaec28ea763a1ea9b634ea77a11c32e37954305eff24aceadd1e6f0" exitCode=0 Mar 11 12:14:31 crc kubenswrapper[4816]: I0311 12:14:31.256674 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp" event={"ID":"b89bbc79-4a51-434d-916c-bf02869be9cb","Type":"ContainerDied","Data":"7181d533daaec28ea763a1ea9b634ea77a11c32e37954305eff24aceadd1e6f0"} Mar 11 12:14:31 crc kubenswrapper[4816]: I0311 12:14:31.256743 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp" event={"ID":"b89bbc79-4a51-434d-916c-bf02869be9cb","Type":"ContainerStarted","Data":"0f7ae4f3a64f0f17be7cd7cbfd6e70571aba9c54e9c7a8d2350c0a75f577adf3"} Mar 11 12:14:33 crc kubenswrapper[4816]: I0311 12:14:33.278215 4816 generic.go:334] "Generic (PLEG): container finished" podID="b89bbc79-4a51-434d-916c-bf02869be9cb" containerID="0317c320a989f9d7e0d8233532a8d978d4493f5715d4fa8c266bef1645d03952" exitCode=0 Mar 11 12:14:33 crc kubenswrapper[4816]: I0311 12:14:33.278343 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp" event={"ID":"b89bbc79-4a51-434d-916c-bf02869be9cb","Type":"ContainerDied","Data":"0317c320a989f9d7e0d8233532a8d978d4493f5715d4fa8c266bef1645d03952"} Mar 11 12:14:34 crc kubenswrapper[4816]: I0311 12:14:34.288669 4816 generic.go:334] "Generic (PLEG): container finished" podID="b89bbc79-4a51-434d-916c-bf02869be9cb" containerID="3829554e987ca45394aac167f62c69b8a8354a92f096cc5380d4c5403050c8ed" exitCode=0 Mar 11 12:14:34 crc kubenswrapper[4816]: I0311 12:14:34.288718 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp" event={"ID":"b89bbc79-4a51-434d-916c-bf02869be9cb","Type":"ContainerDied","Data":"3829554e987ca45394aac167f62c69b8a8354a92f096cc5380d4c5403050c8ed"} Mar 11 12:14:34 crc kubenswrapper[4816]: I0311 12:14:34.708411 4816 scope.go:117] "RemoveContainer" containerID="ce753e08f0c29759ce4abeca1c2ba4ffc8217be9eee018a375b073d4682d5231" Mar 11 12:14:35 crc kubenswrapper[4816]: I0311 12:14:35.590362 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp" Mar 11 12:14:35 crc kubenswrapper[4816]: I0311 12:14:35.696016 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b89bbc79-4a51-434d-916c-bf02869be9cb-bundle\") pod \"b89bbc79-4a51-434d-916c-bf02869be9cb\" (UID: \"b89bbc79-4a51-434d-916c-bf02869be9cb\") " Mar 11 12:14:35 crc kubenswrapper[4816]: I0311 12:14:35.696169 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b89bbc79-4a51-434d-916c-bf02869be9cb-util\") pod \"b89bbc79-4a51-434d-916c-bf02869be9cb\" (UID: \"b89bbc79-4a51-434d-916c-bf02869be9cb\") " Mar 11 12:14:35 crc kubenswrapper[4816]: I0311 12:14:35.696319 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zc647\" (UniqueName: \"kubernetes.io/projected/b89bbc79-4a51-434d-916c-bf02869be9cb-kube-api-access-zc647\") pod \"b89bbc79-4a51-434d-916c-bf02869be9cb\" (UID: \"b89bbc79-4a51-434d-916c-bf02869be9cb\") " Mar 11 12:14:35 crc kubenswrapper[4816]: I0311 12:14:35.696762 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b89bbc79-4a51-434d-916c-bf02869be9cb-bundle" (OuterVolumeSpecName: "bundle") pod "b89bbc79-4a51-434d-916c-bf02869be9cb" (UID: "b89bbc79-4a51-434d-916c-bf02869be9cb"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:14:35 crc kubenswrapper[4816]: I0311 12:14:35.696915 4816 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b89bbc79-4a51-434d-916c-bf02869be9cb-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:14:35 crc kubenswrapper[4816]: I0311 12:14:35.701893 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b89bbc79-4a51-434d-916c-bf02869be9cb-kube-api-access-zc647" (OuterVolumeSpecName: "kube-api-access-zc647") pod "b89bbc79-4a51-434d-916c-bf02869be9cb" (UID: "b89bbc79-4a51-434d-916c-bf02869be9cb"). InnerVolumeSpecName "kube-api-access-zc647". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:14:35 crc kubenswrapper[4816]: I0311 12:14:35.711610 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b89bbc79-4a51-434d-916c-bf02869be9cb-util" (OuterVolumeSpecName: "util") pod "b89bbc79-4a51-434d-916c-bf02869be9cb" (UID: "b89bbc79-4a51-434d-916c-bf02869be9cb"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:14:35 crc kubenswrapper[4816]: I0311 12:14:35.797901 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zc647\" (UniqueName: \"kubernetes.io/projected/b89bbc79-4a51-434d-916c-bf02869be9cb-kube-api-access-zc647\") on node \"crc\" DevicePath \"\"" Mar 11 12:14:35 crc kubenswrapper[4816]: I0311 12:14:35.797951 4816 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b89bbc79-4a51-434d-916c-bf02869be9cb-util\") on node \"crc\" DevicePath \"\"" Mar 11 12:14:36 crc kubenswrapper[4816]: E0311 12:14:36.226282 4816 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb89bbc79_4a51_434d_916c_bf02869be9cb.slice\": RecentStats: unable to find data in memory cache]" Mar 11 12:14:36 crc kubenswrapper[4816]: I0311 12:14:36.309486 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp" event={"ID":"b89bbc79-4a51-434d-916c-bf02869be9cb","Type":"ContainerDied","Data":"0f7ae4f3a64f0f17be7cd7cbfd6e70571aba9c54e9c7a8d2350c0a75f577adf3"} Mar 11 12:14:36 crc kubenswrapper[4816]: I0311 12:14:36.309526 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f7ae4f3a64f0f17be7cd7cbfd6e70571aba9c54e9c7a8d2350c0a75f577adf3" Mar 11 12:14:36 crc kubenswrapper[4816]: I0311 12:14:36.310179 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp" Mar 11 12:14:39 crc kubenswrapper[4816]: I0311 12:14:39.515100 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:14:39 crc kubenswrapper[4816]: I0311 12:14:39.515508 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:14:42 crc kubenswrapper[4816]: I0311 12:14:42.271301 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-65b9994cf8-zz7rl"] Mar 11 12:14:42 crc kubenswrapper[4816]: E0311 12:14:42.272353 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b89bbc79-4a51-434d-916c-bf02869be9cb" containerName="util" Mar 11 12:14:42 crc kubenswrapper[4816]: I0311 12:14:42.272455 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="b89bbc79-4a51-434d-916c-bf02869be9cb" containerName="util" Mar 11 12:14:42 crc kubenswrapper[4816]: E0311 12:14:42.272536 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b89bbc79-4a51-434d-916c-bf02869be9cb" containerName="pull" Mar 11 12:14:42 crc kubenswrapper[4816]: I0311 12:14:42.272618 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="b89bbc79-4a51-434d-916c-bf02869be9cb" containerName="pull" Mar 11 12:14:42 crc kubenswrapper[4816]: E0311 12:14:42.272738 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b89bbc79-4a51-434d-916c-bf02869be9cb" 
containerName="extract" Mar 11 12:14:42 crc kubenswrapper[4816]: I0311 12:14:42.272823 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="b89bbc79-4a51-434d-916c-bf02869be9cb" containerName="extract" Mar 11 12:14:42 crc kubenswrapper[4816]: I0311 12:14:42.273039 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="b89bbc79-4a51-434d-916c-bf02869be9cb" containerName="extract" Mar 11 12:14:42 crc kubenswrapper[4816]: I0311 12:14:42.273617 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-65b9994cf8-zz7rl" Mar 11 12:14:42 crc kubenswrapper[4816]: I0311 12:14:42.276810 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-hv4gd" Mar 11 12:14:42 crc kubenswrapper[4816]: I0311 12:14:42.353498 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-65b9994cf8-zz7rl"] Mar 11 12:14:42 crc kubenswrapper[4816]: I0311 12:14:42.398453 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b7lj\" (UniqueName: \"kubernetes.io/projected/0347df32-1ff0-463e-b073-077df8f41595-kube-api-access-9b7lj\") pod \"openstack-operator-controller-init-65b9994cf8-zz7rl\" (UID: \"0347df32-1ff0-463e-b073-077df8f41595\") " pod="openstack-operators/openstack-operator-controller-init-65b9994cf8-zz7rl" Mar 11 12:14:42 crc kubenswrapper[4816]: I0311 12:14:42.500116 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9b7lj\" (UniqueName: \"kubernetes.io/projected/0347df32-1ff0-463e-b073-077df8f41595-kube-api-access-9b7lj\") pod \"openstack-operator-controller-init-65b9994cf8-zz7rl\" (UID: \"0347df32-1ff0-463e-b073-077df8f41595\") " pod="openstack-operators/openstack-operator-controller-init-65b9994cf8-zz7rl" Mar 11 12:14:42 crc kubenswrapper[4816]: 
I0311 12:14:42.520341 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9b7lj\" (UniqueName: \"kubernetes.io/projected/0347df32-1ff0-463e-b073-077df8f41595-kube-api-access-9b7lj\") pod \"openstack-operator-controller-init-65b9994cf8-zz7rl\" (UID: \"0347df32-1ff0-463e-b073-077df8f41595\") " pod="openstack-operators/openstack-operator-controller-init-65b9994cf8-zz7rl" Mar 11 12:14:42 crc kubenswrapper[4816]: I0311 12:14:42.590695 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-65b9994cf8-zz7rl" Mar 11 12:14:43 crc kubenswrapper[4816]: I0311 12:14:43.045661 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-65b9994cf8-zz7rl"] Mar 11 12:14:43 crc kubenswrapper[4816]: W0311 12:14:43.046776 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0347df32_1ff0_463e_b073_077df8f41595.slice/crio-2ca89b047ad118254203b9f9d0c2e99b5ef8214f1a22eab79f3ae66bebb3d17f WatchSource:0}: Error finding container 2ca89b047ad118254203b9f9d0c2e99b5ef8214f1a22eab79f3ae66bebb3d17f: Status 404 returned error can't find the container with id 2ca89b047ad118254203b9f9d0c2e99b5ef8214f1a22eab79f3ae66bebb3d17f Mar 11 12:14:43 crc kubenswrapper[4816]: I0311 12:14:43.354127 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-65b9994cf8-zz7rl" event={"ID":"0347df32-1ff0-463e-b073-077df8f41595","Type":"ContainerStarted","Data":"2ca89b047ad118254203b9f9d0c2e99b5ef8214f1a22eab79f3ae66bebb3d17f"} Mar 11 12:14:54 crc kubenswrapper[4816]: I0311 12:14:54.445368 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-65b9994cf8-zz7rl" 
event={"ID":"0347df32-1ff0-463e-b073-077df8f41595","Type":"ContainerStarted","Data":"1a2efe43b36e5cbc49391b95d744323354a32fe47b14aebca8806b630cc9062c"} Mar 11 12:14:54 crc kubenswrapper[4816]: I0311 12:14:54.446021 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-65b9994cf8-zz7rl" Mar 11 12:14:54 crc kubenswrapper[4816]: I0311 12:14:54.487949 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-65b9994cf8-zz7rl" podStartSLOduration=1.909275928 podStartE2EDuration="12.487931938s" podCreationTimestamp="2026-03-11 12:14:42 +0000 UTC" firstStartedPulling="2026-03-11 12:14:43.049315867 +0000 UTC m=+969.640579834" lastFinishedPulling="2026-03-11 12:14:53.627971827 +0000 UTC m=+980.219235844" observedRunningTime="2026-03-11 12:14:54.48558317 +0000 UTC m=+981.076847137" watchObservedRunningTime="2026-03-11 12:14:54.487931938 +0000 UTC m=+981.079195905" Mar 11 12:15:00 crc kubenswrapper[4816]: I0311 12:15:00.152537 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553855-l4sqr"] Mar 11 12:15:00 crc kubenswrapper[4816]: I0311 12:15:00.154711 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553855-l4sqr" Mar 11 12:15:00 crc kubenswrapper[4816]: I0311 12:15:00.156964 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 11 12:15:00 crc kubenswrapper[4816]: I0311 12:15:00.157685 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 11 12:15:00 crc kubenswrapper[4816]: I0311 12:15:00.165050 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553855-l4sqr"] Mar 11 12:15:00 crc kubenswrapper[4816]: I0311 12:15:00.297642 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a876e965-7c6d-4773-9c9b-f445411c559b-secret-volume\") pod \"collect-profiles-29553855-l4sqr\" (UID: \"a876e965-7c6d-4773-9c9b-f445411c559b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553855-l4sqr" Mar 11 12:15:00 crc kubenswrapper[4816]: I0311 12:15:00.297715 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkstm\" (UniqueName: \"kubernetes.io/projected/a876e965-7c6d-4773-9c9b-f445411c559b-kube-api-access-lkstm\") pod \"collect-profiles-29553855-l4sqr\" (UID: \"a876e965-7c6d-4773-9c9b-f445411c559b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553855-l4sqr" Mar 11 12:15:00 crc kubenswrapper[4816]: I0311 12:15:00.298258 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a876e965-7c6d-4773-9c9b-f445411c559b-config-volume\") pod \"collect-profiles-29553855-l4sqr\" (UID: \"a876e965-7c6d-4773-9c9b-f445411c559b\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29553855-l4sqr" Mar 11 12:15:00 crc kubenswrapper[4816]: I0311 12:15:00.401217 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a876e965-7c6d-4773-9c9b-f445411c559b-config-volume\") pod \"collect-profiles-29553855-l4sqr\" (UID: \"a876e965-7c6d-4773-9c9b-f445411c559b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553855-l4sqr" Mar 11 12:15:00 crc kubenswrapper[4816]: I0311 12:15:00.401382 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a876e965-7c6d-4773-9c9b-f445411c559b-secret-volume\") pod \"collect-profiles-29553855-l4sqr\" (UID: \"a876e965-7c6d-4773-9c9b-f445411c559b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553855-l4sqr" Mar 11 12:15:00 crc kubenswrapper[4816]: I0311 12:15:00.401461 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkstm\" (UniqueName: \"kubernetes.io/projected/a876e965-7c6d-4773-9c9b-f445411c559b-kube-api-access-lkstm\") pod \"collect-profiles-29553855-l4sqr\" (UID: \"a876e965-7c6d-4773-9c9b-f445411c559b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553855-l4sqr" Mar 11 12:15:00 crc kubenswrapper[4816]: I0311 12:15:00.402892 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a876e965-7c6d-4773-9c9b-f445411c559b-config-volume\") pod \"collect-profiles-29553855-l4sqr\" (UID: \"a876e965-7c6d-4773-9c9b-f445411c559b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553855-l4sqr" Mar 11 12:15:00 crc kubenswrapper[4816]: I0311 12:15:00.416272 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/a876e965-7c6d-4773-9c9b-f445411c559b-secret-volume\") pod \"collect-profiles-29553855-l4sqr\" (UID: \"a876e965-7c6d-4773-9c9b-f445411c559b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553855-l4sqr" Mar 11 12:15:00 crc kubenswrapper[4816]: I0311 12:15:00.428493 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkstm\" (UniqueName: \"kubernetes.io/projected/a876e965-7c6d-4773-9c9b-f445411c559b-kube-api-access-lkstm\") pod \"collect-profiles-29553855-l4sqr\" (UID: \"a876e965-7c6d-4773-9c9b-f445411c559b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553855-l4sqr" Mar 11 12:15:00 crc kubenswrapper[4816]: I0311 12:15:00.485745 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553855-l4sqr" Mar 11 12:15:00 crc kubenswrapper[4816]: I0311 12:15:00.712978 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553855-l4sqr"] Mar 11 12:15:01 crc kubenswrapper[4816]: I0311 12:15:01.495853 4816 generic.go:334] "Generic (PLEG): container finished" podID="a876e965-7c6d-4773-9c9b-f445411c559b" containerID="f55a9848386a64adca827b95cdc172bd623f9f4d2757b50c73cba6bd74ab25e2" exitCode=0 Mar 11 12:15:01 crc kubenswrapper[4816]: I0311 12:15:01.495897 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553855-l4sqr" event={"ID":"a876e965-7c6d-4773-9c9b-f445411c559b","Type":"ContainerDied","Data":"f55a9848386a64adca827b95cdc172bd623f9f4d2757b50c73cba6bd74ab25e2"} Mar 11 12:15:01 crc kubenswrapper[4816]: I0311 12:15:01.495925 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553855-l4sqr" 
event={"ID":"a876e965-7c6d-4773-9c9b-f445411c559b","Type":"ContainerStarted","Data":"b0d101521d949bdb643e3f587d74f6788adbc883ab03a4afcd19669250b4364c"} Mar 11 12:15:02 crc kubenswrapper[4816]: I0311 12:15:02.595324 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-65b9994cf8-zz7rl" Mar 11 12:15:02 crc kubenswrapper[4816]: I0311 12:15:02.793020 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553855-l4sqr" Mar 11 12:15:02 crc kubenswrapper[4816]: I0311 12:15:02.942546 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a876e965-7c6d-4773-9c9b-f445411c559b-secret-volume\") pod \"a876e965-7c6d-4773-9c9b-f445411c559b\" (UID: \"a876e965-7c6d-4773-9c9b-f445411c559b\") " Mar 11 12:15:02 crc kubenswrapper[4816]: I0311 12:15:02.943396 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkstm\" (UniqueName: \"kubernetes.io/projected/a876e965-7c6d-4773-9c9b-f445411c559b-kube-api-access-lkstm\") pod \"a876e965-7c6d-4773-9c9b-f445411c559b\" (UID: \"a876e965-7c6d-4773-9c9b-f445411c559b\") " Mar 11 12:15:02 crc kubenswrapper[4816]: I0311 12:15:02.943556 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a876e965-7c6d-4773-9c9b-f445411c559b-config-volume\") pod \"a876e965-7c6d-4773-9c9b-f445411c559b\" (UID: \"a876e965-7c6d-4773-9c9b-f445411c559b\") " Mar 11 12:15:02 crc kubenswrapper[4816]: I0311 12:15:02.944235 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a876e965-7c6d-4773-9c9b-f445411c559b-config-volume" (OuterVolumeSpecName: "config-volume") pod "a876e965-7c6d-4773-9c9b-f445411c559b" (UID: 
"a876e965-7c6d-4773-9c9b-f445411c559b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:15:02 crc kubenswrapper[4816]: I0311 12:15:02.950437 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a876e965-7c6d-4773-9c9b-f445411c559b-kube-api-access-lkstm" (OuterVolumeSpecName: "kube-api-access-lkstm") pod "a876e965-7c6d-4773-9c9b-f445411c559b" (UID: "a876e965-7c6d-4773-9c9b-f445411c559b"). InnerVolumeSpecName "kube-api-access-lkstm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:15:02 crc kubenswrapper[4816]: I0311 12:15:02.951899 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a876e965-7c6d-4773-9c9b-f445411c559b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a876e965-7c6d-4773-9c9b-f445411c559b" (UID: "a876e965-7c6d-4773-9c9b-f445411c559b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:15:03 crc kubenswrapper[4816]: I0311 12:15:03.045819 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkstm\" (UniqueName: \"kubernetes.io/projected/a876e965-7c6d-4773-9c9b-f445411c559b-kube-api-access-lkstm\") on node \"crc\" DevicePath \"\"" Mar 11 12:15:03 crc kubenswrapper[4816]: I0311 12:15:03.045892 4816 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a876e965-7c6d-4773-9c9b-f445411c559b-config-volume\") on node \"crc\" DevicePath \"\"" Mar 11 12:15:03 crc kubenswrapper[4816]: I0311 12:15:03.045919 4816 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a876e965-7c6d-4773-9c9b-f445411c559b-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 11 12:15:03 crc kubenswrapper[4816]: I0311 12:15:03.512161 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29553855-l4sqr" event={"ID":"a876e965-7c6d-4773-9c9b-f445411c559b","Type":"ContainerDied","Data":"b0d101521d949bdb643e3f587d74f6788adbc883ab03a4afcd19669250b4364c"} Mar 11 12:15:03 crc kubenswrapper[4816]: I0311 12:15:03.512234 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0d101521d949bdb643e3f587d74f6788adbc883ab03a4afcd19669250b4364c" Mar 11 12:15:03 crc kubenswrapper[4816]: I0311 12:15:03.512308 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553855-l4sqr" Mar 11 12:15:09 crc kubenswrapper[4816]: I0311 12:15:09.515285 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:15:09 crc kubenswrapper[4816]: I0311 12:15:09.515814 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:15:21 crc kubenswrapper[4816]: I0311 12:15:21.909489 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-rb228"] Mar 11 12:15:21 crc kubenswrapper[4816]: E0311 12:15:21.910333 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a876e965-7c6d-4773-9c9b-f445411c559b" containerName="collect-profiles" Mar 11 12:15:21 crc kubenswrapper[4816]: I0311 12:15:21.910347 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="a876e965-7c6d-4773-9c9b-f445411c559b" 
containerName="collect-profiles" Mar 11 12:15:21 crc kubenswrapper[4816]: I0311 12:15:21.910449 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="a876e965-7c6d-4773-9c9b-f445411c559b" containerName="collect-profiles" Mar 11 12:15:21 crc kubenswrapper[4816]: I0311 12:15:21.910841 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-rb228" Mar 11 12:15:21 crc kubenswrapper[4816]: I0311 12:15:21.913897 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-4cxnm" Mar 11 12:15:21 crc kubenswrapper[4816]: I0311 12:15:21.918281 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-g8cg2"] Mar 11 12:15:21 crc kubenswrapper[4816]: I0311 12:15:21.919061 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-g8cg2" Mar 11 12:15:21 crc kubenswrapper[4816]: I0311 12:15:21.921465 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-sj8rq" Mar 11 12:15:21 crc kubenswrapper[4816]: I0311 12:15:21.923038 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-rb228"] Mar 11 12:15:21 crc kubenswrapper[4816]: I0311 12:15:21.934749 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-fjkn4"] Mar 11 12:15:21 crc kubenswrapper[4816]: I0311 12:15:21.937691 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-fjkn4" Mar 11 12:15:21 crc kubenswrapper[4816]: I0311 12:15:21.942923 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-gg2r5" Mar 11 12:15:21 crc kubenswrapper[4816]: I0311 12:15:21.966538 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-fjkn4"] Mar 11 12:15:21 crc kubenswrapper[4816]: I0311 12:15:21.971920 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-g8cg2"] Mar 11 12:15:21 crc kubenswrapper[4816]: I0311 12:15:21.983706 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-px2wm"] Mar 11 12:15:21 crc kubenswrapper[4816]: I0311 12:15:21.989373 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-px2wm" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.009704 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-4k7cn" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.020675 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-px2wm"] Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.031059 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhngz\" (UniqueName: \"kubernetes.io/projected/a8133b64-eb11-43ad-bf6e-a278af0ff466-kube-api-access-xhngz\") pod \"barbican-operator-controller-manager-677bd678f7-rb228\" (UID: \"a8133b64-eb11-43ad-bf6e-a278af0ff466\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-rb228" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.031353 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xv98c\" (UniqueName: \"kubernetes.io/projected/6311ca5f-6f4c-4768-ae5e-75128be7f589-kube-api-access-xv98c\") pod \"cinder-operator-controller-manager-984cd4dcf-g8cg2\" (UID: \"6311ca5f-6f4c-4768-ae5e-75128be7f589\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-g8cg2" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.044325 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-66ctj"] Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.045359 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-66ctj" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.048772 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-xzvbx" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.052449 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-8v46x"] Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.053369 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-8v46x" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.054997 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-rh87f" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.057389 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-66ctj"] Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.067609 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-8v46x"] Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.082864 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-hzd9q"] Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.083639 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hzd9q"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.088040 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.089231 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-vxmbd"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.090962 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-hzd9q"]
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.113113 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-874hd"]
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.114043 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-874hd"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.118092 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-7q6t8"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.119105 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-zczdq"]
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.120012 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-zczdq"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.123674 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-2vcjw"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.123805 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-874hd"]
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.132819 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmxqs\" (UniqueName: \"kubernetes.io/projected/c28c6622-633e-4e76-9c9a-eb732531fa1a-kube-api-access-jmxqs\") pod \"glance-operator-controller-manager-5964f64c48-px2wm\" (UID: \"c28c6622-633e-4e76-9c9a-eb732531fa1a\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-px2wm"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.132864 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xv98c\" (UniqueName: \"kubernetes.io/projected/6311ca5f-6f4c-4768-ae5e-75128be7f589-kube-api-access-xv98c\") pod \"cinder-operator-controller-manager-984cd4dcf-g8cg2\" (UID: \"6311ca5f-6f4c-4768-ae5e-75128be7f589\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-g8cg2"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.132913 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhngz\" (UniqueName: \"kubernetes.io/projected/a8133b64-eb11-43ad-bf6e-a278af0ff466-kube-api-access-xhngz\") pod \"barbican-operator-controller-manager-677bd678f7-rb228\" (UID: \"a8133b64-eb11-43ad-bf6e-a278af0ff466\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-rb228"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.132976 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szv9g\" (UniqueName: \"kubernetes.io/projected/72237264-5d09-40bd-ba83-f30b76790cb6-kube-api-access-szv9g\") pod \"designate-operator-controller-manager-66d56f6ff4-fjkn4\" (UID: \"72237264-5d09-40bd-ba83-f30b76790cb6\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-fjkn4"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.161801 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhngz\" (UniqueName: \"kubernetes.io/projected/a8133b64-eb11-43ad-bf6e-a278af0ff466-kube-api-access-xhngz\") pod \"barbican-operator-controller-manager-677bd678f7-rb228\" (UID: \"a8133b64-eb11-43ad-bf6e-a278af0ff466\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-rb228"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.165923 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-wnsst"]
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.166792 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-zczdq"]
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.166823 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-bl9hm"]
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.167469 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-h2vmc"]
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.168068 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-bl9hm"]
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.168175 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-h2vmc"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.168740 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-wnsst"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.169072 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-bl9hm"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.171658 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-lng6s"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.171871 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-b4h4x"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.175331 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-wnsst"]
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.181459 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv98c\" (UniqueName: \"kubernetes.io/projected/6311ca5f-6f4c-4768-ae5e-75128be7f589-kube-api-access-xv98c\") pod \"cinder-operator-controller-manager-984cd4dcf-g8cg2\" (UID: \"6311ca5f-6f4c-4768-ae5e-75128be7f589\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-g8cg2"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.181947 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-vlwnm"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.217081 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-h2vmc"]
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.234124 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mskc2\" (UniqueName: \"kubernetes.io/projected/f37fb9b3-7b07-4188-b9ea-facfa5e945f0-kube-api-access-mskc2\") pod \"ironic-operator-controller-manager-6bbb499bbc-874hd\" (UID: \"f37fb9b3-7b07-4188-b9ea-facfa5e945f0\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-874hd"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.234206 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8ws7\" (UniqueName: \"kubernetes.io/projected/b941b0f1-4a8f-4517-af46-cc77892fe3d9-kube-api-access-v8ws7\") pod \"heat-operator-controller-manager-77b6666d85-66ctj\" (UID: \"b941b0f1-4a8f-4517-af46-cc77892fe3d9\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-66ctj"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.234276 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szv9g\" (UniqueName: \"kubernetes.io/projected/72237264-5d09-40bd-ba83-f30b76790cb6-kube-api-access-szv9g\") pod \"designate-operator-controller-manager-66d56f6ff4-fjkn4\" (UID: \"72237264-5d09-40bd-ba83-f30b76790cb6\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-fjkn4"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.234317 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmxqs\" (UniqueName: \"kubernetes.io/projected/c28c6622-633e-4e76-9c9a-eb732531fa1a-kube-api-access-jmxqs\") pod \"glance-operator-controller-manager-5964f64c48-px2wm\" (UID: \"c28c6622-633e-4e76-9c9a-eb732531fa1a\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-px2wm"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.234352 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkqph\" (UniqueName: \"kubernetes.io/projected/9e0c8832-9c20-44a9-933c-4a7fff032367-kube-api-access-vkqph\") pod \"horizon-operator-controller-manager-6d9d6b584d-8v46x\" (UID: \"9e0c8832-9c20-44a9-933c-4a7fff032367\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-8v46x"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.234387 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhgkz\" (UniqueName: \"kubernetes.io/projected/73e00d02-6599-4cab-a32b-8fe96b82951a-kube-api-access-nhgkz\") pod \"keystone-operator-controller-manager-684f77d66d-zczdq\" (UID: \"73e00d02-6599-4cab-a32b-8fe96b82951a\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-zczdq"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.234408 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khvb9\" (UniqueName: \"kubernetes.io/projected/a605e964-6e3c-4639-95d5-908f5d0ab7ef-kube-api-access-khvb9\") pod \"infra-operator-controller-manager-5995f4446f-hzd9q\" (UID: \"a605e964-6e3c-4639-95d5-908f5d0ab7ef\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hzd9q"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.234433 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a605e964-6e3c-4639-95d5-908f5d0ab7ef-cert\") pod \"infra-operator-controller-manager-5995f4446f-hzd9q\" (UID: \"a605e964-6e3c-4639-95d5-908f5d0ab7ef\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hzd9q"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.241807 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-56fsw"]
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.242638 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-56fsw"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.243603 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-rb228"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.248334 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-4bdp8"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.249095 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-g8cg2"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.257498 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-rxhkb"]
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.258431 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-rxhkb"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.260713 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-zwd24"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.277802 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szv9g\" (UniqueName: \"kubernetes.io/projected/72237264-5d09-40bd-ba83-f30b76790cb6-kube-api-access-szv9g\") pod \"designate-operator-controller-manager-66d56f6ff4-fjkn4\" (UID: \"72237264-5d09-40bd-ba83-f30b76790cb6\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-fjkn4"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.279337 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-56fsw"]
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.284140 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmxqs\" (UniqueName: \"kubernetes.io/projected/c28c6622-633e-4e76-9c9a-eb732531fa1a-kube-api-access-jmxqs\") pod \"glance-operator-controller-manager-5964f64c48-px2wm\" (UID: \"c28c6622-633e-4e76-9c9a-eb732531fa1a\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-px2wm"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.284493 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-fjkn4"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.286964 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-rxhkb"]
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.297182 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-rr62t"]
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.298104 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-rr62t"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.301445 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-mnct8"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.333622 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-h7kgb"]
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.334671 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-h7kgb"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.334822 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l"]
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.336483 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.338042 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hhjl\" (UniqueName: \"kubernetes.io/projected/5d318732-8194-49eb-a2a3-c5b13ce843a7-kube-api-access-5hhjl\") pod \"mariadb-operator-controller-manager-658d4cdd5-wnsst\" (UID: \"5d318732-8194-49eb-a2a3-c5b13ce843a7\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-wnsst"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.338068 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hczr6\" (UniqueName: \"kubernetes.io/projected/bcfe1f90-2b5f-43b7-b798-0bad62ec53b2-kube-api-access-hczr6\") pod \"manila-operator-controller-manager-68f45f9d9f-bl9hm\" (UID: \"bcfe1f90-2b5f-43b7-b798-0bad62ec53b2\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-bl9hm"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.338098 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8ws7\" (UniqueName: \"kubernetes.io/projected/b941b0f1-4a8f-4517-af46-cc77892fe3d9-kube-api-access-v8ws7\") pod \"heat-operator-controller-manager-77b6666d85-66ctj\" (UID: \"b941b0f1-4a8f-4517-af46-cc77892fe3d9\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-66ctj"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.338131 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b9sn\" (UniqueName: \"kubernetes.io/projected/b16cacfc-8fc3-444d-a2d7-6ffeaf8362d5-kube-api-access-9b9sn\") pod \"octavia-operator-controller-manager-5f4f55cb5c-56fsw\" (UID: \"b16cacfc-8fc3-444d-a2d7-6ffeaf8362d5\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-56fsw"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.338170 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkqph\" (UniqueName: \"kubernetes.io/projected/9e0c8832-9c20-44a9-933c-4a7fff032367-kube-api-access-vkqph\") pod \"horizon-operator-controller-manager-6d9d6b584d-8v46x\" (UID: \"9e0c8832-9c20-44a9-933c-4a7fff032367\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-8v46x"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.338201 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhgkz\" (UniqueName: \"kubernetes.io/projected/73e00d02-6599-4cab-a32b-8fe96b82951a-kube-api-access-nhgkz\") pod \"keystone-operator-controller-manager-684f77d66d-zczdq\" (UID: \"73e00d02-6599-4cab-a32b-8fe96b82951a\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-zczdq"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.338221 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khvb9\" (UniqueName: \"kubernetes.io/projected/a605e964-6e3c-4639-95d5-908f5d0ab7ef-kube-api-access-khvb9\") pod \"infra-operator-controller-manager-5995f4446f-hzd9q\" (UID: \"a605e964-6e3c-4639-95d5-908f5d0ab7ef\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hzd9q"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.338659 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a605e964-6e3c-4639-95d5-908f5d0ab7ef-cert\") pod \"infra-operator-controller-manager-5995f4446f-hzd9q\" (UID: \"a605e964-6e3c-4639-95d5-908f5d0ab7ef\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hzd9q"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.338792 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58dj6\" (UniqueName: \"kubernetes.io/projected/4d4c74ff-52a2-4426-bd06-daa6e9b1a832-kube-api-access-58dj6\") pod \"neutron-operator-controller-manager-776c5696bf-h2vmc\" (UID: \"4d4c74ff-52a2-4426-bd06-daa6e9b1a832\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-h2vmc"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.338860 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mskc2\" (UniqueName: \"kubernetes.io/projected/f37fb9b3-7b07-4188-b9ea-facfa5e945f0-kube-api-access-mskc2\") pod \"ironic-operator-controller-manager-6bbb499bbc-874hd\" (UID: \"f37fb9b3-7b07-4188-b9ea-facfa5e945f0\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-874hd"
Mar 11 12:15:22 crc kubenswrapper[4816]: E0311 12:15:22.338935 4816 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 11 12:15:22 crc kubenswrapper[4816]: E0311 12:15:22.339017 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a605e964-6e3c-4639-95d5-908f5d0ab7ef-cert podName:a605e964-6e3c-4639-95d5-908f5d0ab7ef nodeName:}" failed. No retries permitted until 2026-03-11 12:15:22.838994391 +0000 UTC m=+1009.430258358 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a605e964-6e3c-4639-95d5-908f5d0ab7ef-cert") pod "infra-operator-controller-manager-5995f4446f-hzd9q" (UID: "a605e964-6e3c-4639-95d5-908f5d0ab7ef") : secret "infra-operator-webhook-server-cert" not found
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.343760 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.344019 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-fps72"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.344325 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-8r4xj"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.353837 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-px2wm"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.360085 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mskc2\" (UniqueName: \"kubernetes.io/projected/f37fb9b3-7b07-4188-b9ea-facfa5e945f0-kube-api-access-mskc2\") pod \"ironic-operator-controller-manager-6bbb499bbc-874hd\" (UID: \"f37fb9b3-7b07-4188-b9ea-facfa5e945f0\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-874hd"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.362695 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkqph\" (UniqueName: \"kubernetes.io/projected/9e0c8832-9c20-44a9-933c-4a7fff032367-kube-api-access-vkqph\") pod \"horizon-operator-controller-manager-6d9d6b584d-8v46x\" (UID: \"9e0c8832-9c20-44a9-933c-4a7fff032367\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-8v46x"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.365077 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8ws7\" (UniqueName: \"kubernetes.io/projected/b941b0f1-4a8f-4517-af46-cc77892fe3d9-kube-api-access-v8ws7\") pod \"heat-operator-controller-manager-77b6666d85-66ctj\" (UID: \"b941b0f1-4a8f-4517-af46-cc77892fe3d9\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-66ctj"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.365179 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-rr62t"]
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.368663 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhgkz\" (UniqueName: \"kubernetes.io/projected/73e00d02-6599-4cab-a32b-8fe96b82951a-kube-api-access-nhgkz\") pod \"keystone-operator-controller-manager-684f77d66d-zczdq\" (UID: \"73e00d02-6599-4cab-a32b-8fe96b82951a\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-zczdq"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.368723 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-66ctj"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.375848 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khvb9\" (UniqueName: \"kubernetes.io/projected/a605e964-6e3c-4639-95d5-908f5d0ab7ef-kube-api-access-khvb9\") pod \"infra-operator-controller-manager-5995f4446f-hzd9q\" (UID: \"a605e964-6e3c-4639-95d5-908f5d0ab7ef\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hzd9q"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.401316 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-8v46x"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.406985 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-h7kgb"]
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.432664 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l"]
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.450744 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-874hd"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.453991 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gztj\" (UniqueName: \"kubernetes.io/projected/d1702062-37ba-43c0-becb-005e11f457a0-kube-api-access-4gztj\") pod \"nova-operator-controller-manager-569cc54c5-rxhkb\" (UID: \"d1702062-37ba-43c0-becb-005e11f457a0\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-rxhkb"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.454027 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58dj6\" (UniqueName: \"kubernetes.io/projected/4d4c74ff-52a2-4426-bd06-daa6e9b1a832-kube-api-access-58dj6\") pod \"neutron-operator-controller-manager-776c5696bf-h2vmc\" (UID: \"4d4c74ff-52a2-4426-bd06-daa6e9b1a832\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-h2vmc"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.454061 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hhjl\" (UniqueName: \"kubernetes.io/projected/5d318732-8194-49eb-a2a3-c5b13ce843a7-kube-api-access-5hhjl\") pod \"mariadb-operator-controller-manager-658d4cdd5-wnsst\" (UID: \"5d318732-8194-49eb-a2a3-c5b13ce843a7\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-wnsst"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.454078 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hczr6\" (UniqueName: \"kubernetes.io/projected/bcfe1f90-2b5f-43b7-b798-0bad62ec53b2-kube-api-access-hczr6\") pod \"manila-operator-controller-manager-68f45f9d9f-bl9hm\" (UID: \"bcfe1f90-2b5f-43b7-b798-0bad62ec53b2\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-bl9hm"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.454111 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9b9sn\" (UniqueName: \"kubernetes.io/projected/b16cacfc-8fc3-444d-a2d7-6ffeaf8362d5-kube-api-access-9b9sn\") pod \"octavia-operator-controller-manager-5f4f55cb5c-56fsw\" (UID: \"b16cacfc-8fc3-444d-a2d7-6ffeaf8362d5\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-56fsw"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.454141 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h9rq\" (UniqueName: \"kubernetes.io/projected/e04ad395-8120-4c57-8575-611fa438e8fb-kube-api-access-9h9rq\") pod \"placement-operator-controller-manager-574d45c66c-h7kgb\" (UID: \"e04ad395-8120-4c57-8575-611fa438e8fb\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-h7kgb"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.454169 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnvxf\" (UniqueName: \"kubernetes.io/projected/78a7aebd-70a2-4608-a669-aea496cb6186-kube-api-access-pnvxf\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l\" (UID: \"78a7aebd-70a2-4608-a669-aea496cb6186\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.454189 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78a7aebd-70a2-4608-a669-aea496cb6186-cert\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l\" (UID: \"78a7aebd-70a2-4608-a669-aea496cb6186\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.454219 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s9x8\" (UniqueName: \"kubernetes.io/projected/6bbceab2-fe2b-4693-867d-aa2a51261611-kube-api-access-2s9x8\") pod \"ovn-operator-controller-manager-bbc5b68f9-rr62t\" (UID: \"6bbceab2-fe2b-4693-867d-aa2a51261611\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-rr62t"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.458439 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-426qz"]
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.459323 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-677c674df7-426qz"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.461636 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-zczdq"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.474766 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-cm6q7"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.489017 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58dj6\" (UniqueName: \"kubernetes.io/projected/4d4c74ff-52a2-4426-bd06-daa6e9b1a832-kube-api-access-58dj6\") pod \"neutron-operator-controller-manager-776c5696bf-h2vmc\" (UID: \"4d4c74ff-52a2-4426-bd06-daa6e9b1a832\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-h2vmc"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.514745 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9b9sn\" (UniqueName: \"kubernetes.io/projected/b16cacfc-8fc3-444d-a2d7-6ffeaf8362d5-kube-api-access-9b9sn\") pod \"octavia-operator-controller-manager-5f4f55cb5c-56fsw\" (UID: \"b16cacfc-8fc3-444d-a2d7-6ffeaf8362d5\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-56fsw"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.515020 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hhjl\" (UniqueName: \"kubernetes.io/projected/5d318732-8194-49eb-a2a3-c5b13ce843a7-kube-api-access-5hhjl\") pod \"mariadb-operator-controller-manager-658d4cdd5-wnsst\" (UID: \"5d318732-8194-49eb-a2a3-c5b13ce843a7\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-wnsst"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.516511 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-426qz"]
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.519046 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hczr6\" (UniqueName: \"kubernetes.io/projected/bcfe1f90-2b5f-43b7-b798-0bad62ec53b2-kube-api-access-hczr6\") pod \"manila-operator-controller-manager-68f45f9d9f-bl9hm\" (UID: \"bcfe1f90-2b5f-43b7-b798-0bad62ec53b2\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-bl9hm"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.533678 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-h2vmc"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.552632 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-7ldx8"]
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.553857 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-7ldx8"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.555277 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9h9rq\" (UniqueName: \"kubernetes.io/projected/e04ad395-8120-4c57-8575-611fa438e8fb-kube-api-access-9h9rq\") pod \"placement-operator-controller-manager-574d45c66c-h7kgb\" (UID: \"e04ad395-8120-4c57-8575-611fa438e8fb\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-h7kgb"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.555335 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnvxf\" (UniqueName: \"kubernetes.io/projected/78a7aebd-70a2-4608-a669-aea496cb6186-kube-api-access-pnvxf\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l\" (UID: \"78a7aebd-70a2-4608-a669-aea496cb6186\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.555364 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78a7aebd-70a2-4608-a669-aea496cb6186-cert\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l\" (UID: \"78a7aebd-70a2-4608-a669-aea496cb6186\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.555401 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2s9x8\" (UniqueName: \"kubernetes.io/projected/6bbceab2-fe2b-4693-867d-aa2a51261611-kube-api-access-2s9x8\") pod \"ovn-operator-controller-manager-bbc5b68f9-rr62t\" (UID: \"6bbceab2-fe2b-4693-867d-aa2a51261611\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-rr62t"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.555446 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gztj\" (UniqueName: \"kubernetes.io/projected/d1702062-37ba-43c0-becb-005e11f457a0-kube-api-access-4gztj\") pod \"nova-operator-controller-manager-569cc54c5-rxhkb\" (UID: \"d1702062-37ba-43c0-becb-005e11f457a0\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-rxhkb"
Mar 11 12:15:22 crc kubenswrapper[4816]: E0311 12:15:22.556215 4816 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 11 12:15:22 crc kubenswrapper[4816]: E0311 12:15:22.556278 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78a7aebd-70a2-4608-a669-aea496cb6186-cert podName:78a7aebd-70a2-4608-a669-aea496cb6186 nodeName:}" failed. No retries permitted until 2026-03-11 12:15:23.056260493 +0000 UTC m=+1009.647524460 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/78a7aebd-70a2-4608-a669-aea496cb6186-cert") pod "openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l" (UID: "78a7aebd-70a2-4608-a669-aea496cb6186") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.559472 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-7f9j2"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.584544 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h9rq\" (UniqueName: \"kubernetes.io/projected/e04ad395-8120-4c57-8575-611fa438e8fb-kube-api-access-9h9rq\") pod \"placement-operator-controller-manager-574d45c66c-h7kgb\" (UID: \"e04ad395-8120-4c57-8575-611fa438e8fb\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-h7kgb"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.590702 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnvxf\" (UniqueName: \"kubernetes.io/projected/78a7aebd-70a2-4608-a669-aea496cb6186-kube-api-access-pnvxf\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l\" (UID: \"78a7aebd-70a2-4608-a669-aea496cb6186\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.592848 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gztj\" (UniqueName: \"kubernetes.io/projected/d1702062-37ba-43c0-becb-005e11f457a0-kube-api-access-4gztj\") pod \"nova-operator-controller-manager-569cc54c5-rxhkb\" (UID: \"d1702062-37ba-43c0-becb-005e11f457a0\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-rxhkb"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.593574 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s9x8\" (UniqueName: \"kubernetes.io/projected/6bbceab2-fe2b-4693-867d-aa2a51261611-kube-api-access-2s9x8\") pod \"ovn-operator-controller-manager-bbc5b68f9-rr62t\" (UID: \"6bbceab2-fe2b-4693-867d-aa2a51261611\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-rr62t"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.599118 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-7ldx8"]
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.654055 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-k2rnj"]
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.664494 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-k2rnj"
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.667425 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-wnsst" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.663023 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m9n7\" (UniqueName: \"kubernetes.io/projected/d7932403-615f-44e4-b195-4a83c19787ba-kube-api-access-4m9n7\") pod \"swift-operator-controller-manager-677c674df7-426qz\" (UID: \"d7932403-615f-44e4-b195-4a83c19787ba\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-426qz" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.667934 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxcrf\" (UniqueName: \"kubernetes.io/projected/0ddf91ff-6d91-4213-8032-05f80408063d-kube-api-access-mxcrf\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-7ldx8\" (UID: \"0ddf91ff-6d91-4213-8032-05f80408063d\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-7ldx8" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.668556 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-k2rnj"] Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.669734 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-zmq6m" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.675601 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-bl9hm" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.691155 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-56fsw" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.710327 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-rxhkb" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.718568 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-kx9nz"] Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.719735 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-kx9nz" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.722953 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-vx4jk" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.735038 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-rr62t" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.739543 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-kx9nz"] Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.761578 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-h7kgb" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.769074 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2b8h\" (UniqueName: \"kubernetes.io/projected/4126be7d-7ca8-4e68-94d4-ea21644fbd85-kube-api-access-q2b8h\") pod \"watcher-operator-controller-manager-6dd88c6f67-kx9nz\" (UID: \"4126be7d-7ca8-4e68-94d4-ea21644fbd85\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-kx9nz" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.769183 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4m9n7\" (UniqueName: \"kubernetes.io/projected/d7932403-615f-44e4-b195-4a83c19787ba-kube-api-access-4m9n7\") pod \"swift-operator-controller-manager-677c674df7-426qz\" (UID: \"d7932403-615f-44e4-b195-4a83c19787ba\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-426qz" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.769222 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxcrf\" (UniqueName: \"kubernetes.io/projected/0ddf91ff-6d91-4213-8032-05f80408063d-kube-api-access-mxcrf\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-7ldx8\" (UID: \"0ddf91ff-6d91-4213-8032-05f80408063d\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-7ldx8" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.769326 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgnwd\" (UniqueName: \"kubernetes.io/projected/282f8f05-9a84-4bb4-a122-ba8806324ca3-kube-api-access-sgnwd\") pod \"test-operator-controller-manager-5c5cb9c4d7-k2rnj\" (UID: \"282f8f05-9a84-4bb4-a122-ba8806324ca3\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-k2rnj" Mar 11 12:15:22 
crc kubenswrapper[4816]: I0311 12:15:22.784977 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6"] Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.785853 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.789215 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxcrf\" (UniqueName: \"kubernetes.io/projected/0ddf91ff-6d91-4213-8032-05f80408063d-kube-api-access-mxcrf\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-7ldx8\" (UID: \"0ddf91ff-6d91-4213-8032-05f80408063d\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-7ldx8" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.789540 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-vwbwj" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.789766 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.790015 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.801278 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6"] Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.809895 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m9n7\" (UniqueName: \"kubernetes.io/projected/d7932403-615f-44e4-b195-4a83c19787ba-kube-api-access-4m9n7\") pod \"swift-operator-controller-manager-677c674df7-426qz\" (UID: \"d7932403-615f-44e4-b195-4a83c19787ba\") " 
pod="openstack-operators/swift-operator-controller-manager-677c674df7-426qz" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.819026 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dnqpf"] Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.819985 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dnqpf" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.823823 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dnqpf"] Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.824493 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-l88nf" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.824590 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-7ldx8" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.845241 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-677c674df7-426qz" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.872621 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-webhook-certs\") pod \"openstack-operator-controller-manager-7795b46f77-pt8n6\" (UID: \"5f4b0b09-5704-432a-9cd4-82a296f3c467\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.872682 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-metrics-certs\") pod \"openstack-operator-controller-manager-7795b46f77-pt8n6\" (UID: \"5f4b0b09-5704-432a-9cd4-82a296f3c467\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.872746 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kcq2\" (UniqueName: \"kubernetes.io/projected/5f4b0b09-5704-432a-9cd4-82a296f3c467-kube-api-access-4kcq2\") pod \"openstack-operator-controller-manager-7795b46f77-pt8n6\" (UID: \"5f4b0b09-5704-432a-9cd4-82a296f3c467\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.879058 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgnwd\" (UniqueName: \"kubernetes.io/projected/282f8f05-9a84-4bb4-a122-ba8806324ca3-kube-api-access-sgnwd\") pod \"test-operator-controller-manager-5c5cb9c4d7-k2rnj\" (UID: \"282f8f05-9a84-4bb4-a122-ba8806324ca3\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-k2rnj" Mar 11 12:15:22 crc 
kubenswrapper[4816]: I0311 12:15:22.879295 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j895\" (UniqueName: \"kubernetes.io/projected/8e810ef6-d3f5-4133-bce2-234df32b3d10-kube-api-access-8j895\") pod \"rabbitmq-cluster-operator-manager-668c99d594-dnqpf\" (UID: \"8e810ef6-d3f5-4133-bce2-234df32b3d10\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dnqpf" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.879410 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2b8h\" (UniqueName: \"kubernetes.io/projected/4126be7d-7ca8-4e68-94d4-ea21644fbd85-kube-api-access-q2b8h\") pod \"watcher-operator-controller-manager-6dd88c6f67-kx9nz\" (UID: \"4126be7d-7ca8-4e68-94d4-ea21644fbd85\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-kx9nz" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.879491 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a605e964-6e3c-4639-95d5-908f5d0ab7ef-cert\") pod \"infra-operator-controller-manager-5995f4446f-hzd9q\" (UID: \"a605e964-6e3c-4639-95d5-908f5d0ab7ef\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hzd9q" Mar 11 12:15:22 crc kubenswrapper[4816]: E0311 12:15:22.879814 4816 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 11 12:15:22 crc kubenswrapper[4816]: E0311 12:15:22.879920 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a605e964-6e3c-4639-95d5-908f5d0ab7ef-cert podName:a605e964-6e3c-4639-95d5-908f5d0ab7ef nodeName:}" failed. No retries permitted until 2026-03-11 12:15:23.879900864 +0000 UTC m=+1010.471164831 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a605e964-6e3c-4639-95d5-908f5d0ab7ef-cert") pod "infra-operator-controller-manager-5995f4446f-hzd9q" (UID: "a605e964-6e3c-4639-95d5-908f5d0ab7ef") : secret "infra-operator-webhook-server-cert" not found Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.922130 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2b8h\" (UniqueName: \"kubernetes.io/projected/4126be7d-7ca8-4e68-94d4-ea21644fbd85-kube-api-access-q2b8h\") pod \"watcher-operator-controller-manager-6dd88c6f67-kx9nz\" (UID: \"4126be7d-7ca8-4e68-94d4-ea21644fbd85\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-kx9nz" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.923865 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgnwd\" (UniqueName: \"kubernetes.io/projected/282f8f05-9a84-4bb4-a122-ba8806324ca3-kube-api-access-sgnwd\") pod \"test-operator-controller-manager-5c5cb9c4d7-k2rnj\" (UID: \"282f8f05-9a84-4bb4-a122-ba8806324ca3\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-k2rnj" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.983073 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8j895\" (UniqueName: \"kubernetes.io/projected/8e810ef6-d3f5-4133-bce2-234df32b3d10-kube-api-access-8j895\") pod \"rabbitmq-cluster-operator-manager-668c99d594-dnqpf\" (UID: \"8e810ef6-d3f5-4133-bce2-234df32b3d10\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dnqpf" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.983206 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-webhook-certs\") pod \"openstack-operator-controller-manager-7795b46f77-pt8n6\" (UID: 
\"5f4b0b09-5704-432a-9cd4-82a296f3c467\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.983237 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-metrics-certs\") pod \"openstack-operator-controller-manager-7795b46f77-pt8n6\" (UID: \"5f4b0b09-5704-432a-9cd4-82a296f3c467\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.983316 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kcq2\" (UniqueName: \"kubernetes.io/projected/5f4b0b09-5704-432a-9cd4-82a296f3c467-kube-api-access-4kcq2\") pod \"openstack-operator-controller-manager-7795b46f77-pt8n6\" (UID: \"5f4b0b09-5704-432a-9cd4-82a296f3c467\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" Mar 11 12:15:22 crc kubenswrapper[4816]: E0311 12:15:22.983461 4816 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 11 12:15:22 crc kubenswrapper[4816]: E0311 12:15:22.983546 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-webhook-certs podName:5f4b0b09-5704-432a-9cd4-82a296f3c467 nodeName:}" failed. No retries permitted until 2026-03-11 12:15:23.483522024 +0000 UTC m=+1010.074785991 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-webhook-certs") pod "openstack-operator-controller-manager-7795b46f77-pt8n6" (UID: "5f4b0b09-5704-432a-9cd4-82a296f3c467") : secret "webhook-server-cert" not found Mar 11 12:15:22 crc kubenswrapper[4816]: E0311 12:15:22.983725 4816 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 11 12:15:22 crc kubenswrapper[4816]: E0311 12:15:22.983789 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-metrics-certs podName:5f4b0b09-5704-432a-9cd4-82a296f3c467 nodeName:}" failed. No retries permitted until 2026-03-11 12:15:23.483769111 +0000 UTC m=+1010.075033178 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-metrics-certs") pod "openstack-operator-controller-manager-7795b46f77-pt8n6" (UID: "5f4b0b09-5704-432a-9cd4-82a296f3c467") : secret "metrics-server-cert" not found Mar 11 12:15:23 crc kubenswrapper[4816]: I0311 12:15:23.008550 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j895\" (UniqueName: \"kubernetes.io/projected/8e810ef6-d3f5-4133-bce2-234df32b3d10-kube-api-access-8j895\") pod \"rabbitmq-cluster-operator-manager-668c99d594-dnqpf\" (UID: \"8e810ef6-d3f5-4133-bce2-234df32b3d10\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dnqpf" Mar 11 12:15:23 crc kubenswrapper[4816]: I0311 12:15:23.034160 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kcq2\" (UniqueName: \"kubernetes.io/projected/5f4b0b09-5704-432a-9cd4-82a296f3c467-kube-api-access-4kcq2\") pod \"openstack-operator-controller-manager-7795b46f77-pt8n6\" (UID: \"5f4b0b09-5704-432a-9cd4-82a296f3c467\") " 
pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" Mar 11 12:15:23 crc kubenswrapper[4816]: I0311 12:15:23.059734 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dnqpf" Mar 11 12:15:23 crc kubenswrapper[4816]: I0311 12:15:23.087424 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78a7aebd-70a2-4608-a669-aea496cb6186-cert\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l\" (UID: \"78a7aebd-70a2-4608-a669-aea496cb6186\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l" Mar 11 12:15:23 crc kubenswrapper[4816]: E0311 12:15:23.087581 4816 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 12:15:23 crc kubenswrapper[4816]: E0311 12:15:23.087648 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78a7aebd-70a2-4608-a669-aea496cb6186-cert podName:78a7aebd-70a2-4608-a669-aea496cb6186 nodeName:}" failed. No retries permitted until 2026-03-11 12:15:24.087627898 +0000 UTC m=+1010.678891865 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/78a7aebd-70a2-4608-a669-aea496cb6186-cert") pod "openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l" (UID: "78a7aebd-70a2-4608-a669-aea496cb6186") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 12:15:23 crc kubenswrapper[4816]: I0311 12:15:23.153462 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-k2rnj" Mar 11 12:15:23 crc kubenswrapper[4816]: I0311 12:15:23.209902 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-kx9nz" Mar 11 12:15:23 crc kubenswrapper[4816]: I0311 12:15:23.261568 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-g8cg2"] Mar 11 12:15:23 crc kubenswrapper[4816]: I0311 12:15:23.497392 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-webhook-certs\") pod \"openstack-operator-controller-manager-7795b46f77-pt8n6\" (UID: \"5f4b0b09-5704-432a-9cd4-82a296f3c467\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" Mar 11 12:15:23 crc kubenswrapper[4816]: I0311 12:15:23.497437 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-metrics-certs\") pod \"openstack-operator-controller-manager-7795b46f77-pt8n6\" (UID: \"5f4b0b09-5704-432a-9cd4-82a296f3c467\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" Mar 11 12:15:23 crc kubenswrapper[4816]: E0311 12:15:23.497599 4816 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 11 12:15:23 crc kubenswrapper[4816]: E0311 12:15:23.497644 4816 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 11 12:15:23 crc kubenswrapper[4816]: E0311 12:15:23.497703 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-webhook-certs podName:5f4b0b09-5704-432a-9cd4-82a296f3c467 nodeName:}" failed. No retries permitted until 2026-03-11 12:15:24.497679821 +0000 UTC m=+1011.088943878 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-webhook-certs") pod "openstack-operator-controller-manager-7795b46f77-pt8n6" (UID: "5f4b0b09-5704-432a-9cd4-82a296f3c467") : secret "webhook-server-cert" not found Mar 11 12:15:23 crc kubenswrapper[4816]: E0311 12:15:23.497721 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-metrics-certs podName:5f4b0b09-5704-432a-9cd4-82a296f3c467 nodeName:}" failed. No retries permitted until 2026-03-11 12:15:24.497715572 +0000 UTC m=+1011.088979539 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-metrics-certs") pod "openstack-operator-controller-manager-7795b46f77-pt8n6" (UID: "5f4b0b09-5704-432a-9cd4-82a296f3c467") : secret "metrics-server-cert" not found Mar 11 12:15:23 crc kubenswrapper[4816]: I0311 12:15:23.676013 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-g8cg2" event={"ID":"6311ca5f-6f4c-4768-ae5e-75128be7f589","Type":"ContainerStarted","Data":"42f76418da9ac1dd9f3d67f38f034aad5fbc9040ac8a8efb960cc9e438b73c1b"} Mar 11 12:15:23 crc kubenswrapper[4816]: I0311 12:15:23.702567 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-px2wm"] Mar 11 12:15:23 crc kubenswrapper[4816]: I0311 12:15:23.742881 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-rb228"] Mar 11 12:15:23 crc kubenswrapper[4816]: W0311 12:15:23.747224 4816 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73e00d02_6599_4cab_a32b_8fe96b82951a.slice/crio-816296e43888e78972adac3a0612fc0bc7ecda9b63b064b7f2079b767a8333a6 WatchSource:0}: Error finding container 816296e43888e78972adac3a0612fc0bc7ecda9b63b064b7f2079b767a8333a6: Status 404 returned error can't find the container with id 816296e43888e78972adac3a0612fc0bc7ecda9b63b064b7f2079b767a8333a6 Mar 11 12:15:23 crc kubenswrapper[4816]: I0311 12:15:23.751547 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-66ctj"] Mar 11 12:15:23 crc kubenswrapper[4816]: W0311 12:15:23.754263 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8133b64_eb11_43ad_bf6e_a278af0ff466.slice/crio-7c0edc922c0f570ddf7e4782d583c20d7a06f05d79e9d1d10972db14503d209a WatchSource:0}: Error finding container 7c0edc922c0f570ddf7e4782d583c20d7a06f05d79e9d1d10972db14503d209a: Status 404 returned error can't find the container with id 7c0edc922c0f570ddf7e4782d583c20d7a06f05d79e9d1d10972db14503d209a Mar 11 12:15:23 crc kubenswrapper[4816]: I0311 12:15:23.765057 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-zczdq"] Mar 11 12:15:23 crc kubenswrapper[4816]: W0311 12:15:23.768832 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e0c8832_9c20_44a9_933c_4a7fff032367.slice/crio-ee1374418873ee1ef0528d7e6e082f837d5a907a0a7537ec81d120ffb7351a25 WatchSource:0}: Error finding container ee1374418873ee1ef0528d7e6e082f837d5a907a0a7537ec81d120ffb7351a25: Status 404 returned error can't find the container with id ee1374418873ee1ef0528d7e6e082f837d5a907a0a7537ec81d120ffb7351a25 Mar 11 12:15:23 crc kubenswrapper[4816]: I0311 12:15:23.781711 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-8v46x"] Mar 11 12:15:23 crc kubenswrapper[4816]: I0311 12:15:23.788564 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-fjkn4"] Mar 11 12:15:23 crc kubenswrapper[4816]: I0311 12:15:23.908083 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a605e964-6e3c-4639-95d5-908f5d0ab7ef-cert\") pod \"infra-operator-controller-manager-5995f4446f-hzd9q\" (UID: \"a605e964-6e3c-4639-95d5-908f5d0ab7ef\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hzd9q" Mar 11 12:15:23 crc kubenswrapper[4816]: E0311 12:15:23.908306 4816 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 11 12:15:23 crc kubenswrapper[4816]: E0311 12:15:23.908371 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a605e964-6e3c-4639-95d5-908f5d0ab7ef-cert podName:a605e964-6e3c-4639-95d5-908f5d0ab7ef nodeName:}" failed. No retries permitted until 2026-03-11 12:15:25.908342691 +0000 UTC m=+1012.499606658 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a605e964-6e3c-4639-95d5-908f5d0ab7ef-cert") pod "infra-operator-controller-manager-5995f4446f-hzd9q" (UID: "a605e964-6e3c-4639-95d5-908f5d0ab7ef") : secret "infra-operator-webhook-server-cert" not found Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.007634 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-h2vmc"] Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.102583 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-h7kgb"] Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.111890 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78a7aebd-70a2-4608-a669-aea496cb6186-cert\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l\" (UID: \"78a7aebd-70a2-4608-a669-aea496cb6186\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l" Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.112088 4816 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.112151 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78a7aebd-70a2-4608-a669-aea496cb6186-cert podName:78a7aebd-70a2-4608-a669-aea496cb6186 nodeName:}" failed. No retries permitted until 2026-03-11 12:15:26.112137461 +0000 UTC m=+1012.703401428 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/78a7aebd-70a2-4608-a669-aea496cb6186-cert") pod "openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l" (UID: "78a7aebd-70a2-4608-a669-aea496cb6186") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.128712 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-874hd"] Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.149983 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sgnwd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-k2rnj_openstack-operators(282f8f05-9a84-4bb4-a122-ba8806324ca3): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.151080 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-k2rnj" podUID="282f8f05-9a84-4bb4-a122-ba8806324ca3" Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.155017 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dnqpf"] Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.162197 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:b99cd5e08bd85c6aaf717519187ba7bfeea359e1537d43b73a7364b7c38116e2,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5hhjl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-658d4cdd5-wnsst_openstack-operators(5d318732-8194-49eb-a2a3-c5b13ce843a7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.164164 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-wnsst" podUID="5d318732-8194-49eb-a2a3-c5b13ce843a7" Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.167592 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-k2rnj"] Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.184396 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2s9x8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-bbc5b68f9-rr62t_openstack-operators(6bbceab2-fe2b-4693-867d-aa2a51261611): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.185533 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-rr62t" podUID="6bbceab2-fe2b-4693-867d-aa2a51261611" Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.188528 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:c223309f51714785bd878ad04080f7428567edad793be4f992d492abd77af44c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4m9n7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-677c674df7-426qz_openstack-operators(d7932403-615f-44e4-b195-4a83c19787ba): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.191072 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-677c674df7-426qz" podUID="d7932403-615f-44e4-b195-4a83c19787ba" Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.194355 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-bl9hm"] Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.204664 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-56fsw"] Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.217201 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:27c84b712abc2df6108e22636075eec25fea0229800f38594a492fd41b02c49d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mxcrf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-6cd66dbd4b-7ldx8_openstack-operators(0ddf91ff-6d91-4213-8032-05f80408063d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.219496 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-7ldx8" podUID="0ddf91ff-6d91-4213-8032-05f80408063d" Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.225938 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-wnsst"] Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.233119 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:2bd37bdd917e3abe72613a734ce5021330242ec8cae9b8da76c57a0765152922,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4gztj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-569cc54c5-rxhkb_openstack-operators(d1702062-37ba-43c0-becb-005e11f457a0): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.234444 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-rxhkb" podUID="d1702062-37ba-43c0-becb-005e11f457a0" Mar 11 12:15:24 crc kubenswrapper[4816]: W0311 12:15:24.247062 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4126be7d_7ca8_4e68_94d4_ea21644fbd85.slice/crio-103e18b3f96a7836490015b2ceb6df583489c99c24580345d3b7e471ea1806f6 WatchSource:0}: Error finding container 103e18b3f96a7836490015b2ceb6df583489c99c24580345d3b7e471ea1806f6: Status 404 returned error can't find the container with id 103e18b3f96a7836490015b2ceb6df583489c99c24580345d3b7e471ea1806f6 Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.248706 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q2b8h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6dd88c6f67-kx9nz_openstack-operators(4126be7d-7ca8-4e68-94d4-ea21644fbd85): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.249945 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-kx9nz" podUID="4126be7d-7ca8-4e68-94d4-ea21644fbd85" Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.255765 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-426qz"] Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.262294 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-rr62t"] Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.284217 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-rxhkb"] Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.290734 4816 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-7ldx8"] Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.303868 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-kx9nz"] Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.516678 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-webhook-certs\") pod \"openstack-operator-controller-manager-7795b46f77-pt8n6\" (UID: \"5f4b0b09-5704-432a-9cd4-82a296f3c467\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.516716 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-metrics-certs\") pod \"openstack-operator-controller-manager-7795b46f77-pt8n6\" (UID: \"5f4b0b09-5704-432a-9cd4-82a296f3c467\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.516907 4816 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.516948 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-metrics-certs podName:5f4b0b09-5704-432a-9cd4-82a296f3c467 nodeName:}" failed. No retries permitted until 2026-03-11 12:15:26.516935761 +0000 UTC m=+1013.108199728 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-metrics-certs") pod "openstack-operator-controller-manager-7795b46f77-pt8n6" (UID: "5f4b0b09-5704-432a-9cd4-82a296f3c467") : secret "metrics-server-cert" not found Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.517273 4816 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.517299 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-webhook-certs podName:5f4b0b09-5704-432a-9cd4-82a296f3c467 nodeName:}" failed. No retries permitted until 2026-03-11 12:15:26.517291941 +0000 UTC m=+1013.108555908 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-webhook-certs") pod "openstack-operator-controller-manager-7795b46f77-pt8n6" (UID: "5f4b0b09-5704-432a-9cd4-82a296f3c467") : secret "webhook-server-cert" not found Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.693728 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-kx9nz" event={"ID":"4126be7d-7ca8-4e68-94d4-ea21644fbd85","Type":"ContainerStarted","Data":"103e18b3f96a7836490015b2ceb6df583489c99c24580345d3b7e471ea1806f6"} Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.695664 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-kx9nz" podUID="4126be7d-7ca8-4e68-94d4-ea21644fbd85" Mar 11 12:15:24 crc 
kubenswrapper[4816]: I0311 12:15:24.700403 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-h7kgb" event={"ID":"e04ad395-8120-4c57-8575-611fa438e8fb","Type":"ContainerStarted","Data":"715e3d19002d44a4329be61357bbba9352a154017d1233e9cda5ae9c4fdd9256"} Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.702166 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dnqpf" event={"ID":"8e810ef6-d3f5-4133-bce2-234df32b3d10","Type":"ContainerStarted","Data":"f8c717263b0e9d2e0f927e56f7842d64fa39521a9a93cac24913b3dc0e2bbb4a"} Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.704725 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-rb228" event={"ID":"a8133b64-eb11-43ad-bf6e-a278af0ff466","Type":"ContainerStarted","Data":"7c0edc922c0f570ddf7e4782d583c20d7a06f05d79e9d1d10972db14503d209a"} Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.707084 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-66ctj" event={"ID":"b941b0f1-4a8f-4517-af46-cc77892fe3d9","Type":"ContainerStarted","Data":"275604cb58d94400d609852fdcfcbe587b6108308bb7f6979d39e1543e6a9201"} Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.712913 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-rxhkb" event={"ID":"d1702062-37ba-43c0-becb-005e11f457a0","Type":"ContainerStarted","Data":"a1c332a8f9f06d000e09778e625d18d473748e76952ca22651d101f601022583"} Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.715845 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/nova-operator@sha256:2bd37bdd917e3abe72613a734ce5021330242ec8cae9b8da76c57a0765152922\\\"\"" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-rxhkb" podUID="d1702062-37ba-43c0-becb-005e11f457a0" Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.721658 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-rr62t" event={"ID":"6bbceab2-fe2b-4693-867d-aa2a51261611","Type":"ContainerStarted","Data":"303c0f9283e81aeac2e6d6ab0e4ac3deefc1337a027f284b6b5e1e71c0f2e3d0"} Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.724171 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-rr62t" podUID="6bbceab2-fe2b-4693-867d-aa2a51261611" Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.727572 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-bl9hm" event={"ID":"bcfe1f90-2b5f-43b7-b798-0bad62ec53b2","Type":"ContainerStarted","Data":"3eea47678ede611c1001e6a7147d6e4c48fe5edb7f89235bfe1984abac1fb4e8"} Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.739985 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-8v46x" event={"ID":"9e0c8832-9c20-44a9-933c-4a7fff032367","Type":"ContainerStarted","Data":"ee1374418873ee1ef0528d7e6e082f837d5a907a0a7537ec81d120ffb7351a25"} Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.743390 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-zczdq" 
event={"ID":"73e00d02-6599-4cab-a32b-8fe96b82951a","Type":"ContainerStarted","Data":"816296e43888e78972adac3a0612fc0bc7ecda9b63b064b7f2079b767a8333a6"} Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.747483 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-677c674df7-426qz" event={"ID":"d7932403-615f-44e4-b195-4a83c19787ba","Type":"ContainerStarted","Data":"f10441d7d93b347d0276856ef7fdf2b6fed8a2030f74252fcac16c7a4aa73254"} Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.749144 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:c223309f51714785bd878ad04080f7428567edad793be4f992d492abd77af44c\\\"\"" pod="openstack-operators/swift-operator-controller-manager-677c674df7-426qz" podUID="d7932403-615f-44e4-b195-4a83c19787ba" Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.750115 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-k2rnj" event={"ID":"282f8f05-9a84-4bb4-a122-ba8806324ca3","Type":"ContainerStarted","Data":"f3332faccfc6d1c0624878dd32cf7b5036ae673b0b5f6a4282615ba11799463b"} Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.751363 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-k2rnj" podUID="282f8f05-9a84-4bb4-a122-ba8806324ca3" Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.752875 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-874hd" 
event={"ID":"f37fb9b3-7b07-4188-b9ea-facfa5e945f0","Type":"ContainerStarted","Data":"960f1ff05d90dddfdf3807e6f61172bfd8cdcf0879ca635bac9a60c26d2fc27a"} Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.758019 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-h2vmc" event={"ID":"4d4c74ff-52a2-4426-bd06-daa6e9b1a832","Type":"ContainerStarted","Data":"1d9ba18a437f5f176c6134140198f809d2744bc517a548e8d0f0abe1739bccbd"} Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.763468 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-px2wm" event={"ID":"c28c6622-633e-4e76-9c9a-eb732531fa1a","Type":"ContainerStarted","Data":"c72e7e43d63a0638bc9f93a540d4759ca3542365a1e02bdfa35caf9bb41150ef"} Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.765270 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-7ldx8" event={"ID":"0ddf91ff-6d91-4213-8032-05f80408063d","Type":"ContainerStarted","Data":"351d413bf7d307a846b7ee3c09d2f3de72f144af8aee0b96e7aead7cb3ad5f6a"} Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.768350 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:27c84b712abc2df6108e22636075eec25fea0229800f38594a492fd41b02c49d\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-7ldx8" podUID="0ddf91ff-6d91-4213-8032-05f80408063d" Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.771994 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-wnsst" 
event={"ID":"5d318732-8194-49eb-a2a3-c5b13ce843a7","Type":"ContainerStarted","Data":"4df0eb5dea2ea13c7cee2efe31de48244b2d58a8ae118dd26ce26d06cd4001ab"} Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.774444 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:b99cd5e08bd85c6aaf717519187ba7bfeea359e1537d43b73a7364b7c38116e2\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-wnsst" podUID="5d318732-8194-49eb-a2a3-c5b13ce843a7" Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.784000 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-56fsw" event={"ID":"b16cacfc-8fc3-444d-a2d7-6ffeaf8362d5","Type":"ContainerStarted","Data":"5958551027c88a2fe056b2a53c574ce8e91ed926acaca7d47471f3fb901d2d49"} Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.787201 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-fjkn4" event={"ID":"72237264-5d09-40bd-ba83-f30b76790cb6","Type":"ContainerStarted","Data":"2e2820d9861887138ed0039a8cd963a27c345037072d95c861f80f47f028fbbf"} Mar 11 12:15:25 crc kubenswrapper[4816]: E0311 12:15:25.807426 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:c223309f51714785bd878ad04080f7428567edad793be4f992d492abd77af44c\\\"\"" pod="openstack-operators/swift-operator-controller-manager-677c674df7-426qz" podUID="d7932403-615f-44e4-b195-4a83c19787ba" Mar 11 12:15:25 crc kubenswrapper[4816]: E0311 12:15:25.807943 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off 
pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-k2rnj" podUID="282f8f05-9a84-4bb4-a122-ba8806324ca3" Mar 11 12:15:25 crc kubenswrapper[4816]: E0311 12:15:25.808605 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-rr62t" podUID="6bbceab2-fe2b-4693-867d-aa2a51261611" Mar 11 12:15:25 crc kubenswrapper[4816]: E0311 12:15:25.808686 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-kx9nz" podUID="4126be7d-7ca8-4e68-94d4-ea21644fbd85" Mar 11 12:15:25 crc kubenswrapper[4816]: E0311 12:15:25.808906 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:27c84b712abc2df6108e22636075eec25fea0229800f38594a492fd41b02c49d\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-7ldx8" podUID="0ddf91ff-6d91-4213-8032-05f80408063d" Mar 11 12:15:25 crc kubenswrapper[4816]: E0311 12:15:25.809601 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:b99cd5e08bd85c6aaf717519187ba7bfeea359e1537d43b73a7364b7c38116e2\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-wnsst" podUID="5d318732-8194-49eb-a2a3-c5b13ce843a7" Mar 11 12:15:25 crc kubenswrapper[4816]: E0311 12:15:25.811723 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:2bd37bdd917e3abe72613a734ce5021330242ec8cae9b8da76c57a0765152922\\\"\"" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-rxhkb" podUID="d1702062-37ba-43c0-becb-005e11f457a0" Mar 11 12:15:25 crc kubenswrapper[4816]: I0311 12:15:25.961990 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a605e964-6e3c-4639-95d5-908f5d0ab7ef-cert\") pod \"infra-operator-controller-manager-5995f4446f-hzd9q\" (UID: \"a605e964-6e3c-4639-95d5-908f5d0ab7ef\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hzd9q" Mar 11 12:15:25 crc kubenswrapper[4816]: E0311 12:15:25.962169 4816 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 11 12:15:25 crc kubenswrapper[4816]: E0311 12:15:25.962237 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a605e964-6e3c-4639-95d5-908f5d0ab7ef-cert podName:a605e964-6e3c-4639-95d5-908f5d0ab7ef nodeName:}" failed. No retries permitted until 2026-03-11 12:15:29.962216137 +0000 UTC m=+1016.553480104 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a605e964-6e3c-4639-95d5-908f5d0ab7ef-cert") pod "infra-operator-controller-manager-5995f4446f-hzd9q" (UID: "a605e964-6e3c-4639-95d5-908f5d0ab7ef") : secret "infra-operator-webhook-server-cert" not found Mar 11 12:15:26 crc kubenswrapper[4816]: I0311 12:15:26.172978 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78a7aebd-70a2-4608-a669-aea496cb6186-cert\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l\" (UID: \"78a7aebd-70a2-4608-a669-aea496cb6186\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l" Mar 11 12:15:26 crc kubenswrapper[4816]: E0311 12:15:26.173129 4816 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 12:15:26 crc kubenswrapper[4816]: E0311 12:15:26.173180 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78a7aebd-70a2-4608-a669-aea496cb6186-cert podName:78a7aebd-70a2-4608-a669-aea496cb6186 nodeName:}" failed. No retries permitted until 2026-03-11 12:15:30.173165815 +0000 UTC m=+1016.764429782 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/78a7aebd-70a2-4608-a669-aea496cb6186-cert") pod "openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l" (UID: "78a7aebd-70a2-4608-a669-aea496cb6186") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 12:15:26 crc kubenswrapper[4816]: I0311 12:15:26.586184 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-webhook-certs\") pod \"openstack-operator-controller-manager-7795b46f77-pt8n6\" (UID: \"5f4b0b09-5704-432a-9cd4-82a296f3c467\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" Mar 11 12:15:26 crc kubenswrapper[4816]: I0311 12:15:26.586234 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-metrics-certs\") pod \"openstack-operator-controller-manager-7795b46f77-pt8n6\" (UID: \"5f4b0b09-5704-432a-9cd4-82a296f3c467\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" Mar 11 12:15:26 crc kubenswrapper[4816]: E0311 12:15:26.586361 4816 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 11 12:15:26 crc kubenswrapper[4816]: E0311 12:15:26.586401 4816 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 11 12:15:26 crc kubenswrapper[4816]: E0311 12:15:26.586443 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-webhook-certs podName:5f4b0b09-5704-432a-9cd4-82a296f3c467 nodeName:}" failed. No retries permitted until 2026-03-11 12:15:30.58642246 +0000 UTC m=+1017.177686417 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-webhook-certs") pod "openstack-operator-controller-manager-7795b46f77-pt8n6" (UID: "5f4b0b09-5704-432a-9cd4-82a296f3c467") : secret "webhook-server-cert" not found Mar 11 12:15:26 crc kubenswrapper[4816]: E0311 12:15:26.586462 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-metrics-certs podName:5f4b0b09-5704-432a-9cd4-82a296f3c467 nodeName:}" failed. No retries permitted until 2026-03-11 12:15:30.586454631 +0000 UTC m=+1017.177718598 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-metrics-certs") pod "openstack-operator-controller-manager-7795b46f77-pt8n6" (UID: "5f4b0b09-5704-432a-9cd4-82a296f3c467") : secret "metrics-server-cert" not found Mar 11 12:15:30 crc kubenswrapper[4816]: I0311 12:15:30.038472 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a605e964-6e3c-4639-95d5-908f5d0ab7ef-cert\") pod \"infra-operator-controller-manager-5995f4446f-hzd9q\" (UID: \"a605e964-6e3c-4639-95d5-908f5d0ab7ef\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hzd9q" Mar 11 12:15:30 crc kubenswrapper[4816]: E0311 12:15:30.038657 4816 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 11 12:15:30 crc kubenswrapper[4816]: E0311 12:15:30.039246 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a605e964-6e3c-4639-95d5-908f5d0ab7ef-cert podName:a605e964-6e3c-4639-95d5-908f5d0ab7ef nodeName:}" failed. No retries permitted until 2026-03-11 12:15:38.039227695 +0000 UTC m=+1024.630491662 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a605e964-6e3c-4639-95d5-908f5d0ab7ef-cert") pod "infra-operator-controller-manager-5995f4446f-hzd9q" (UID: "a605e964-6e3c-4639-95d5-908f5d0ab7ef") : secret "infra-operator-webhook-server-cert" not found Mar 11 12:15:30 crc kubenswrapper[4816]: I0311 12:15:30.242411 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78a7aebd-70a2-4608-a669-aea496cb6186-cert\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l\" (UID: \"78a7aebd-70a2-4608-a669-aea496cb6186\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l" Mar 11 12:15:30 crc kubenswrapper[4816]: E0311 12:15:30.243467 4816 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 12:15:30 crc kubenswrapper[4816]: E0311 12:15:30.243507 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78a7aebd-70a2-4608-a669-aea496cb6186-cert podName:78a7aebd-70a2-4608-a669-aea496cb6186 nodeName:}" failed. No retries permitted until 2026-03-11 12:15:38.243493799 +0000 UTC m=+1024.834757766 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/78a7aebd-70a2-4608-a669-aea496cb6186-cert") pod "openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l" (UID: "78a7aebd-70a2-4608-a669-aea496cb6186") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 12:15:30 crc kubenswrapper[4816]: I0311 12:15:30.650848 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-webhook-certs\") pod \"openstack-operator-controller-manager-7795b46f77-pt8n6\" (UID: \"5f4b0b09-5704-432a-9cd4-82a296f3c467\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" Mar 11 12:15:30 crc kubenswrapper[4816]: I0311 12:15:30.650896 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-metrics-certs\") pod \"openstack-operator-controller-manager-7795b46f77-pt8n6\" (UID: \"5f4b0b09-5704-432a-9cd4-82a296f3c467\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" Mar 11 12:15:30 crc kubenswrapper[4816]: E0311 12:15:30.651065 4816 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 11 12:15:30 crc kubenswrapper[4816]: E0311 12:15:30.651072 4816 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 11 12:15:30 crc kubenswrapper[4816]: E0311 12:15:30.651123 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-metrics-certs podName:5f4b0b09-5704-432a-9cd4-82a296f3c467 nodeName:}" failed. No retries permitted until 2026-03-11 12:15:38.65110797 +0000 UTC m=+1025.242371937 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-metrics-certs") pod "openstack-operator-controller-manager-7795b46f77-pt8n6" (UID: "5f4b0b09-5704-432a-9cd4-82a296f3c467") : secret "metrics-server-cert" not found Mar 11 12:15:30 crc kubenswrapper[4816]: E0311 12:15:30.651151 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-webhook-certs podName:5f4b0b09-5704-432a-9cd4-82a296f3c467 nodeName:}" failed. No retries permitted until 2026-03-11 12:15:38.651132841 +0000 UTC m=+1025.242396808 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-webhook-certs") pod "openstack-operator-controller-manager-7795b46f77-pt8n6" (UID: "5f4b0b09-5704-432a-9cd4-82a296f3c467") : secret "webhook-server-cert" not found Mar 11 12:15:36 crc kubenswrapper[4816]: E0311 12:15:36.373231 4816 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:5fe5351a3de5e1267112d52cd81477a01d47f90be713cc5439c76543a4c33721" Mar 11 12:15:36 crc kubenswrapper[4816]: E0311 12:15:36.373907 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:5fe5351a3de5e1267112d52cd81477a01d47f90be713cc5439c76543a4c33721,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-58dj6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-776c5696bf-h2vmc_openstack-operators(4d4c74ff-52a2-4426-bd06-daa6e9b1a832): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 11 12:15:36 crc kubenswrapper[4816]: E0311 12:15:36.375310 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-h2vmc" podUID="4d4c74ff-52a2-4426-bd06-daa6e9b1a832" Mar 11 12:15:36 crc kubenswrapper[4816]: E0311 12:15:36.908957 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:5fe5351a3de5e1267112d52cd81477a01d47f90be713cc5439c76543a4c33721\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-h2vmc" podUID="4d4c74ff-52a2-4426-bd06-daa6e9b1a832" Mar 11 12:15:37 crc kubenswrapper[4816]: E0311 12:15:37.041545 4816 log.go:32] "PullImage from image service 
failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:18fe6f2f0be7e736db86ff2d600af12a753e14b0a03232ce4f03629a89905571" Mar 11 12:15:37 crc kubenswrapper[4816]: E0311 12:15:37.041758 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:18fe6f2f0be7e736db86ff2d600af12a753e14b0a03232ce4f03629a89905571,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9b9sn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-5f4f55cb5c-56fsw_openstack-operators(b16cacfc-8fc3-444d-a2d7-6ffeaf8362d5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 11 12:15:37 crc kubenswrapper[4816]: E0311 12:15:37.042865 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-56fsw" podUID="b16cacfc-8fc3-444d-a2d7-6ffeaf8362d5" Mar 11 12:15:37 crc kubenswrapper[4816]: E0311 12:15:37.915637 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:18fe6f2f0be7e736db86ff2d600af12a753e14b0a03232ce4f03629a89905571\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-56fsw" podUID="b16cacfc-8fc3-444d-a2d7-6ffeaf8362d5" Mar 11 12:15:38 crc kubenswrapper[4816]: I0311 12:15:38.075893 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a605e964-6e3c-4639-95d5-908f5d0ab7ef-cert\") pod \"infra-operator-controller-manager-5995f4446f-hzd9q\" (UID: \"a605e964-6e3c-4639-95d5-908f5d0ab7ef\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hzd9q" Mar 11 12:15:38 crc kubenswrapper[4816]: I0311 12:15:38.085177 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a605e964-6e3c-4639-95d5-908f5d0ab7ef-cert\") pod \"infra-operator-controller-manager-5995f4446f-hzd9q\" (UID: \"a605e964-6e3c-4639-95d5-908f5d0ab7ef\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hzd9q" Mar 11 12:15:38 crc kubenswrapper[4816]: I0311 12:15:38.278601 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78a7aebd-70a2-4608-a669-aea496cb6186-cert\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l\" (UID: \"78a7aebd-70a2-4608-a669-aea496cb6186\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l" Mar 11 12:15:38 crc kubenswrapper[4816]: E0311 12:15:38.278834 4816 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 12:15:38 crc kubenswrapper[4816]: E0311 12:15:38.279461 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78a7aebd-70a2-4608-a669-aea496cb6186-cert podName:78a7aebd-70a2-4608-a669-aea496cb6186 nodeName:}" failed. No retries permitted until 2026-03-11 12:15:54.279436905 +0000 UTC m=+1040.870700872 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/78a7aebd-70a2-4608-a669-aea496cb6186-cert") pod "openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l" (UID: "78a7aebd-70a2-4608-a669-aea496cb6186") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 12:15:38 crc kubenswrapper[4816]: I0311 12:15:38.321345 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-vxmbd" Mar 11 12:15:38 crc kubenswrapper[4816]: I0311 12:15:38.330433 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hzd9q" Mar 11 12:15:38 crc kubenswrapper[4816]: I0311 12:15:38.685150 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-webhook-certs\") pod \"openstack-operator-controller-manager-7795b46f77-pt8n6\" (UID: \"5f4b0b09-5704-432a-9cd4-82a296f3c467\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" Mar 11 12:15:38 crc kubenswrapper[4816]: I0311 12:15:38.685208 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-metrics-certs\") pod \"openstack-operator-controller-manager-7795b46f77-pt8n6\" (UID: \"5f4b0b09-5704-432a-9cd4-82a296f3c467\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" Mar 11 12:15:38 crc kubenswrapper[4816]: E0311 12:15:38.685314 4816 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 11 12:15:38 crc kubenswrapper[4816]: E0311 12:15:38.685379 4816 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-webhook-certs podName:5f4b0b09-5704-432a-9cd4-82a296f3c467 nodeName:}" failed. No retries permitted until 2026-03-11 12:15:54.685360678 +0000 UTC m=+1041.276624645 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-webhook-certs") pod "openstack-operator-controller-manager-7795b46f77-pt8n6" (UID: "5f4b0b09-5704-432a-9cd4-82a296f3c467") : secret "webhook-server-cert" not found Mar 11 12:15:38 crc kubenswrapper[4816]: I0311 12:15:38.697552 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-metrics-certs\") pod \"openstack-operator-controller-manager-7795b46f77-pt8n6\" (UID: \"5f4b0b09-5704-432a-9cd4-82a296f3c467\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" Mar 11 12:15:39 crc kubenswrapper[4816]: E0311 12:15:39.388557 4816 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca" Mar 11 12:15:39 crc kubenswrapper[4816]: E0311 12:15:39.389408 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nhgkz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-684f77d66d-zczdq_openstack-operators(73e00d02-6599-4cab-a32b-8fe96b82951a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 11 12:15:39 crc kubenswrapper[4816]: E0311 12:15:39.390549 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-zczdq" podUID="73e00d02-6599-4cab-a32b-8fe96b82951a" Mar 11 12:15:39 crc kubenswrapper[4816]: I0311 12:15:39.515446 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:15:39 crc kubenswrapper[4816]: I0311 12:15:39.515501 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:15:39 crc kubenswrapper[4816]: I0311 12:15:39.515538 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:15:39 crc kubenswrapper[4816]: I0311 12:15:39.515952 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"13e7eed3f44dcb7bba59d21f6a1bb4bc9f4b869b7a25106a79ff8ceef1b9e507"} pod="openshift-machine-config-operator/machine-config-daemon-b4v82" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 12:15:39 crc kubenswrapper[4816]: I0311 12:15:39.516006 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" containerID="cri-o://13e7eed3f44dcb7bba59d21f6a1bb4bc9f4b869b7a25106a79ff8ceef1b9e507" gracePeriod=600 Mar 11 12:15:39 crc kubenswrapper[4816]: E0311 12:15:39.882329 4816 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Mar 11 12:15:39 crc kubenswrapper[4816]: E0311 12:15:39.882614 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8j895,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-dnqpf_openstack-operators(8e810ef6-d3f5-4133-bce2-234df32b3d10): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 11 12:15:39 crc kubenswrapper[4816]: E0311 12:15:39.883901 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dnqpf" podUID="8e810ef6-d3f5-4133-bce2-234df32b3d10" Mar 11 12:15:39 crc kubenswrapper[4816]: I0311 12:15:39.939317 4816 generic.go:334] "Generic (PLEG): container finished" podID="7fdff21c-644f-4443-a268-f98c91ea120a" containerID="13e7eed3f44dcb7bba59d21f6a1bb4bc9f4b869b7a25106a79ff8ceef1b9e507" exitCode=0 Mar 11 12:15:39 crc kubenswrapper[4816]: I0311 12:15:39.939381 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerDied","Data":"13e7eed3f44dcb7bba59d21f6a1bb4bc9f4b869b7a25106a79ff8ceef1b9e507"} Mar 11 12:15:39 crc kubenswrapper[4816]: I0311 12:15:39.939487 4816 scope.go:117] "RemoveContainer" containerID="45ccbed932001dc629a77de7e08e04a9cce25a78ac1e00aed407f7f4e1fa93a3" Mar 11 12:15:39 crc kubenswrapper[4816]: E0311 12:15:39.946388 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-zczdq" podUID="73e00d02-6599-4cab-a32b-8fe96b82951a" Mar 11 12:15:39 crc kubenswrapper[4816]: E0311 12:15:39.946502 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dnqpf" podUID="8e810ef6-d3f5-4133-bce2-234df32b3d10" Mar 11 12:15:40 crc kubenswrapper[4816]: I0311 12:15:40.278193 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-hzd9q"] Mar 11 12:15:41 crc kubenswrapper[4816]: W0311 12:15:41.711024 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda605e964_6e3c_4639_95d5_908f5d0ab7ef.slice/crio-feb5aeba86168a60d262043807101ac42fc06cd533d933f3e3c564d62c04538d WatchSource:0}: Error finding container feb5aeba86168a60d262043807101ac42fc06cd533d933f3e3c564d62c04538d: Status 404 returned error can't find the container with id feb5aeba86168a60d262043807101ac42fc06cd533d933f3e3c564d62c04538d Mar 11 12:15:41 crc kubenswrapper[4816]: I0311 12:15:41.962601 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hzd9q" event={"ID":"a605e964-6e3c-4639-95d5-908f5d0ab7ef","Type":"ContainerStarted","Data":"feb5aeba86168a60d262043807101ac42fc06cd533d933f3e3c564d62c04538d"} Mar 11 12:15:43 crc kubenswrapper[4816]: I0311 12:15:43.982299 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-px2wm" event={"ID":"c28c6622-633e-4e76-9c9a-eb732531fa1a","Type":"ContainerStarted","Data":"c8430addf987a22f8b7f9cc01817da9117aa40bc7cc4e4054f46452d139bc56d"} Mar 11 12:15:43 crc kubenswrapper[4816]: I0311 12:15:43.982658 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-px2wm" Mar 11 12:15:43 crc kubenswrapper[4816]: I0311 12:15:43.984468 4816 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerStarted","Data":"92bc406893843c03ac9aa6138b10c838c501d62aa37baf4b9b92254baf796e96"} Mar 11 12:15:44 crc kubenswrapper[4816]: I0311 12:15:44.004821 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-px2wm" podStartSLOduration=6.848546235 podStartE2EDuration="23.00479847s" podCreationTimestamp="2026-03-11 12:15:21 +0000 UTC" firstStartedPulling="2026-03-11 12:15:23.706917619 +0000 UTC m=+1010.298181586" lastFinishedPulling="2026-03-11 12:15:39.863169854 +0000 UTC m=+1026.454433821" observedRunningTime="2026-03-11 12:15:43.995681105 +0000 UTC m=+1030.586945072" watchObservedRunningTime="2026-03-11 12:15:44.00479847 +0000 UTC m=+1030.596062437" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.010278 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-wnsst" event={"ID":"5d318732-8194-49eb-a2a3-c5b13ce843a7","Type":"ContainerStarted","Data":"abbfd80f58229b6d72ae7f8d78676a114f0f8da1cd2b73f61e3e29cae274284e"} Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.011996 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-wnsst" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.013549 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-fjkn4" event={"ID":"72237264-5d09-40bd-ba83-f30b76790cb6","Type":"ContainerStarted","Data":"942341a5281631241462d79a359f97f11ed7566ccb1699d0364551d4d5ef5be0"} Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.013964 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-fjkn4" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.015824 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-k2rnj" event={"ID":"282f8f05-9a84-4bb4-a122-ba8806324ca3","Type":"ContainerStarted","Data":"5d515bb111e24b12c705fd89c78f2c56f37c61e64977287b82d743ee6f0d6fcf"} Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.016427 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-k2rnj" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.017729 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-kx9nz" event={"ID":"4126be7d-7ca8-4e68-94d4-ea21644fbd85","Type":"ContainerStarted","Data":"2b476045814439037242972d218641bae9ea40b03ddb8c8b0d4749bffcc6a4d0"} Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.018130 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-kx9nz" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.019332 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-677c674df7-426qz" event={"ID":"d7932403-615f-44e4-b195-4a83c19787ba","Type":"ContainerStarted","Data":"04b4f259d59a13b69141b50f407751c9a85a711831daa0770f651c45421e2ed1"} Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.019478 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-677c674df7-426qz" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.020339 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-66ctj" 
event={"ID":"b941b0f1-4a8f-4517-af46-cc77892fe3d9","Type":"ContainerStarted","Data":"659931bfb6136077eba8a1ce9f1ad5c42f7a0daa2b4d85ec12342fe8e643d9a1"} Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.020682 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-66ctj" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.022600 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-rr62t" event={"ID":"6bbceab2-fe2b-4693-867d-aa2a51261611","Type":"ContainerStarted","Data":"cfaaefd9338b4d804d9d444e4f30c8446fc9961809ff5cff4353ee48c602e8b0"} Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.022765 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-rr62t" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.024121 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-7ldx8" event={"ID":"0ddf91ff-6d91-4213-8032-05f80408063d","Type":"ContainerStarted","Data":"dd5fef126a8fd8da0671e4f30b167fffe0b57b171babb991338633f629165ad9"} Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.024568 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-7ldx8" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.025897 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-8v46x" event={"ID":"9e0c8832-9c20-44a9-933c-4a7fff032367","Type":"ContainerStarted","Data":"691fface6b191965f93bb720406f44bbae035a07c03ac4459acb5d5a0c9b2faf"} Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.026261 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-8v46x" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.027334 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-rb228" event={"ID":"a8133b64-eb11-43ad-bf6e-a278af0ff466","Type":"ContainerStarted","Data":"d419c87fd8a27b5e0600fdd812f665f40abb8b3227009bcd8a1a0f3bc3f08690"} Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.027707 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-rb228" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.028678 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hzd9q" event={"ID":"a605e964-6e3c-4639-95d5-908f5d0ab7ef","Type":"ContainerStarted","Data":"b51333b0db15e7eddfdf1012e14dd0d99fc098612ec285dc040a9637da6c5c69"} Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.029070 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hzd9q" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.032441 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-874hd" event={"ID":"f37fb9b3-7b07-4188-b9ea-facfa5e945f0","Type":"ContainerStarted","Data":"3925199c7896620f3767b243b460c1bd6950fae8f4b788cafd55b81e4d76d5ec"} Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.032959 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-874hd" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.035749 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-bl9hm" 
event={"ID":"bcfe1f90-2b5f-43b7-b798-0bad62ec53b2","Type":"ContainerStarted","Data":"0fd530b9837f3faa33a1e00ce94b6d75a468faab7a778abf88098233d30b4597"} Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.036040 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-bl9hm" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.040758 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-g8cg2" event={"ID":"6311ca5f-6f4c-4768-ae5e-75128be7f589","Type":"ContainerStarted","Data":"a47d527368a9a736ee8d9a5f49880bf1618c7aeef362a98017cef5b3e1d3d239"} Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.040897 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-g8cg2" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.047194 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-rxhkb" event={"ID":"d1702062-37ba-43c0-becb-005e11f457a0","Type":"ContainerStarted","Data":"9742b5fc0c9cc8109db8e87af51553670053bcde78357c7550eeee43432ad9f9"} Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.047885 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-rxhkb" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.048959 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-h7kgb" event={"ID":"e04ad395-8120-4c57-8575-611fa438e8fb","Type":"ContainerStarted","Data":"f1807fc6bfaf31ecad9a00ec9b1dc5ed0820d9d578634c1d64b844e0387a9479"} Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.049374 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/placement-operator-controller-manager-574d45c66c-h7kgb" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.163201 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-8v46x" podStartSLOduration=9.072622486 podStartE2EDuration="25.163182152s" podCreationTimestamp="2026-03-11 12:15:22 +0000 UTC" firstStartedPulling="2026-03-11 12:15:23.775009577 +0000 UTC m=+1010.366273544" lastFinishedPulling="2026-03-11 12:15:39.865569243 +0000 UTC m=+1026.456833210" observedRunningTime="2026-03-11 12:15:47.160270297 +0000 UTC m=+1033.751534264" watchObservedRunningTime="2026-03-11 12:15:47.163182152 +0000 UTC m=+1033.754446119" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.164073 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-wnsst" podStartSLOduration=3.279869286 podStartE2EDuration="25.164067068s" podCreationTimestamp="2026-03-11 12:15:22 +0000 UTC" firstStartedPulling="2026-03-11 12:15:24.162064202 +0000 UTC m=+1010.753328169" lastFinishedPulling="2026-03-11 12:15:46.046261984 +0000 UTC m=+1032.637525951" observedRunningTime="2026-03-11 12:15:47.095110774 +0000 UTC m=+1033.686374741" watchObservedRunningTime="2026-03-11 12:15:47.164067068 +0000 UTC m=+1033.755331025" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.207449 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-rb228" podStartSLOduration=7.52338157 podStartE2EDuration="26.207432677s" podCreationTimestamp="2026-03-11 12:15:21 +0000 UTC" firstStartedPulling="2026-03-11 12:15:23.771232408 +0000 UTC m=+1010.362496375" lastFinishedPulling="2026-03-11 12:15:42.455283515 +0000 UTC m=+1029.046547482" observedRunningTime="2026-03-11 12:15:47.204679287 +0000 UTC m=+1033.795943244" 
watchObservedRunningTime="2026-03-11 12:15:47.207432677 +0000 UTC m=+1033.798696644" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.236915 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-g8cg2" podStartSLOduration=9.729260721 podStartE2EDuration="26.236892123s" podCreationTimestamp="2026-03-11 12:15:21 +0000 UTC" firstStartedPulling="2026-03-11 12:15:23.344360517 +0000 UTC m=+1009.935624494" lastFinishedPulling="2026-03-11 12:15:39.851991929 +0000 UTC m=+1026.443255896" observedRunningTime="2026-03-11 12:15:47.230548179 +0000 UTC m=+1033.821812146" watchObservedRunningTime="2026-03-11 12:15:47.236892123 +0000 UTC m=+1033.828156090" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.318940 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-874hd" podStartSLOduration=9.593496628 podStartE2EDuration="25.318921146s" podCreationTimestamp="2026-03-11 12:15:22 +0000 UTC" firstStartedPulling="2026-03-11 12:15:24.126678304 +0000 UTC m=+1010.717942271" lastFinishedPulling="2026-03-11 12:15:39.852102822 +0000 UTC m=+1026.443366789" observedRunningTime="2026-03-11 12:15:47.308909355 +0000 UTC m=+1033.900173322" watchObservedRunningTime="2026-03-11 12:15:47.318921146 +0000 UTC m=+1033.910185113" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.361489 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-677c674df7-426qz" podStartSLOduration=3.415124363 podStartE2EDuration="25.361469592s" podCreationTimestamp="2026-03-11 12:15:22 +0000 UTC" firstStartedPulling="2026-03-11 12:15:24.188285553 +0000 UTC m=+1010.779549520" lastFinishedPulling="2026-03-11 12:15:46.134630772 +0000 UTC m=+1032.725894749" observedRunningTime="2026-03-11 12:15:47.359538006 +0000 UTC m=+1033.950801973" 
watchObservedRunningTime="2026-03-11 12:15:47.361469592 +0000 UTC m=+1033.952733559" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.405162 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-h7kgb" podStartSLOduration=7.03112657 podStartE2EDuration="25.405144941s" podCreationTimestamp="2026-03-11 12:15:22 +0000 UTC" firstStartedPulling="2026-03-11 12:15:24.14102001 +0000 UTC m=+1010.732283977" lastFinishedPulling="2026-03-11 12:15:42.515038381 +0000 UTC m=+1029.106302348" observedRunningTime="2026-03-11 12:15:47.403315388 +0000 UTC m=+1033.994579345" watchObservedRunningTime="2026-03-11 12:15:47.405144941 +0000 UTC m=+1033.996408908" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.453764 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-66ctj" podStartSLOduration=7.746052859 podStartE2EDuration="26.453747183s" podCreationTimestamp="2026-03-11 12:15:21 +0000 UTC" firstStartedPulling="2026-03-11 12:15:23.747588381 +0000 UTC m=+1010.338852348" lastFinishedPulling="2026-03-11 12:15:42.455282705 +0000 UTC m=+1029.046546672" observedRunningTime="2026-03-11 12:15:47.449889391 +0000 UTC m=+1034.041153358" watchObservedRunningTime="2026-03-11 12:15:47.453747183 +0000 UTC m=+1034.045011150" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.496785 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hzd9q" podStartSLOduration=21.165345343 podStartE2EDuration="25.496766303s" podCreationTimestamp="2026-03-11 12:15:22 +0000 UTC" firstStartedPulling="2026-03-11 12:15:41.71468235 +0000 UTC m=+1028.305946317" lastFinishedPulling="2026-03-11 12:15:46.04610331 +0000 UTC m=+1032.637367277" observedRunningTime="2026-03-11 12:15:47.492518649 +0000 UTC m=+1034.083782616" 
watchObservedRunningTime="2026-03-11 12:15:47.496766303 +0000 UTC m=+1034.088030270" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.541598 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-k2rnj" podStartSLOduration=6.262499231 podStartE2EDuration="25.541583175s" podCreationTimestamp="2026-03-11 12:15:22 +0000 UTC" firstStartedPulling="2026-03-11 12:15:24.149792425 +0000 UTC m=+1010.741056382" lastFinishedPulling="2026-03-11 12:15:43.428876359 +0000 UTC m=+1030.020140326" observedRunningTime="2026-03-11 12:15:47.540834153 +0000 UTC m=+1034.132098120" watchObservedRunningTime="2026-03-11 12:15:47.541583175 +0000 UTC m=+1034.132847142" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.644575 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-7ldx8" podStartSLOduration=7.258962549 podStartE2EDuration="25.644556066s" podCreationTimestamp="2026-03-11 12:15:22 +0000 UTC" firstStartedPulling="2026-03-11 12:15:24.21707642 +0000 UTC m=+1010.808340387" lastFinishedPulling="2026-03-11 12:15:42.602669937 +0000 UTC m=+1029.193933904" observedRunningTime="2026-03-11 12:15:47.572553425 +0000 UTC m=+1034.163817392" watchObservedRunningTime="2026-03-11 12:15:47.644556066 +0000 UTC m=+1034.235820033" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.671897 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-fjkn4" podStartSLOduration=7.996287689 podStartE2EDuration="26.67187528s" podCreationTimestamp="2026-03-11 12:15:21 +0000 UTC" firstStartedPulling="2026-03-11 12:15:23.779801577 +0000 UTC m=+1010.371065544" lastFinishedPulling="2026-03-11 12:15:42.455389168 +0000 UTC m=+1029.046653135" observedRunningTime="2026-03-11 12:15:47.646772241 +0000 UTC m=+1034.238036198" 
watchObservedRunningTime="2026-03-11 12:15:47.67187528 +0000 UTC m=+1034.263139247" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.675260 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-bl9hm" podStartSLOduration=9.107292824 podStartE2EDuration="25.675235458s" podCreationTimestamp="2026-03-11 12:15:22 +0000 UTC" firstStartedPulling="2026-03-11 12:15:24.149422424 +0000 UTC m=+1010.740686391" lastFinishedPulling="2026-03-11 12:15:40.717365058 +0000 UTC m=+1027.308629025" observedRunningTime="2026-03-11 12:15:47.670577962 +0000 UTC m=+1034.261841929" watchObservedRunningTime="2026-03-11 12:15:47.675235458 +0000 UTC m=+1034.266499425" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.701331 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-rr62t" podStartSLOduration=7.282537434 podStartE2EDuration="25.701315215s" podCreationTimestamp="2026-03-11 12:15:22 +0000 UTC" firstStartedPulling="2026-03-11 12:15:24.184244136 +0000 UTC m=+1010.775508103" lastFinishedPulling="2026-03-11 12:15:42.603021917 +0000 UTC m=+1029.194285884" observedRunningTime="2026-03-11 12:15:47.697677699 +0000 UTC m=+1034.288941666" watchObservedRunningTime="2026-03-11 12:15:47.701315215 +0000 UTC m=+1034.292579182" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.760569 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-kx9nz" podStartSLOduration=3.924233864 podStartE2EDuration="25.760549896s" podCreationTimestamp="2026-03-11 12:15:22 +0000 UTC" firstStartedPulling="2026-03-11 12:15:24.248584395 +0000 UTC m=+1010.839848362" lastFinishedPulling="2026-03-11 12:15:46.084900437 +0000 UTC m=+1032.676164394" observedRunningTime="2026-03-11 12:15:47.757987551 +0000 UTC m=+1034.349251518" 
watchObservedRunningTime="2026-03-11 12:15:47.760549896 +0000 UTC m=+1034.351813863" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.761129 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-rxhkb" podStartSLOduration=7.390835301 podStartE2EDuration="25.761122523s" podCreationTimestamp="2026-03-11 12:15:22 +0000 UTC" firstStartedPulling="2026-03-11 12:15:24.232985602 +0000 UTC m=+1010.824249579" lastFinishedPulling="2026-03-11 12:15:42.603272834 +0000 UTC m=+1029.194536801" observedRunningTime="2026-03-11 12:15:47.732183732 +0000 UTC m=+1034.323447689" watchObservedRunningTime="2026-03-11 12:15:47.761122523 +0000 UTC m=+1034.352386490" Mar 11 12:15:52 crc kubenswrapper[4816]: I0311 12:15:52.101873 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-h2vmc" event={"ID":"4d4c74ff-52a2-4426-bd06-daa6e9b1a832","Type":"ContainerStarted","Data":"1499fc908d58d80a97b46f8ad0f79220c2cf1db2554d3028ff7b0f181f57d55e"} Mar 11 12:15:52 crc kubenswrapper[4816]: I0311 12:15:52.102610 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-h2vmc" Mar 11 12:15:52 crc kubenswrapper[4816]: I0311 12:15:52.106588 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dnqpf" event={"ID":"8e810ef6-d3f5-4133-bce2-234df32b3d10","Type":"ContainerStarted","Data":"d53f55a3ba7f10ddf2d38af4360c3e85b1eb5efe7e8538fc71ea168aeadcf553"} Mar 11 12:15:52 crc kubenswrapper[4816]: I0311 12:15:52.120216 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-h2vmc" podStartSLOduration=2.28625565 podStartE2EDuration="30.120189264s" podCreationTimestamp="2026-03-11 12:15:22 +0000 UTC" 
firstStartedPulling="2026-03-11 12:15:24.020210881 +0000 UTC m=+1010.611474858" lastFinishedPulling="2026-03-11 12:15:51.854144495 +0000 UTC m=+1038.445408472" observedRunningTime="2026-03-11 12:15:52.119099612 +0000 UTC m=+1038.710363599" watchObservedRunningTime="2026-03-11 12:15:52.120189264 +0000 UTC m=+1038.711453241" Mar 11 12:15:52 crc kubenswrapper[4816]: I0311 12:15:52.145019 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dnqpf" podStartSLOduration=2.506327663 podStartE2EDuration="30.144990265s" podCreationTimestamp="2026-03-11 12:15:22 +0000 UTC" firstStartedPulling="2026-03-11 12:15:24.126968902 +0000 UTC m=+1010.718232869" lastFinishedPulling="2026-03-11 12:15:51.765631494 +0000 UTC m=+1038.356895471" observedRunningTime="2026-03-11 12:15:52.138444414 +0000 UTC m=+1038.729708391" watchObservedRunningTime="2026-03-11 12:15:52.144990265 +0000 UTC m=+1038.736254252" Mar 11 12:15:52 crc kubenswrapper[4816]: I0311 12:15:52.246993 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-rb228" Mar 11 12:15:52 crc kubenswrapper[4816]: I0311 12:15:52.252986 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-g8cg2" Mar 11 12:15:52 crc kubenswrapper[4816]: I0311 12:15:52.288134 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-fjkn4" Mar 11 12:15:52 crc kubenswrapper[4816]: I0311 12:15:52.361094 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-px2wm" Mar 11 12:15:52 crc kubenswrapper[4816]: I0311 12:15:52.390529 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/heat-operator-controller-manager-77b6666d85-66ctj" Mar 11 12:15:52 crc kubenswrapper[4816]: I0311 12:15:52.405564 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-8v46x" Mar 11 12:15:52 crc kubenswrapper[4816]: I0311 12:15:52.454614 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-874hd" Mar 11 12:15:52 crc kubenswrapper[4816]: I0311 12:15:52.670793 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-wnsst" Mar 11 12:15:52 crc kubenswrapper[4816]: I0311 12:15:52.681239 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-bl9hm" Mar 11 12:15:52 crc kubenswrapper[4816]: I0311 12:15:52.714975 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-rxhkb" Mar 11 12:15:52 crc kubenswrapper[4816]: I0311 12:15:52.737995 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-rr62t" Mar 11 12:15:52 crc kubenswrapper[4816]: I0311 12:15:52.768290 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-h7kgb" Mar 11 12:15:52 crc kubenswrapper[4816]: I0311 12:15:52.827557 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-7ldx8" Mar 11 12:15:52 crc kubenswrapper[4816]: I0311 12:15:52.856519 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/swift-operator-controller-manager-677c674df7-426qz" Mar 11 12:15:53 crc kubenswrapper[4816]: I0311 12:15:53.122451 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-zczdq" event={"ID":"73e00d02-6599-4cab-a32b-8fe96b82951a","Type":"ContainerStarted","Data":"4d2d5d8b030b845e614b1392c2879fe10232fb99a2bd70f5e27a764161adf148"} Mar 11 12:15:53 crc kubenswrapper[4816]: I0311 12:15:53.122650 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-zczdq" Mar 11 12:15:53 crc kubenswrapper[4816]: I0311 12:15:53.123677 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-56fsw" event={"ID":"b16cacfc-8fc3-444d-a2d7-6ffeaf8362d5","Type":"ContainerStarted","Data":"214ad0558a154141454764000bc45f7800c0efc1fa8bb9b88e0d9c3486659348"} Mar 11 12:15:53 crc kubenswrapper[4816]: I0311 12:15:53.123848 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-56fsw" Mar 11 12:15:53 crc kubenswrapper[4816]: I0311 12:15:53.153334 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-zczdq" podStartSLOduration=2.296183687 podStartE2EDuration="31.153317327s" podCreationTimestamp="2026-03-11 12:15:22 +0000 UTC" firstStartedPulling="2026-03-11 12:15:23.758104516 +0000 UTC m=+1010.349368483" lastFinishedPulling="2026-03-11 12:15:52.615238156 +0000 UTC m=+1039.206502123" observedRunningTime="2026-03-11 12:15:53.152919626 +0000 UTC m=+1039.744183603" watchObservedRunningTime="2026-03-11 12:15:53.153317327 +0000 UTC m=+1039.744581294" Mar 11 12:15:53 crc kubenswrapper[4816]: I0311 12:15:53.159718 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-k2rnj" Mar 11 12:15:53 crc kubenswrapper[4816]: I0311 12:15:53.170584 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-56fsw" podStartSLOduration=2.700443972 podStartE2EDuration="31.170568968s" podCreationTimestamp="2026-03-11 12:15:22 +0000 UTC" firstStartedPulling="2026-03-11 12:15:24.145888462 +0000 UTC m=+1010.737152429" lastFinishedPulling="2026-03-11 12:15:52.616013448 +0000 UTC m=+1039.207277425" observedRunningTime="2026-03-11 12:15:53.168307383 +0000 UTC m=+1039.759571350" watchObservedRunningTime="2026-03-11 12:15:53.170568968 +0000 UTC m=+1039.761832935" Mar 11 12:15:53 crc kubenswrapper[4816]: I0311 12:15:53.212824 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-kx9nz" Mar 11 12:15:54 crc kubenswrapper[4816]: I0311 12:15:54.375743 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78a7aebd-70a2-4608-a669-aea496cb6186-cert\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l\" (UID: \"78a7aebd-70a2-4608-a669-aea496cb6186\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l" Mar 11 12:15:54 crc kubenswrapper[4816]: I0311 12:15:54.395488 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78a7aebd-70a2-4608-a669-aea496cb6186-cert\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l\" (UID: \"78a7aebd-70a2-4608-a669-aea496cb6186\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l" Mar 11 12:15:54 crc kubenswrapper[4816]: I0311 12:15:54.622435 4816 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-8r4xj" Mar 11 12:15:54 crc kubenswrapper[4816]: I0311 12:15:54.641758 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l" Mar 11 12:15:54 crc kubenswrapper[4816]: I0311 12:15:54.784535 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-webhook-certs\") pod \"openstack-operator-controller-manager-7795b46f77-pt8n6\" (UID: \"5f4b0b09-5704-432a-9cd4-82a296f3c467\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" Mar 11 12:15:54 crc kubenswrapper[4816]: I0311 12:15:54.797295 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-webhook-certs\") pod \"openstack-operator-controller-manager-7795b46f77-pt8n6\" (UID: \"5f4b0b09-5704-432a-9cd4-82a296f3c467\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" Mar 11 12:15:54 crc kubenswrapper[4816]: I0311 12:15:54.840665 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-vwbwj" Mar 11 12:15:54 crc kubenswrapper[4816]: I0311 12:15:54.843382 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" Mar 11 12:15:54 crc kubenswrapper[4816]: I0311 12:15:54.992284 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l"] Mar 11 12:15:55 crc kubenswrapper[4816]: I0311 12:15:55.151316 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l" event={"ID":"78a7aebd-70a2-4608-a669-aea496cb6186","Type":"ContainerStarted","Data":"ff8e23ab061531f91817bf87163f31937a8930b4b4b91fa3df169233f32e38c0"} Mar 11 12:15:55 crc kubenswrapper[4816]: I0311 12:15:55.170041 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6"] Mar 11 12:15:55 crc kubenswrapper[4816]: W0311 12:15:55.179389 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f4b0b09_5704_432a_9cd4_82a296f3c467.slice/crio-44c4bbb51f6801e313e560f23d15f817744193d1651889d60124f9eac4783cc8 WatchSource:0}: Error finding container 44c4bbb51f6801e313e560f23d15f817744193d1651889d60124f9eac4783cc8: Status 404 returned error can't find the container with id 44c4bbb51f6801e313e560f23d15f817744193d1651889d60124f9eac4783cc8 Mar 11 12:15:56 crc kubenswrapper[4816]: I0311 12:15:56.159101 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" event={"ID":"5f4b0b09-5704-432a-9cd4-82a296f3c467","Type":"ContainerStarted","Data":"44c4bbb51f6801e313e560f23d15f817744193d1651889d60124f9eac4783cc8"} Mar 11 12:15:58 crc kubenswrapper[4816]: I0311 12:15:58.339479 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hzd9q" Mar 11 12:16:00 crc 
kubenswrapper[4816]: I0311 12:16:00.143214 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553856-7k69r"] Mar 11 12:16:00 crc kubenswrapper[4816]: I0311 12:16:00.144474 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553856-7k69r" Mar 11 12:16:00 crc kubenswrapper[4816]: I0311 12:16:00.152850 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553856-7k69r"] Mar 11 12:16:00 crc kubenswrapper[4816]: I0311 12:16:00.153774 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 12:16:00 crc kubenswrapper[4816]: I0311 12:16:00.154068 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 12:16:00 crc kubenswrapper[4816]: I0311 12:16:00.154417 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 12:16:00 crc kubenswrapper[4816]: I0311 12:16:00.165659 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5dch\" (UniqueName: \"kubernetes.io/projected/79fb6b17-9d8a-4f10-8a93-a3e65f470a27-kube-api-access-z5dch\") pod \"auto-csr-approver-29553856-7k69r\" (UID: \"79fb6b17-9d8a-4f10-8a93-a3e65f470a27\") " pod="openshift-infra/auto-csr-approver-29553856-7k69r" Mar 11 12:16:00 crc kubenswrapper[4816]: I0311 12:16:00.266435 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5dch\" (UniqueName: \"kubernetes.io/projected/79fb6b17-9d8a-4f10-8a93-a3e65f470a27-kube-api-access-z5dch\") pod \"auto-csr-approver-29553856-7k69r\" (UID: \"79fb6b17-9d8a-4f10-8a93-a3e65f470a27\") " pod="openshift-infra/auto-csr-approver-29553856-7k69r" Mar 11 12:16:00 crc kubenswrapper[4816]: I0311 12:16:00.285075 4816 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-z5dch\" (UniqueName: \"kubernetes.io/projected/79fb6b17-9d8a-4f10-8a93-a3e65f470a27-kube-api-access-z5dch\") pod \"auto-csr-approver-29553856-7k69r\" (UID: \"79fb6b17-9d8a-4f10-8a93-a3e65f470a27\") " pod="openshift-infra/auto-csr-approver-29553856-7k69r" Mar 11 12:16:00 crc kubenswrapper[4816]: I0311 12:16:00.462992 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553856-7k69r" Mar 11 12:16:00 crc kubenswrapper[4816]: I0311 12:16:00.933942 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553856-7k69r"] Mar 11 12:16:00 crc kubenswrapper[4816]: W0311 12:16:00.941667 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79fb6b17_9d8a_4f10_8a93_a3e65f470a27.slice/crio-7e3f4449a02d0a67344714e7731e0eb869633b7a8599cad40fa72cf10c713c02 WatchSource:0}: Error finding container 7e3f4449a02d0a67344714e7731e0eb869633b7a8599cad40fa72cf10c713c02: Status 404 returned error can't find the container with id 7e3f4449a02d0a67344714e7731e0eb869633b7a8599cad40fa72cf10c713c02 Mar 11 12:16:01 crc kubenswrapper[4816]: I0311 12:16:01.205180 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553856-7k69r" event={"ID":"79fb6b17-9d8a-4f10-8a93-a3e65f470a27","Type":"ContainerStarted","Data":"7e3f4449a02d0a67344714e7731e0eb869633b7a8599cad40fa72cf10c713c02"} Mar 11 12:16:02 crc kubenswrapper[4816]: I0311 12:16:02.214053 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" event={"ID":"5f4b0b09-5704-432a-9cd4-82a296f3c467","Type":"ContainerStarted","Data":"dbe7e8da8b68665fef8578c00790d8dbd642f082ba9bdeeefe39c1b2da581690"} Mar 11 12:16:02 crc kubenswrapper[4816]: I0311 12:16:02.214468 4816 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" Mar 11 12:16:02 crc kubenswrapper[4816]: I0311 12:16:02.252554 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" podStartSLOduration=40.252524851 podStartE2EDuration="40.252524851s" podCreationTimestamp="2026-03-11 12:15:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:16:02.251962794 +0000 UTC m=+1048.843226771" watchObservedRunningTime="2026-03-11 12:16:02.252524851 +0000 UTC m=+1048.843788858" Mar 11 12:16:02 crc kubenswrapper[4816]: I0311 12:16:02.464317 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-zczdq" Mar 11 12:16:02 crc kubenswrapper[4816]: I0311 12:16:02.539792 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-h2vmc" Mar 11 12:16:02 crc kubenswrapper[4816]: I0311 12:16:02.694449 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-56fsw" Mar 11 12:16:07 crc kubenswrapper[4816]: I0311 12:16:07.250367 4816 generic.go:334] "Generic (PLEG): container finished" podID="79fb6b17-9d8a-4f10-8a93-a3e65f470a27" containerID="5d6df61e0b509a66b3346da65b74fba3a74851e8e005a57c5d0fba5a7957a438" exitCode=0 Mar 11 12:16:07 crc kubenswrapper[4816]: I0311 12:16:07.250459 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553856-7k69r" event={"ID":"79fb6b17-9d8a-4f10-8a93-a3e65f470a27","Type":"ContainerDied","Data":"5d6df61e0b509a66b3346da65b74fba3a74851e8e005a57c5d0fba5a7957a438"} Mar 11 12:16:07 crc kubenswrapper[4816]: I0311 
12:16:07.253495 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l" event={"ID":"78a7aebd-70a2-4608-a669-aea496cb6186","Type":"ContainerStarted","Data":"d47df04afa1e477a610ffc4e9a31784b39656f4d4e8075507e9d6673b54137b9"} Mar 11 12:16:07 crc kubenswrapper[4816]: I0311 12:16:07.253593 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l" Mar 11 12:16:07 crc kubenswrapper[4816]: I0311 12:16:07.302677 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l" podStartSLOduration=33.910327758 podStartE2EDuration="45.302659949s" podCreationTimestamp="2026-03-11 12:15:22 +0000 UTC" firstStartedPulling="2026-03-11 12:15:55.002824565 +0000 UTC m=+1041.594088532" lastFinishedPulling="2026-03-11 12:16:06.395156756 +0000 UTC m=+1052.986420723" observedRunningTime="2026-03-11 12:16:07.299046034 +0000 UTC m=+1053.890310011" watchObservedRunningTime="2026-03-11 12:16:07.302659949 +0000 UTC m=+1053.893923926" Mar 11 12:16:08 crc kubenswrapper[4816]: I0311 12:16:08.608817 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553856-7k69r" Mar 11 12:16:08 crc kubenswrapper[4816]: I0311 12:16:08.710476 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5dch\" (UniqueName: \"kubernetes.io/projected/79fb6b17-9d8a-4f10-8a93-a3e65f470a27-kube-api-access-z5dch\") pod \"79fb6b17-9d8a-4f10-8a93-a3e65f470a27\" (UID: \"79fb6b17-9d8a-4f10-8a93-a3e65f470a27\") " Mar 11 12:16:08 crc kubenswrapper[4816]: I0311 12:16:08.716367 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79fb6b17-9d8a-4f10-8a93-a3e65f470a27-kube-api-access-z5dch" (OuterVolumeSpecName: "kube-api-access-z5dch") pod "79fb6b17-9d8a-4f10-8a93-a3e65f470a27" (UID: "79fb6b17-9d8a-4f10-8a93-a3e65f470a27"). InnerVolumeSpecName "kube-api-access-z5dch". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:16:08 crc kubenswrapper[4816]: I0311 12:16:08.811771 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5dch\" (UniqueName: \"kubernetes.io/projected/79fb6b17-9d8a-4f10-8a93-a3e65f470a27-kube-api-access-z5dch\") on node \"crc\" DevicePath \"\"" Mar 11 12:16:09 crc kubenswrapper[4816]: I0311 12:16:09.273499 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553856-7k69r" event={"ID":"79fb6b17-9d8a-4f10-8a93-a3e65f470a27","Type":"ContainerDied","Data":"7e3f4449a02d0a67344714e7731e0eb869633b7a8599cad40fa72cf10c713c02"} Mar 11 12:16:09 crc kubenswrapper[4816]: I0311 12:16:09.273558 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e3f4449a02d0a67344714e7731e0eb869633b7a8599cad40fa72cf10c713c02" Mar 11 12:16:09 crc kubenswrapper[4816]: I0311 12:16:09.273586 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553856-7k69r" Mar 11 12:16:09 crc kubenswrapper[4816]: I0311 12:16:09.738319 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553850-v7tlf"] Mar 11 12:16:09 crc kubenswrapper[4816]: I0311 12:16:09.744044 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553850-v7tlf"] Mar 11 12:16:10 crc kubenswrapper[4816]: I0311 12:16:10.140013 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ac620ec-72d5-4603-852f-8ba3f1ad0e9b" path="/var/lib/kubelet/pods/5ac620ec-72d5-4603-852f-8ba3f1ad0e9b/volumes" Mar 11 12:16:14 crc kubenswrapper[4816]: I0311 12:16:14.654922 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l" Mar 11 12:16:14 crc kubenswrapper[4816]: I0311 12:16:14.848904 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.218575 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-dbfn9"] Mar 11 12:16:31 crc kubenswrapper[4816]: E0311 12:16:31.219370 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79fb6b17-9d8a-4f10-8a93-a3e65f470a27" containerName="oc" Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.219386 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="79fb6b17-9d8a-4f10-8a93-a3e65f470a27" containerName="oc" Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.219533 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="79fb6b17-9d8a-4f10-8a93-a3e65f470a27" containerName="oc" Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.220221 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5448ff6dc7-dbfn9" Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.223658 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.223880 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.225050 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-sh8cd" Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.225053 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.242166 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-dbfn9"] Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.301463 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-64696987c5-bkgpq"] Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.305118 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64696987c5-bkgpq" Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.307217 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.313743 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-bkgpq"] Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.400835 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxrph\" (UniqueName: \"kubernetes.io/projected/0f513c34-8707-46dd-9b55-e953666df46c-kube-api-access-sxrph\") pod \"dnsmasq-dns-5448ff6dc7-dbfn9\" (UID: \"0f513c34-8707-46dd-9b55-e953666df46c\") " pod="openstack/dnsmasq-dns-5448ff6dc7-dbfn9" Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.400901 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f513c34-8707-46dd-9b55-e953666df46c-config\") pod \"dnsmasq-dns-5448ff6dc7-dbfn9\" (UID: \"0f513c34-8707-46dd-9b55-e953666df46c\") " pod="openstack/dnsmasq-dns-5448ff6dc7-dbfn9" Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.502198 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxrph\" (UniqueName: \"kubernetes.io/projected/0f513c34-8707-46dd-9b55-e953666df46c-kube-api-access-sxrph\") pod \"dnsmasq-dns-5448ff6dc7-dbfn9\" (UID: \"0f513c34-8707-46dd-9b55-e953666df46c\") " pod="openstack/dnsmasq-dns-5448ff6dc7-dbfn9" Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.502260 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e986d513-8aa0-4908-b200-d6212f56cd0f-config\") pod \"dnsmasq-dns-64696987c5-bkgpq\" (UID: \"e986d513-8aa0-4908-b200-d6212f56cd0f\") " pod="openstack/dnsmasq-dns-64696987c5-bkgpq" 
Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.502298 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f513c34-8707-46dd-9b55-e953666df46c-config\") pod \"dnsmasq-dns-5448ff6dc7-dbfn9\" (UID: \"0f513c34-8707-46dd-9b55-e953666df46c\") " pod="openstack/dnsmasq-dns-5448ff6dc7-dbfn9" Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.502321 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e986d513-8aa0-4908-b200-d6212f56cd0f-dns-svc\") pod \"dnsmasq-dns-64696987c5-bkgpq\" (UID: \"e986d513-8aa0-4908-b200-d6212f56cd0f\") " pod="openstack/dnsmasq-dns-64696987c5-bkgpq" Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.502343 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p56ws\" (UniqueName: \"kubernetes.io/projected/e986d513-8aa0-4908-b200-d6212f56cd0f-kube-api-access-p56ws\") pod \"dnsmasq-dns-64696987c5-bkgpq\" (UID: \"e986d513-8aa0-4908-b200-d6212f56cd0f\") " pod="openstack/dnsmasq-dns-64696987c5-bkgpq" Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.503171 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f513c34-8707-46dd-9b55-e953666df46c-config\") pod \"dnsmasq-dns-5448ff6dc7-dbfn9\" (UID: \"0f513c34-8707-46dd-9b55-e953666df46c\") " pod="openstack/dnsmasq-dns-5448ff6dc7-dbfn9" Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.529149 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxrph\" (UniqueName: \"kubernetes.io/projected/0f513c34-8707-46dd-9b55-e953666df46c-kube-api-access-sxrph\") pod \"dnsmasq-dns-5448ff6dc7-dbfn9\" (UID: \"0f513c34-8707-46dd-9b55-e953666df46c\") " pod="openstack/dnsmasq-dns-5448ff6dc7-dbfn9" Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 
12:16:31.537763 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5448ff6dc7-dbfn9" Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.603288 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e986d513-8aa0-4908-b200-d6212f56cd0f-config\") pod \"dnsmasq-dns-64696987c5-bkgpq\" (UID: \"e986d513-8aa0-4908-b200-d6212f56cd0f\") " pod="openstack/dnsmasq-dns-64696987c5-bkgpq" Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.603685 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e986d513-8aa0-4908-b200-d6212f56cd0f-dns-svc\") pod \"dnsmasq-dns-64696987c5-bkgpq\" (UID: \"e986d513-8aa0-4908-b200-d6212f56cd0f\") " pod="openstack/dnsmasq-dns-64696987c5-bkgpq" Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.603722 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p56ws\" (UniqueName: \"kubernetes.io/projected/e986d513-8aa0-4908-b200-d6212f56cd0f-kube-api-access-p56ws\") pod \"dnsmasq-dns-64696987c5-bkgpq\" (UID: \"e986d513-8aa0-4908-b200-d6212f56cd0f\") " pod="openstack/dnsmasq-dns-64696987c5-bkgpq" Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.604422 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e986d513-8aa0-4908-b200-d6212f56cd0f-dns-svc\") pod \"dnsmasq-dns-64696987c5-bkgpq\" (UID: \"e986d513-8aa0-4908-b200-d6212f56cd0f\") " pod="openstack/dnsmasq-dns-64696987c5-bkgpq" Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.604445 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e986d513-8aa0-4908-b200-d6212f56cd0f-config\") pod \"dnsmasq-dns-64696987c5-bkgpq\" (UID: \"e986d513-8aa0-4908-b200-d6212f56cd0f\") " 
pod="openstack/dnsmasq-dns-64696987c5-bkgpq" Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.655303 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p56ws\" (UniqueName: \"kubernetes.io/projected/e986d513-8aa0-4908-b200-d6212f56cd0f-kube-api-access-p56ws\") pod \"dnsmasq-dns-64696987c5-bkgpq\" (UID: \"e986d513-8aa0-4908-b200-d6212f56cd0f\") " pod="openstack/dnsmasq-dns-64696987c5-bkgpq" Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.831755 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-dbfn9"] Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.839938 4816 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.918347 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64696987c5-bkgpq" Mar 11 12:16:32 crc kubenswrapper[4816]: I0311 12:16:32.192367 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-bkgpq"] Mar 11 12:16:32 crc kubenswrapper[4816]: I0311 12:16:32.460640 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5448ff6dc7-dbfn9" event={"ID":"0f513c34-8707-46dd-9b55-e953666df46c","Type":"ContainerStarted","Data":"34349f2681e98adca418f57aa55e4bcf6f5a91d14ddfb746d02fa6d79fb45869"} Mar 11 12:16:32 crc kubenswrapper[4816]: I0311 12:16:32.461436 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64696987c5-bkgpq" event={"ID":"e986d513-8aa0-4908-b200-d6212f56cd0f","Type":"ContainerStarted","Data":"6c939d71a23fbd96f7ac8915514c1d72476d3ae7287584fbe750ce02fb1ef302"} Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.329199 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-dbfn9"] Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.359107 4816 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-658f55c9f5-wc7mw"] Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.360829 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-658f55c9f5-wc7mw" Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.370036 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-658f55c9f5-wc7mw"] Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.482256 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/80db2c12-e3f3-4f0e-8201-435f1a0b27c5-dns-svc\") pod \"dnsmasq-dns-658f55c9f5-wc7mw\" (UID: \"80db2c12-e3f3-4f0e-8201-435f1a0b27c5\") " pod="openstack/dnsmasq-dns-658f55c9f5-wc7mw" Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.482325 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80db2c12-e3f3-4f0e-8201-435f1a0b27c5-config\") pod \"dnsmasq-dns-658f55c9f5-wc7mw\" (UID: \"80db2c12-e3f3-4f0e-8201-435f1a0b27c5\") " pod="openstack/dnsmasq-dns-658f55c9f5-wc7mw" Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.482358 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfcps\" (UniqueName: \"kubernetes.io/projected/80db2c12-e3f3-4f0e-8201-435f1a0b27c5-kube-api-access-bfcps\") pod \"dnsmasq-dns-658f55c9f5-wc7mw\" (UID: \"80db2c12-e3f3-4f0e-8201-435f1a0b27c5\") " pod="openstack/dnsmasq-dns-658f55c9f5-wc7mw" Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.585760 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/80db2c12-e3f3-4f0e-8201-435f1a0b27c5-dns-svc\") pod \"dnsmasq-dns-658f55c9f5-wc7mw\" (UID: \"80db2c12-e3f3-4f0e-8201-435f1a0b27c5\") " 
pod="openstack/dnsmasq-dns-658f55c9f5-wc7mw" Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.585825 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80db2c12-e3f3-4f0e-8201-435f1a0b27c5-config\") pod \"dnsmasq-dns-658f55c9f5-wc7mw\" (UID: \"80db2c12-e3f3-4f0e-8201-435f1a0b27c5\") " pod="openstack/dnsmasq-dns-658f55c9f5-wc7mw" Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.585859 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfcps\" (UniqueName: \"kubernetes.io/projected/80db2c12-e3f3-4f0e-8201-435f1a0b27c5-kube-api-access-bfcps\") pod \"dnsmasq-dns-658f55c9f5-wc7mw\" (UID: \"80db2c12-e3f3-4f0e-8201-435f1a0b27c5\") " pod="openstack/dnsmasq-dns-658f55c9f5-wc7mw" Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.587109 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80db2c12-e3f3-4f0e-8201-435f1a0b27c5-config\") pod \"dnsmasq-dns-658f55c9f5-wc7mw\" (UID: \"80db2c12-e3f3-4f0e-8201-435f1a0b27c5\") " pod="openstack/dnsmasq-dns-658f55c9f5-wc7mw" Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.587211 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/80db2c12-e3f3-4f0e-8201-435f1a0b27c5-dns-svc\") pod \"dnsmasq-dns-658f55c9f5-wc7mw\" (UID: \"80db2c12-e3f3-4f0e-8201-435f1a0b27c5\") " pod="openstack/dnsmasq-dns-658f55c9f5-wc7mw" Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.629633 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfcps\" (UniqueName: \"kubernetes.io/projected/80db2c12-e3f3-4f0e-8201-435f1a0b27c5-kube-api-access-bfcps\") pod \"dnsmasq-dns-658f55c9f5-wc7mw\" (UID: \"80db2c12-e3f3-4f0e-8201-435f1a0b27c5\") " pod="openstack/dnsmasq-dns-658f55c9f5-wc7mw" Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 
12:16:34.679746 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-658f55c9f5-wc7mw" Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.777613 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-bkgpq"] Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.809590 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-ngbb2"] Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.811244 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54b5dffb47-ngbb2" Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.824400 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-ngbb2"] Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.828843 4816 scope.go:117] "RemoveContainer" containerID="c3ad155fc5f3f7204d5fb77b61c79c6603bc6f42436d74dbc3171b2dbf21bbd2" Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.892823 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5-config\") pod \"dnsmasq-dns-54b5dffb47-ngbb2\" (UID: \"1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5\") " pod="openstack/dnsmasq-dns-54b5dffb47-ngbb2" Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.892902 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw7f2\" (UniqueName: \"kubernetes.io/projected/1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5-kube-api-access-hw7f2\") pod \"dnsmasq-dns-54b5dffb47-ngbb2\" (UID: \"1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5\") " pod="openstack/dnsmasq-dns-54b5dffb47-ngbb2" Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.892951 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5-dns-svc\") pod \"dnsmasq-dns-54b5dffb47-ngbb2\" (UID: \"1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5\") " pod="openstack/dnsmasq-dns-54b5dffb47-ngbb2" Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.994331 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5-config\") pod \"dnsmasq-dns-54b5dffb47-ngbb2\" (UID: \"1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5\") " pod="openstack/dnsmasq-dns-54b5dffb47-ngbb2" Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.994389 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hw7f2\" (UniqueName: \"kubernetes.io/projected/1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5-kube-api-access-hw7f2\") pod \"dnsmasq-dns-54b5dffb47-ngbb2\" (UID: \"1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5\") " pod="openstack/dnsmasq-dns-54b5dffb47-ngbb2" Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.994427 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5-dns-svc\") pod \"dnsmasq-dns-54b5dffb47-ngbb2\" (UID: \"1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5\") " pod="openstack/dnsmasq-dns-54b5dffb47-ngbb2" Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.995378 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5-config\") pod \"dnsmasq-dns-54b5dffb47-ngbb2\" (UID: \"1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5\") " pod="openstack/dnsmasq-dns-54b5dffb47-ngbb2" Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.995392 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5-dns-svc\") pod \"dnsmasq-dns-54b5dffb47-ngbb2\" (UID: 
\"1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5\") " pod="openstack/dnsmasq-dns-54b5dffb47-ngbb2" Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.014808 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw7f2\" (UniqueName: \"kubernetes.io/projected/1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5-kube-api-access-hw7f2\") pod \"dnsmasq-dns-54b5dffb47-ngbb2\" (UID: \"1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5\") " pod="openstack/dnsmasq-dns-54b5dffb47-ngbb2" Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.137968 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54b5dffb47-ngbb2" Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.365788 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-658f55c9f5-wc7mw"] Mar 11 12:16:35 crc kubenswrapper[4816]: W0311 12:16:35.367239 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80db2c12_e3f3_4f0e_8201_435f1a0b27c5.slice/crio-d6fb7e37a836f700eb5acfa41a76a70f710c794efd150ca6ee3fb1323c24aa37 WatchSource:0}: Error finding container d6fb7e37a836f700eb5acfa41a76a70f710c794efd150ca6ee3fb1323c24aa37: Status 404 returned error can't find the container with id d6fb7e37a836f700eb5acfa41a76a70f710c794efd150ca6ee3fb1323c24aa37 Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.496292 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.497672 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.504945 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.506870 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.507047 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.507261 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.507433 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.507594 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-78f5m" Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.507739 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.526171 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.529572 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658f55c9f5-wc7mw" event={"ID":"80db2c12-e3f3-4f0e-8201-435f1a0b27c5","Type":"ContainerStarted","Data":"d6fb7e37a836f700eb5acfa41a76a70f710c794efd150ca6ee3fb1323c24aa37"} Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.618080 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/3779c0f5-9084-4c07-83d9-fe2017559f7b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.618155 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3779c0f5-9084-4c07-83d9-fe2017559f7b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.618190 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.618210 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.618237 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3779c0f5-9084-4c07-83d9-fe2017559f7b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.623451 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/3779c0f5-9084-4c07-83d9-fe2017559f7b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.623598 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3779c0f5-9084-4c07-83d9-fe2017559f7b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.623667 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvr95\" (UniqueName: \"kubernetes.io/projected/3779c0f5-9084-4c07-83d9-fe2017559f7b-kube-api-access-mvr95\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.623714 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.623749 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.623789 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/3779c0f5-9084-4c07-83d9-fe2017559f7b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.687733 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-ngbb2"] Mar 11 12:16:35 crc kubenswrapper[4816]: W0311 12:16:35.712313 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fe27f2c_fcd4_42f9_8d14_9ad29dbf86b5.slice/crio-c77801a330291e43dbd716f0eaa0018246a0046f3623f314fd24c49712949d82 WatchSource:0}: Error finding container c77801a330291e43dbd716f0eaa0018246a0046f3623f314fd24c49712949d82: Status 404 returned error can't find the container with id c77801a330291e43dbd716f0eaa0018246a0046f3623f314fd24c49712949d82 Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.725621 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3779c0f5-9084-4c07-83d9-fe2017559f7b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.725681 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3779c0f5-9084-4c07-83d9-fe2017559f7b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.725715 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.725742 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.725782 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3779c0f5-9084-4c07-83d9-fe2017559f7b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.725851 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3779c0f5-9084-4c07-83d9-fe2017559f7b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.725901 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3779c0f5-9084-4c07-83d9-fe2017559f7b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.726142 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvr95\" (UniqueName: \"kubernetes.io/projected/3779c0f5-9084-4c07-83d9-fe2017559f7b-kube-api-access-mvr95\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 
12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.726168 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.726187 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.726207 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3779c0f5-9084-4c07-83d9-fe2017559f7b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.728012 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3779c0f5-9084-4c07-83d9-fe2017559f7b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.728021 4816 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.728356 4816 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.729992 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3779c0f5-9084-4c07-83d9-fe2017559f7b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.732233 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.734347 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3779c0f5-9084-4c07-83d9-fe2017559f7b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.734364 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3779c0f5-9084-4c07-83d9-fe2017559f7b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.735212 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/3779c0f5-9084-4c07-83d9-fe2017559f7b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.734367 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3779c0f5-9084-4c07-83d9-fe2017559f7b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.739899 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.743903 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvr95\" (UniqueName: \"kubernetes.io/projected/3779c0f5-9084-4c07-83d9-fe2017559f7b-kube-api-access-mvr95\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.751248 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.823301 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.922392 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.924381 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.943905 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.974394 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.974410 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.974412 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.974715 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.974956 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.974972 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-hgw2n" Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.975409 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.031293 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/26aea2df-f497-478d-b953-060189ef2569-rabbitmq-tls\") pod 
\"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.031392 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv8dl\" (UniqueName: \"kubernetes.io/projected/26aea2df-f497-478d-b953-060189ef2569-kube-api-access-dv8dl\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.031426 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/26aea2df-f497-478d-b953-060189ef2569-pod-info\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.031452 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.031489 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/26aea2df-f497-478d-b953-060189ef2569-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.031569 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/26aea2df-f497-478d-b953-060189ef2569-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " 
pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.031712 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.031839 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-server-conf\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.031982 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/26aea2df-f497-478d-b953-060189ef2569-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.032037 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-config-data\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.032148 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/26aea2df-f497-478d-b953-060189ef2569-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc 
kubenswrapper[4816]: I0311 12:16:36.134611 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/26aea2df-f497-478d-b953-060189ef2569-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.134793 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.136651 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.136706 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-server-conf\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.136793 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/26aea2df-f497-478d-b953-060189ef2569-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.136830 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-config-data\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.136878 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/26aea2df-f497-478d-b953-060189ef2569-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.136911 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/26aea2df-f497-478d-b953-060189ef2569-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.136964 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dv8dl\" (UniqueName: \"kubernetes.io/projected/26aea2df-f497-478d-b953-060189ef2569-kube-api-access-dv8dl\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.137009 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/26aea2df-f497-478d-b953-060189ef2569-pod-info\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.137038 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " 
pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.137068 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/26aea2df-f497-478d-b953-060189ef2569-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.137715 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-server-conf\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.138388 4816 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.138800 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-config-data\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.139574 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/26aea2df-f497-478d-b953-060189ef2569-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.146557 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/26aea2df-f497-478d-b953-060189ef2569-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.147850 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/26aea2df-f497-478d-b953-060189ef2569-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.148329 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/26aea2df-f497-478d-b953-060189ef2569-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.148385 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/26aea2df-f497-478d-b953-060189ef2569-pod-info\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.158969 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.159771 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/26aea2df-f497-478d-b953-060189ef2569-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.163677 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dv8dl\" 
(UniqueName: \"kubernetes.io/projected/26aea2df-f497-478d-b953-060189ef2569-kube-api-access-dv8dl\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.165348 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.305819 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.544822 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3779c0f5-9084-4c07-83d9-fe2017559f7b","Type":"ContainerStarted","Data":"c73f7e4d7f0f4588b80903c0c3810420cc3aeed26ba2c6224b092ad58bda611c"} Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.546175 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b5dffb47-ngbb2" event={"ID":"1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5","Type":"ContainerStarted","Data":"c77801a330291e43dbd716f0eaa0018246a0046f3623f314fd24c49712949d82"} Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.576737 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.578628 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.587159 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.588112 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-bvh4z" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.588420 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.588932 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.590669 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.592934 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.647954 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/da177cde-6332-4562-809a-d4bee453cebf-kolla-config\") pod \"openstack-galera-0\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " pod="openstack/openstack-galera-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.648039 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/da177cde-6332-4562-809a-d4bee453cebf-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " pod="openstack/openstack-galera-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.648083 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-default\" (UniqueName: \"kubernetes.io/configmap/da177cde-6332-4562-809a-d4bee453cebf-config-data-default\") pod \"openstack-galera-0\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " pod="openstack/openstack-galera-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.648125 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da177cde-6332-4562-809a-d4bee453cebf-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " pod="openstack/openstack-galera-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.648159 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da177cde-6332-4562-809a-d4bee453cebf-operator-scripts\") pod \"openstack-galera-0\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " pod="openstack/openstack-galera-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.648180 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/da177cde-6332-4562-809a-d4bee453cebf-config-data-generated\") pod \"openstack-galera-0\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " pod="openstack/openstack-galera-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.648205 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " pod="openstack/openstack-galera-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.648247 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txrmx\" (UniqueName: 
\"kubernetes.io/projected/da177cde-6332-4562-809a-d4bee453cebf-kube-api-access-txrmx\") pod \"openstack-galera-0\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " pod="openstack/openstack-galera-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.750258 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/da177cde-6332-4562-809a-d4bee453cebf-kolla-config\") pod \"openstack-galera-0\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " pod="openstack/openstack-galera-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.750336 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/da177cde-6332-4562-809a-d4bee453cebf-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " pod="openstack/openstack-galera-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.750404 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/da177cde-6332-4562-809a-d4bee453cebf-config-data-default\") pod \"openstack-galera-0\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " pod="openstack/openstack-galera-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.750469 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da177cde-6332-4562-809a-d4bee453cebf-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " pod="openstack/openstack-galera-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.750508 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da177cde-6332-4562-809a-d4bee453cebf-operator-scripts\") pod \"openstack-galera-0\" (UID: 
\"da177cde-6332-4562-809a-d4bee453cebf\") " pod="openstack/openstack-galera-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.750534 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/da177cde-6332-4562-809a-d4bee453cebf-config-data-generated\") pod \"openstack-galera-0\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " pod="openstack/openstack-galera-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.750563 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " pod="openstack/openstack-galera-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.750614 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txrmx\" (UniqueName: \"kubernetes.io/projected/da177cde-6332-4562-809a-d4bee453cebf-kube-api-access-txrmx\") pod \"openstack-galera-0\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " pod="openstack/openstack-galera-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.751679 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/da177cde-6332-4562-809a-d4bee453cebf-kolla-config\") pod \"openstack-galera-0\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " pod="openstack/openstack-galera-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.755245 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/da177cde-6332-4562-809a-d4bee453cebf-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " pod="openstack/openstack-galera-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.756557 4816 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/da177cde-6332-4562-809a-d4bee453cebf-config-data-default\") pod \"openstack-galera-0\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " pod="openstack/openstack-galera-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.756896 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/da177cde-6332-4562-809a-d4bee453cebf-config-data-generated\") pod \"openstack-galera-0\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " pod="openstack/openstack-galera-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.756973 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da177cde-6332-4562-809a-d4bee453cebf-operator-scripts\") pod \"openstack-galera-0\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " pod="openstack/openstack-galera-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.757111 4816 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/openstack-galera-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.759044 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da177cde-6332-4562-809a-d4bee453cebf-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " pod="openstack/openstack-galera-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.805974 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txrmx\" (UniqueName: 
\"kubernetes.io/projected/da177cde-6332-4562-809a-d4bee453cebf-kube-api-access-txrmx\") pod \"openstack-galera-0\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " pod="openstack/openstack-galera-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.874007 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " pod="openstack/openstack-galera-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.913192 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.112369 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.113993 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.116150 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-n5gxr" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.120277 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.120986 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.121012 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.175823 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.186854 4816 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9a22173f-147b-46ac-bb01-596fe9f12b10-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " pod="openstack/openstack-cell1-galera-0" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.186909 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " pod="openstack/openstack-cell1-galera-0" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.186951 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a22173f-147b-46ac-bb01-596fe9f12b10-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " pod="openstack/openstack-cell1-galera-0" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.186973 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9a22173f-147b-46ac-bb01-596fe9f12b10-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " pod="openstack/openstack-cell1-galera-0" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.187001 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnzpb\" (UniqueName: \"kubernetes.io/projected/9a22173f-147b-46ac-bb01-596fe9f12b10-kube-api-access-cnzpb\") pod \"openstack-cell1-galera-0\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " pod="openstack/openstack-cell1-galera-0" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 
12:16:38.187028 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a22173f-147b-46ac-bb01-596fe9f12b10-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " pod="openstack/openstack-cell1-galera-0" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.187067 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a22173f-147b-46ac-bb01-596fe9f12b10-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " pod="openstack/openstack-cell1-galera-0" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.187113 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9a22173f-147b-46ac-bb01-596fe9f12b10-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " pod="openstack/openstack-cell1-galera-0" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.285321 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.286380 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.293959 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.293980 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-v9fqr" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.294225 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.296697 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a22173f-147b-46ac-bb01-596fe9f12b10-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " pod="openstack/openstack-cell1-galera-0" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.296825 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9a22173f-147b-46ac-bb01-596fe9f12b10-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " pod="openstack/openstack-cell1-galera-0" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.296903 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9a22173f-147b-46ac-bb01-596fe9f12b10-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " pod="openstack/openstack-cell1-galera-0" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.296943 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: 
\"9a22173f-147b-46ac-bb01-596fe9f12b10\") " pod="openstack/openstack-cell1-galera-0" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.296996 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a22173f-147b-46ac-bb01-596fe9f12b10-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " pod="openstack/openstack-cell1-galera-0" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.297022 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9a22173f-147b-46ac-bb01-596fe9f12b10-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " pod="openstack/openstack-cell1-galera-0" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.297051 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnzpb\" (UniqueName: \"kubernetes.io/projected/9a22173f-147b-46ac-bb01-596fe9f12b10-kube-api-access-cnzpb\") pod \"openstack-cell1-galera-0\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " pod="openstack/openstack-cell1-galera-0" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.297070 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a22173f-147b-46ac-bb01-596fe9f12b10-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " pod="openstack/openstack-cell1-galera-0" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.298991 4816 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") device mount path 
\"/mnt/openstack/pv10\"" pod="openstack/openstack-cell1-galera-0" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.301243 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9a22173f-147b-46ac-bb01-596fe9f12b10-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " pod="openstack/openstack-cell1-galera-0" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.301537 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9a22173f-147b-46ac-bb01-596fe9f12b10-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " pod="openstack/openstack-cell1-galera-0" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.302164 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9a22173f-147b-46ac-bb01-596fe9f12b10-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " pod="openstack/openstack-cell1-galera-0" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.311936 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a22173f-147b-46ac-bb01-596fe9f12b10-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " pod="openstack/openstack-cell1-galera-0" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.319228 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a22173f-147b-46ac-bb01-596fe9f12b10-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " pod="openstack/openstack-cell1-galera-0" Mar 11 12:16:38 crc 
kubenswrapper[4816]: I0311 12:16:38.326652 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a22173f-147b-46ac-bb01-596fe9f12b10-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " pod="openstack/openstack-cell1-galera-0" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.330723 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.338487 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnzpb\" (UniqueName: \"kubernetes.io/projected/9a22173f-147b-46ac-bb01-596fe9f12b10-kube-api-access-cnzpb\") pod \"openstack-cell1-galera-0\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " pod="openstack/openstack-cell1-galera-0" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.357911 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " pod="openstack/openstack-cell1-galera-0" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.398533 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5030028c-f574-4334-a837-2430761524b4-kolla-config\") pod \"memcached-0\" (UID: \"5030028c-f574-4334-a837-2430761524b4\") " pod="openstack/memcached-0" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.398671 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5030028c-f574-4334-a837-2430761524b4-combined-ca-bundle\") pod \"memcached-0\" (UID: \"5030028c-f574-4334-a837-2430761524b4\") " pod="openstack/memcached-0" Mar 11 
12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.398786 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5030028c-f574-4334-a837-2430761524b4-memcached-tls-certs\") pod \"memcached-0\" (UID: \"5030028c-f574-4334-a837-2430761524b4\") " pod="openstack/memcached-0" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.398858 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7gcd\" (UniqueName: \"kubernetes.io/projected/5030028c-f574-4334-a837-2430761524b4-kube-api-access-d7gcd\") pod \"memcached-0\" (UID: \"5030028c-f574-4334-a837-2430761524b4\") " pod="openstack/memcached-0" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.398882 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5030028c-f574-4334-a837-2430761524b4-config-data\") pod \"memcached-0\" (UID: \"5030028c-f574-4334-a837-2430761524b4\") " pod="openstack/memcached-0" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.455801 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.500680 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5030028c-f574-4334-a837-2430761524b4-memcached-tls-certs\") pod \"memcached-0\" (UID: \"5030028c-f574-4334-a837-2430761524b4\") " pod="openstack/memcached-0" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.500761 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7gcd\" (UniqueName: \"kubernetes.io/projected/5030028c-f574-4334-a837-2430761524b4-kube-api-access-d7gcd\") pod \"memcached-0\" (UID: \"5030028c-f574-4334-a837-2430761524b4\") " pod="openstack/memcached-0" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.500792 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5030028c-f574-4334-a837-2430761524b4-config-data\") pod \"memcached-0\" (UID: \"5030028c-f574-4334-a837-2430761524b4\") " pod="openstack/memcached-0" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.500885 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5030028c-f574-4334-a837-2430761524b4-kolla-config\") pod \"memcached-0\" (UID: \"5030028c-f574-4334-a837-2430761524b4\") " pod="openstack/memcached-0" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.500934 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5030028c-f574-4334-a837-2430761524b4-combined-ca-bundle\") pod \"memcached-0\" (UID: \"5030028c-f574-4334-a837-2430761524b4\") " pod="openstack/memcached-0" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.505472 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5030028c-f574-4334-a837-2430761524b4-kolla-config\") pod \"memcached-0\" (UID: \"5030028c-f574-4334-a837-2430761524b4\") " pod="openstack/memcached-0" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.505769 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5030028c-f574-4334-a837-2430761524b4-config-data\") pod \"memcached-0\" (UID: \"5030028c-f574-4334-a837-2430761524b4\") " pod="openstack/memcached-0" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.518845 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5030028c-f574-4334-a837-2430761524b4-combined-ca-bundle\") pod \"memcached-0\" (UID: \"5030028c-f574-4334-a837-2430761524b4\") " pod="openstack/memcached-0" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.519007 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5030028c-f574-4334-a837-2430761524b4-memcached-tls-certs\") pod \"memcached-0\" (UID: \"5030028c-f574-4334-a837-2430761524b4\") " pod="openstack/memcached-0" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.521664 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7gcd\" (UniqueName: \"kubernetes.io/projected/5030028c-f574-4334-a837-2430761524b4-kube-api-access-d7gcd\") pod \"memcached-0\" (UID: \"5030028c-f574-4334-a837-2430761524b4\") " pod="openstack/memcached-0" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.691417 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 11 12:16:40 crc kubenswrapper[4816]: I0311 12:16:40.365546 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 11 12:16:40 crc kubenswrapper[4816]: I0311 12:16:40.367030 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 11 12:16:40 crc kubenswrapper[4816]: I0311 12:16:40.371269 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-h868s" Mar 11 12:16:40 crc kubenswrapper[4816]: I0311 12:16:40.374565 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 11 12:16:40 crc kubenswrapper[4816]: I0311 12:16:40.436059 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbflg\" (UniqueName: \"kubernetes.io/projected/8e9e4e8b-b60c-4c37-974a-8bdc1b243135-kube-api-access-cbflg\") pod \"kube-state-metrics-0\" (UID: \"8e9e4e8b-b60c-4c37-974a-8bdc1b243135\") " pod="openstack/kube-state-metrics-0" Mar 11 12:16:40 crc kubenswrapper[4816]: I0311 12:16:40.538567 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbflg\" (UniqueName: \"kubernetes.io/projected/8e9e4e8b-b60c-4c37-974a-8bdc1b243135-kube-api-access-cbflg\") pod \"kube-state-metrics-0\" (UID: \"8e9e4e8b-b60c-4c37-974a-8bdc1b243135\") " pod="openstack/kube-state-metrics-0" Mar 11 12:16:40 crc kubenswrapper[4816]: I0311 12:16:40.561214 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbflg\" (UniqueName: \"kubernetes.io/projected/8e9e4e8b-b60c-4c37-974a-8bdc1b243135-kube-api-access-cbflg\") pod \"kube-state-metrics-0\" (UID: \"8e9e4e8b-b60c-4c37-974a-8bdc1b243135\") " pod="openstack/kube-state-metrics-0" Mar 11 12:16:40 crc kubenswrapper[4816]: I0311 12:16:40.717527 4816 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 11 12:16:43 crc kubenswrapper[4816]: I0311 12:16:43.880582 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-84rn8"] Mar 11 12:16:43 crc kubenswrapper[4816]: I0311 12:16:43.882098 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-84rn8" Mar 11 12:16:43 crc kubenswrapper[4816]: I0311 12:16:43.885135 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 11 12:16:43 crc kubenswrapper[4816]: I0311 12:16:43.885224 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-7f7bt" Mar 11 12:16:43 crc kubenswrapper[4816]: I0311 12:16:43.885720 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 11 12:16:43 crc kubenswrapper[4816]: I0311 12:16:43.896470 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-84rn8"] Mar 11 12:16:43 crc kubenswrapper[4816]: I0311 12:16:43.943495 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-tnhfq"] Mar 11 12:16:43 crc kubenswrapper[4816]: I0311 12:16:43.945009 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-tnhfq" Mar 11 12:16:43 crc kubenswrapper[4816]: I0311 12:16:43.956934 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-tnhfq"] Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.025301 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-var-run\") pod \"ovn-controller-ovs-tnhfq\" (UID: \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\") " pod="openstack/ovn-controller-ovs-tnhfq" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.025341 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-var-lib\") pod \"ovn-controller-ovs-tnhfq\" (UID: \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\") " pod="openstack/ovn-controller-ovs-tnhfq" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.025362 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7685\" (UniqueName: \"kubernetes.io/projected/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-kube-api-access-z7685\") pod \"ovn-controller-ovs-tnhfq\" (UID: \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\") " pod="openstack/ovn-controller-ovs-tnhfq" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.025398 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2de58390-335b-40cc-8461-d931d3b22e41-scripts\") pod \"ovn-controller-84rn8\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " pod="openstack/ovn-controller-84rn8" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.025424 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2de58390-335b-40cc-8461-d931d3b22e41-combined-ca-bundle\") pod \"ovn-controller-84rn8\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " pod="openstack/ovn-controller-84rn8" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.025554 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2de58390-335b-40cc-8461-d931d3b22e41-var-run-ovn\") pod \"ovn-controller-84rn8\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " pod="openstack/ovn-controller-84rn8" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.025576 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2de58390-335b-40cc-8461-d931d3b22e41-ovn-controller-tls-certs\") pod \"ovn-controller-84rn8\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " pod="openstack/ovn-controller-84rn8" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.025606 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2de58390-335b-40cc-8461-d931d3b22e41-var-run\") pod \"ovn-controller-84rn8\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " pod="openstack/ovn-controller-84rn8" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.025637 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-etc-ovs\") pod \"ovn-controller-ovs-tnhfq\" (UID: \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\") " pod="openstack/ovn-controller-ovs-tnhfq" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.025655 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/2de58390-335b-40cc-8461-d931d3b22e41-var-log-ovn\") pod \"ovn-controller-84rn8\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " pod="openstack/ovn-controller-84rn8" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.025691 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-var-log\") pod \"ovn-controller-ovs-tnhfq\" (UID: \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\") " pod="openstack/ovn-controller-ovs-tnhfq" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.025721 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-scripts\") pod \"ovn-controller-ovs-tnhfq\" (UID: \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\") " pod="openstack/ovn-controller-ovs-tnhfq" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.025742 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpfnw\" (UniqueName: \"kubernetes.io/projected/2de58390-335b-40cc-8461-d931d3b22e41-kube-api-access-bpfnw\") pod \"ovn-controller-84rn8\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " pod="openstack/ovn-controller-84rn8" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.127157 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-var-lib\") pod \"ovn-controller-ovs-tnhfq\" (UID: \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\") " pod="openstack/ovn-controller-ovs-tnhfq" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.127209 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7685\" (UniqueName: 
\"kubernetes.io/projected/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-kube-api-access-z7685\") pod \"ovn-controller-ovs-tnhfq\" (UID: \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\") " pod="openstack/ovn-controller-ovs-tnhfq" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.127645 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2de58390-335b-40cc-8461-d931d3b22e41-scripts\") pod \"ovn-controller-84rn8\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " pod="openstack/ovn-controller-84rn8" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.127701 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-var-lib\") pod \"ovn-controller-ovs-tnhfq\" (UID: \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\") " pod="openstack/ovn-controller-ovs-tnhfq" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.130393 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2de58390-335b-40cc-8461-d931d3b22e41-scripts\") pod \"ovn-controller-84rn8\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " pod="openstack/ovn-controller-84rn8" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.134546 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2de58390-335b-40cc-8461-d931d3b22e41-combined-ca-bundle\") pod \"ovn-controller-84rn8\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " pod="openstack/ovn-controller-84rn8" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.134701 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2de58390-335b-40cc-8461-d931d3b22e41-var-run-ovn\") pod \"ovn-controller-84rn8\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " 
pod="openstack/ovn-controller-84rn8" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.134741 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2de58390-335b-40cc-8461-d931d3b22e41-ovn-controller-tls-certs\") pod \"ovn-controller-84rn8\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " pod="openstack/ovn-controller-84rn8" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.134804 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2de58390-335b-40cc-8461-d931d3b22e41-var-run\") pod \"ovn-controller-84rn8\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " pod="openstack/ovn-controller-84rn8" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.134859 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-etc-ovs\") pod \"ovn-controller-ovs-tnhfq\" (UID: \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\") " pod="openstack/ovn-controller-ovs-tnhfq" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.134879 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2de58390-335b-40cc-8461-d931d3b22e41-var-log-ovn\") pod \"ovn-controller-84rn8\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " pod="openstack/ovn-controller-84rn8" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.134906 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-var-log\") pod \"ovn-controller-ovs-tnhfq\" (UID: \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\") " pod="openstack/ovn-controller-ovs-tnhfq" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.134948 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-scripts\") pod \"ovn-controller-ovs-tnhfq\" (UID: \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\") " pod="openstack/ovn-controller-ovs-tnhfq" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.134971 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpfnw\" (UniqueName: \"kubernetes.io/projected/2de58390-335b-40cc-8461-d931d3b22e41-kube-api-access-bpfnw\") pod \"ovn-controller-84rn8\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " pod="openstack/ovn-controller-84rn8" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.135045 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-var-run\") pod \"ovn-controller-ovs-tnhfq\" (UID: \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\") " pod="openstack/ovn-controller-ovs-tnhfq" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.135156 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2de58390-335b-40cc-8461-d931d3b22e41-var-run-ovn\") pod \"ovn-controller-84rn8\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " pod="openstack/ovn-controller-84rn8" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.135240 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-var-run\") pod \"ovn-controller-ovs-tnhfq\" (UID: \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\") " pod="openstack/ovn-controller-ovs-tnhfq" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.135559 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2de58390-335b-40cc-8461-d931d3b22e41-var-run\") pod 
\"ovn-controller-84rn8\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " pod="openstack/ovn-controller-84rn8" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.141537 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2de58390-335b-40cc-8461-d931d3b22e41-combined-ca-bundle\") pod \"ovn-controller-84rn8\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " pod="openstack/ovn-controller-84rn8" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.145187 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-etc-ovs\") pod \"ovn-controller-ovs-tnhfq\" (UID: \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\") " pod="openstack/ovn-controller-ovs-tnhfq" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.147092 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-scripts\") pod \"ovn-controller-ovs-tnhfq\" (UID: \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\") " pod="openstack/ovn-controller-ovs-tnhfq" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.147423 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2de58390-335b-40cc-8461-d931d3b22e41-var-log-ovn\") pod \"ovn-controller-84rn8\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " pod="openstack/ovn-controller-84rn8" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.149454 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-var-log\") pod \"ovn-controller-ovs-tnhfq\" (UID: \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\") " pod="openstack/ovn-controller-ovs-tnhfq" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.149728 4816 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2de58390-335b-40cc-8461-d931d3b22e41-ovn-controller-tls-certs\") pod \"ovn-controller-84rn8\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " pod="openstack/ovn-controller-84rn8" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.164071 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7685\" (UniqueName: \"kubernetes.io/projected/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-kube-api-access-z7685\") pod \"ovn-controller-ovs-tnhfq\" (UID: \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\") " pod="openstack/ovn-controller-ovs-tnhfq" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.261918 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpfnw\" (UniqueName: \"kubernetes.io/projected/2de58390-335b-40cc-8461-d931d3b22e41-kube-api-access-bpfnw\") pod \"ovn-controller-84rn8\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " pod="openstack/ovn-controller-84rn8" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.321735 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-tnhfq" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.512785 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-84rn8" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.724431 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.728556 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.732009 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.733692 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.733834 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.733956 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-xckrg" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.739066 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.745196 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.874466 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4nnr\" (UniqueName: \"kubernetes.io/projected/e16e7d30-3235-44f2-81b4-c0c828071bbb-kube-api-access-r4nnr\") pod \"ovsdbserver-nb-0\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " pod="openstack/ovsdbserver-nb-0" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.874534 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e16e7d30-3235-44f2-81b4-c0c828071bbb-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " pod="openstack/ovsdbserver-nb-0" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.874604 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " pod="openstack/ovsdbserver-nb-0" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.874678 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e16e7d30-3235-44f2-81b4-c0c828071bbb-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " pod="openstack/ovsdbserver-nb-0" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.874736 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e16e7d30-3235-44f2-81b4-c0c828071bbb-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " pod="openstack/ovsdbserver-nb-0" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.874768 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e16e7d30-3235-44f2-81b4-c0c828071bbb-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " pod="openstack/ovsdbserver-nb-0" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.874796 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e16e7d30-3235-44f2-81b4-c0c828071bbb-config\") pod \"ovsdbserver-nb-0\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " pod="openstack/ovsdbserver-nb-0" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.874823 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e16e7d30-3235-44f2-81b4-c0c828071bbb-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " pod="openstack/ovsdbserver-nb-0" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.977399 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4nnr\" (UniqueName: \"kubernetes.io/projected/e16e7d30-3235-44f2-81b4-c0c828071bbb-kube-api-access-r4nnr\") pod \"ovsdbserver-nb-0\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " pod="openstack/ovsdbserver-nb-0" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.977466 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e16e7d30-3235-44f2-81b4-c0c828071bbb-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " pod="openstack/ovsdbserver-nb-0" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.977501 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " pod="openstack/ovsdbserver-nb-0" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.977543 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e16e7d30-3235-44f2-81b4-c0c828071bbb-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " pod="openstack/ovsdbserver-nb-0" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.977574 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e16e7d30-3235-44f2-81b4-c0c828071bbb-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " pod="openstack/ovsdbserver-nb-0" 
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.977598 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e16e7d30-3235-44f2-81b4-c0c828071bbb-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " pod="openstack/ovsdbserver-nb-0" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.977621 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e16e7d30-3235-44f2-81b4-c0c828071bbb-config\") pod \"ovsdbserver-nb-0\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " pod="openstack/ovsdbserver-nb-0" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.977642 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e16e7d30-3235-44f2-81b4-c0c828071bbb-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " pod="openstack/ovsdbserver-nb-0" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.978020 4816 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-nb-0" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.978730 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e16e7d30-3235-44f2-81b4-c0c828071bbb-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " pod="openstack/ovsdbserver-nb-0" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.979221 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/e16e7d30-3235-44f2-81b4-c0c828071bbb-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " pod="openstack/ovsdbserver-nb-0" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.979873 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e16e7d30-3235-44f2-81b4-c0c828071bbb-config\") pod \"ovsdbserver-nb-0\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " pod="openstack/ovsdbserver-nb-0" Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.987089 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e16e7d30-3235-44f2-81b4-c0c828071bbb-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " pod="openstack/ovsdbserver-nb-0" Mar 11 12:16:45 crc kubenswrapper[4816]: I0311 12:16:44.992857 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e16e7d30-3235-44f2-81b4-c0c828071bbb-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " pod="openstack/ovsdbserver-nb-0" Mar 11 12:16:45 crc kubenswrapper[4816]: I0311 12:16:44.993794 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e16e7d30-3235-44f2-81b4-c0c828071bbb-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " pod="openstack/ovsdbserver-nb-0" Mar 11 12:16:45 crc kubenswrapper[4816]: I0311 12:16:44.999122 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4nnr\" (UniqueName: \"kubernetes.io/projected/e16e7d30-3235-44f2-81b4-c0c828071bbb-kube-api-access-r4nnr\") pod \"ovsdbserver-nb-0\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " 
pod="openstack/ovsdbserver-nb-0" Mar 11 12:16:45 crc kubenswrapper[4816]: I0311 12:16:45.009488 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " pod="openstack/ovsdbserver-nb-0" Mar 11 12:16:45 crc kubenswrapper[4816]: I0311 12:16:45.056832 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.418824 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.421392 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.423539 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.424479 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.424761 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.432204 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-zs9k7" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.436965 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.536390 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-ovsdb-rundir\") pod 
\"ovsdbserver-sb-0\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " pod="openstack/ovsdbserver-sb-0" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.536463 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wmq6\" (UniqueName: \"kubernetes.io/projected/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-kube-api-access-6wmq6\") pod \"ovsdbserver-sb-0\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " pod="openstack/ovsdbserver-sb-0" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.536497 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " pod="openstack/ovsdbserver-sb-0" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.536538 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-config\") pod \"ovsdbserver-sb-0\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " pod="openstack/ovsdbserver-sb-0" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.536573 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " pod="openstack/ovsdbserver-sb-0" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.536613 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " pod="openstack/ovsdbserver-sb-0" 
Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.536638 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " pod="openstack/ovsdbserver-sb-0" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.537849 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " pod="openstack/ovsdbserver-sb-0" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.639920 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wmq6\" (UniqueName: \"kubernetes.io/projected/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-kube-api-access-6wmq6\") pod \"ovsdbserver-sb-0\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " pod="openstack/ovsdbserver-sb-0" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.639986 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " pod="openstack/ovsdbserver-sb-0" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.640033 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-config\") pod \"ovsdbserver-sb-0\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " pod="openstack/ovsdbserver-sb-0" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.640080 4816 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " pod="openstack/ovsdbserver-sb-0" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.640142 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " pod="openstack/ovsdbserver-sb-0" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.640178 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " pod="openstack/ovsdbserver-sb-0" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.640224 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " pod="openstack/ovsdbserver-sb-0" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.640291 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " pod="openstack/ovsdbserver-sb-0" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.642086 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: 
\"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " pod="openstack/ovsdbserver-sb-0" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.642402 4816 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/ovsdbserver-sb-0" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.642429 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-config\") pod \"ovsdbserver-sb-0\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " pod="openstack/ovsdbserver-sb-0" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.642475 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " pod="openstack/ovsdbserver-sb-0" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.649340 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " pod="openstack/ovsdbserver-sb-0" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.654492 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " pod="openstack/ovsdbserver-sb-0" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.672452 4816 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " pod="openstack/ovsdbserver-sb-0" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.676687 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wmq6\" (UniqueName: \"kubernetes.io/projected/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-kube-api-access-6wmq6\") pod \"ovsdbserver-sb-0\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " pod="openstack/ovsdbserver-sb-0" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.685440 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " pod="openstack/ovsdbserver-sb-0" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.795171 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 11 12:16:51 crc kubenswrapper[4816]: I0311 12:16:51.068948 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 11 12:16:55 crc kubenswrapper[4816]: E0311 12:16:55.670761 4816 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51" Mar 11 12:16:55 crc kubenswrapper[4816]: E0311 12:16:55.671580 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sxrph,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5448ff6dc7-dbfn9_openstack(0f513c34-8707-46dd-9b55-e953666df46c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 11 12:16:55 crc kubenswrapper[4816]: E0311 12:16:55.672898 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-5448ff6dc7-dbfn9" podUID="0f513c34-8707-46dd-9b55-e953666df46c" Mar 11 12:16:55 crc kubenswrapper[4816]: E0311 12:16:55.676521 4816 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51" Mar 11 12:16:55 crc kubenswrapper[4816]: E0311 12:16:55.676765 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p56ws,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:n
il,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-64696987c5-bkgpq_openstack(e986d513-8aa0-4908-b200-d6212f56cd0f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 11 12:16:55 crc kubenswrapper[4816]: E0311 12:16:55.678535 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-64696987c5-bkgpq" podUID="e986d513-8aa0-4908-b200-d6212f56cd0f" Mar 11 12:16:55 crc kubenswrapper[4816]: E0311 12:16:55.686465 4816 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51" Mar 11 12:16:55 crc kubenswrapper[4816]: E0311 12:16:55.686663 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground 
--log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bfcps,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-658f55c9f5-wc7mw_openstack(80db2c12-e3f3-4f0e-8201-435f1a0b27c5): ErrImagePull: rpc error: code = Canceled desc = copying 
config: context canceled" logger="UnhandledError" Mar 11 12:16:55 crc kubenswrapper[4816]: E0311 12:16:55.688156 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-658f55c9f5-wc7mw" podUID="80db2c12-e3f3-4f0e-8201-435f1a0b27c5" Mar 11 12:16:55 crc kubenswrapper[4816]: I0311 12:16:55.772113 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8e9e4e8b-b60c-4c37-974a-8bdc1b243135","Type":"ContainerStarted","Data":"e914685ae7eb058c653bc79edb98cb710a39f5ce6911740300b8ce8933b04af8"} Mar 11 12:16:56 crc kubenswrapper[4816]: E0311 12:16:56.124698 4816 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Mar 11 12:16:56 crc kubenswrapper[4816]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/80db2c12-e3f3-4f0e-8201-435f1a0b27c5/volume-subpaths/dns-svc/init/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 11 12:16:56 crc kubenswrapper[4816]: > podSandboxID="d6fb7e37a836f700eb5acfa41a76a70f710c794efd150ca6ee3fb1323c24aa37" Mar 11 12:16:56 crc kubenswrapper[4816]: E0311 12:16:56.125153 4816 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 11 12:16:56 crc kubenswrapper[4816]: init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bfcps,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-658f55c9f5-wc7mw_openstack(80db2c12-e3f3-4f0e-8201-435f1a0b27c5): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/80db2c12-e3f3-4f0e-8201-435f1a0b27c5/volume-subpaths/dns-svc/init/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 
11 12:16:56 crc kubenswrapper[4816]: > logger="UnhandledError" Mar 11 12:16:56 crc kubenswrapper[4816]: E0311 12:16:56.126319 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/80db2c12-e3f3-4f0e-8201-435f1a0b27c5/volume-subpaths/dns-svc/init/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-658f55c9f5-wc7mw" podUID="80db2c12-e3f3-4f0e-8201-435f1a0b27c5" Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.368954 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64696987c5-bkgpq" Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.379902 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5448ff6dc7-dbfn9" Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.530888 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-84rn8"] Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.537928 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxrph\" (UniqueName: \"kubernetes.io/projected/0f513c34-8707-46dd-9b55-e953666df46c-kube-api-access-sxrph\") pod \"0f513c34-8707-46dd-9b55-e953666df46c\" (UID: \"0f513c34-8707-46dd-9b55-e953666df46c\") " Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.538150 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e986d513-8aa0-4908-b200-d6212f56cd0f-dns-svc\") pod \"e986d513-8aa0-4908-b200-d6212f56cd0f\" (UID: \"e986d513-8aa0-4908-b200-d6212f56cd0f\") " Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.539426 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e986d513-8aa0-4908-b200-d6212f56cd0f-dns-svc" 
(OuterVolumeSpecName: "dns-svc") pod "e986d513-8aa0-4908-b200-d6212f56cd0f" (UID: "e986d513-8aa0-4908-b200-d6212f56cd0f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.539534 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e986d513-8aa0-4908-b200-d6212f56cd0f-config\") pod \"e986d513-8aa0-4908-b200-d6212f56cd0f\" (UID: \"e986d513-8aa0-4908-b200-d6212f56cd0f\") " Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.539565 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f513c34-8707-46dd-9b55-e953666df46c-config\") pod \"0f513c34-8707-46dd-9b55-e953666df46c\" (UID: \"0f513c34-8707-46dd-9b55-e953666df46c\") " Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.540725 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p56ws\" (UniqueName: \"kubernetes.io/projected/e986d513-8aa0-4908-b200-d6212f56cd0f-kube-api-access-p56ws\") pod \"e986d513-8aa0-4908-b200-d6212f56cd0f\" (UID: \"e986d513-8aa0-4908-b200-d6212f56cd0f\") " Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.540063 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e986d513-8aa0-4908-b200-d6212f56cd0f-config" (OuterVolumeSpecName: "config") pod "e986d513-8aa0-4908-b200-d6212f56cd0f" (UID: "e986d513-8aa0-4908-b200-d6212f56cd0f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.540520 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f513c34-8707-46dd-9b55-e953666df46c-config" (OuterVolumeSpecName: "config") pod "0f513c34-8707-46dd-9b55-e953666df46c" (UID: "0f513c34-8707-46dd-9b55-e953666df46c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.541536 4816 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e986d513-8aa0-4908-b200-d6212f56cd0f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.549743 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f513c34-8707-46dd-9b55-e953666df46c-kube-api-access-sxrph" (OuterVolumeSpecName: "kube-api-access-sxrph") pod "0f513c34-8707-46dd-9b55-e953666df46c" (UID: "0f513c34-8707-46dd-9b55-e953666df46c"). InnerVolumeSpecName "kube-api-access-sxrph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.552711 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e986d513-8aa0-4908-b200-d6212f56cd0f-kube-api-access-p56ws" (OuterVolumeSpecName: "kube-api-access-p56ws") pod "e986d513-8aa0-4908-b200-d6212f56cd0f" (UID: "e986d513-8aa0-4908-b200-d6212f56cd0f"). InnerVolumeSpecName "kube-api-access-p56ws". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.571001 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.579613 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.604914 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.644069 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p56ws\" (UniqueName: \"kubernetes.io/projected/e986d513-8aa0-4908-b200-d6212f56cd0f-kube-api-access-p56ws\") on node \"crc\" DevicePath \"\"" Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.644105 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxrph\" (UniqueName: \"kubernetes.io/projected/0f513c34-8707-46dd-9b55-e953666df46c-kube-api-access-sxrph\") on node \"crc\" DevicePath \"\"" Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.644119 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e986d513-8aa0-4908-b200-d6212f56cd0f-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.644132 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f513c34-8707-46dd-9b55-e953666df46c-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.669720 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.732386 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.790524 4816 generic.go:334] "Generic (PLEG): 
container finished" podID="1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5" containerID="c533534709dab6e1ab18ea2fc02a9a4aef8083ea39ecbf0e3914dc859849d48e" exitCode=0 Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.790629 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b5dffb47-ngbb2" event={"ID":"1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5","Type":"ContainerDied","Data":"c533534709dab6e1ab18ea2fc02a9a4aef8083ea39ecbf0e3914dc859849d48e"} Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.791864 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9a22173f-147b-46ac-bb01-596fe9f12b10","Type":"ContainerStarted","Data":"ef8afb38cbe161f1b81f860d56715a732c9c137776bc40df909c84b5acbd4154"} Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.793813 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5448ff6dc7-dbfn9" event={"ID":"0f513c34-8707-46dd-9b55-e953666df46c","Type":"ContainerDied","Data":"34349f2681e98adca418f57aa55e4bcf6f5a91d14ddfb746d02fa6d79fb45869"} Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.793978 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5448ff6dc7-dbfn9" Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.797630 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64696987c5-bkgpq" event={"ID":"e986d513-8aa0-4908-b200-d6212f56cd0f","Type":"ContainerDied","Data":"6c939d71a23fbd96f7ac8915514c1d72476d3ae7287584fbe750ce02fb1ef302"} Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.798000 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64696987c5-bkgpq" Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.807528 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.874995 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-bkgpq"] Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.892450 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-bkgpq"] Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.908963 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-dbfn9"] Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.915225 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-dbfn9"] Mar 11 12:16:57 crc kubenswrapper[4816]: W0311 12:16:57.063442 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5030028c_f574_4334_a837_2430761524b4.slice/crio-6419a001ec72fffb18fae89ec5268f12610ae0c656da26d1ec1980d99bf8c731 WatchSource:0}: Error finding container 6419a001ec72fffb18fae89ec5268f12610ae0c656da26d1ec1980d99bf8c731: Status 404 returned error can't find the container with id 6419a001ec72fffb18fae89ec5268f12610ae0c656da26d1ec1980d99bf8c731 Mar 11 12:16:57 crc kubenswrapper[4816]: W0311 12:16:57.072162 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26aea2df_f497_478d_b953_060189ef2569.slice/crio-bd5f7144adb25f2d3d74b32cee4ef0069fc612e5f70830fc738cf8898c918056 WatchSource:0}: Error finding container bd5f7144adb25f2d3d74b32cee4ef0069fc612e5f70830fc738cf8898c918056: Status 404 returned error can't find the container with id bd5f7144adb25f2d3d74b32cee4ef0069fc612e5f70830fc738cf8898c918056 Mar 11 12:16:57 crc kubenswrapper[4816]: 
I0311 12:16:57.701843 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-tnhfq"] Mar 11 12:16:57 crc kubenswrapper[4816]: I0311 12:16:57.808573 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-84rn8" event={"ID":"2de58390-335b-40cc-8461-d931d3b22e41","Type":"ContainerStarted","Data":"90ffa1dacc5321713c5d44a9d616add617a25ab1efffcadfb14af28f07cc7bbd"} Mar 11 12:16:57 crc kubenswrapper[4816]: I0311 12:16:57.810307 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"da177cde-6332-4562-809a-d4bee453cebf","Type":"ContainerStarted","Data":"c1304c6acbe0151fcfd1f27a9fb0f616c29bb18a4876bb3def66924a603536ea"} Mar 11 12:16:57 crc kubenswrapper[4816]: I0311 12:16:57.811778 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e16e7d30-3235-44f2-81b4-c0c828071bbb","Type":"ContainerStarted","Data":"33dcb516fa17b7c432ef1e2b1650ba4d2e9f946dd76257f934af302a386a7dbf"} Mar 11 12:16:57 crc kubenswrapper[4816]: I0311 12:16:57.812950 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"26aea2df-f497-478d-b953-060189ef2569","Type":"ContainerStarted","Data":"bd5f7144adb25f2d3d74b32cee4ef0069fc612e5f70830fc738cf8898c918056"} Mar 11 12:16:57 crc kubenswrapper[4816]: I0311 12:16:57.814157 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"5030028c-f574-4334-a837-2430761524b4","Type":"ContainerStarted","Data":"6419a001ec72fffb18fae89ec5268f12610ae0c656da26d1ec1980d99bf8c731"} Mar 11 12:16:57 crc kubenswrapper[4816]: I0311 12:16:57.815948 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3779c0f5-9084-4c07-83d9-fe2017559f7b","Type":"ContainerStarted","Data":"522cea9d64bd20f40ebb73c1f30df7c2a7a511a9ee7536ce5452bc061096e21e"} Mar 11 12:16:57 crc kubenswrapper[4816]: I0311 
12:16:57.817137 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"fe419fb1-1901-4fd4-9d9c-8884651e3ad9","Type":"ContainerStarted","Data":"856ecaff8a78617160b7f62ce0d1169e3c52ef425eb093d777cccb4f585957a7"} Mar 11 12:16:58 crc kubenswrapper[4816]: I0311 12:16:58.143663 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f513c34-8707-46dd-9b55-e953666df46c" path="/var/lib/kubelet/pods/0f513c34-8707-46dd-9b55-e953666df46c/volumes" Mar 11 12:16:58 crc kubenswrapper[4816]: I0311 12:16:58.144039 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e986d513-8aa0-4908-b200-d6212f56cd0f" path="/var/lib/kubelet/pods/e986d513-8aa0-4908-b200-d6212f56cd0f/volumes" Mar 11 12:16:58 crc kubenswrapper[4816]: I0311 12:16:58.828567 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b5dffb47-ngbb2" event={"ID":"1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5","Type":"ContainerStarted","Data":"f1aa8d74efb7ddd77f41c0a07405cadc0d7b5b49051c87e006b24963f08b09cd"} Mar 11 12:16:58 crc kubenswrapper[4816]: I0311 12:16:58.829106 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54b5dffb47-ngbb2" Mar 11 12:16:58 crc kubenswrapper[4816]: I0311 12:16:58.833833 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tnhfq" event={"ID":"edc01aa4-013d-4d10-9f22-e5f319e6c1a3","Type":"ContainerStarted","Data":"22c727583d6de2eec899c37134713c754f06d9d2f697ad226095e328238d230b"} Mar 11 12:16:58 crc kubenswrapper[4816]: I0311 12:16:58.836400 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8e9e4e8b-b60c-4c37-974a-8bdc1b243135","Type":"ContainerStarted","Data":"679abb8cec4559fafe708b16a4cb668342ecfb7db87736a8f89c0b2ddfbcfb1b"} Mar 11 12:16:58 crc kubenswrapper[4816]: I0311 12:16:58.836693 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/kube-state-metrics-0" Mar 11 12:16:58 crc kubenswrapper[4816]: I0311 12:16:58.850448 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54b5dffb47-ngbb2" podStartSLOduration=4.809305165 podStartE2EDuration="24.850429136s" podCreationTimestamp="2026-03-11 12:16:34 +0000 UTC" firstStartedPulling="2026-03-11 12:16:35.714738437 +0000 UTC m=+1082.306002404" lastFinishedPulling="2026-03-11 12:16:55.755862408 +0000 UTC m=+1102.347126375" observedRunningTime="2026-03-11 12:16:58.845455392 +0000 UTC m=+1105.436719359" watchObservedRunningTime="2026-03-11 12:16:58.850429136 +0000 UTC m=+1105.441693093" Mar 11 12:16:58 crc kubenswrapper[4816]: I0311 12:16:58.871674 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=16.483705633 podStartE2EDuration="18.871639333s" podCreationTimestamp="2026-03-11 12:16:40 +0000 UTC" firstStartedPulling="2026-03-11 12:16:55.674101673 +0000 UTC m=+1102.265365640" lastFinishedPulling="2026-03-11 12:16:58.062035373 +0000 UTC m=+1104.653299340" observedRunningTime="2026-03-11 12:16:58.86776016 +0000 UTC m=+1105.459024127" watchObservedRunningTime="2026-03-11 12:16:58.871639333 +0000 UTC m=+1105.462903320" Mar 11 12:16:59 crc kubenswrapper[4816]: I0311 12:16:59.846466 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"26aea2df-f497-478d-b953-060189ef2569","Type":"ContainerStarted","Data":"47287b2bd213321105c729d451b069f02c0e309af3b5c9c84b7b9c24acc1a5f3"} Mar 11 12:17:03 crc kubenswrapper[4816]: I0311 12:17:03.882192 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9a22173f-147b-46ac-bb01-596fe9f12b10","Type":"ContainerStarted","Data":"90224f5e31cd4408489a5dec30ffa77147f611b179c23e40a3d0104504542a1b"} Mar 11 12:17:03 crc kubenswrapper[4816]: I0311 12:17:03.886015 4816 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"da177cde-6332-4562-809a-d4bee453cebf","Type":"ContainerStarted","Data":"933bdb7df24ef397f527e8ac441de2b3a2e82c07c8ab31ea86b61c45f7139f03"} Mar 11 12:17:03 crc kubenswrapper[4816]: I0311 12:17:03.888990 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e16e7d30-3235-44f2-81b4-c0c828071bbb","Type":"ContainerStarted","Data":"e36d52352569b57940dd2cebcd565fb31e6c049d444d2da7c54f0fe9d882c7f6"} Mar 11 12:17:03 crc kubenswrapper[4816]: I0311 12:17:03.891583 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"5030028c-f574-4334-a837-2430761524b4","Type":"ContainerStarted","Data":"0b4c4c1c298f57878044bac49cc49a719acfc3a0f87a1803c19c539d85446637"} Mar 11 12:17:03 crc kubenswrapper[4816]: I0311 12:17:03.891741 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 11 12:17:03 crc kubenswrapper[4816]: I0311 12:17:03.895517 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"fe419fb1-1901-4fd4-9d9c-8884651e3ad9","Type":"ContainerStarted","Data":"ee8f2b910a2d52b32d76649fbccb57d3440b0a1d624504112ddbe71af6ca7889"} Mar 11 12:17:03 crc kubenswrapper[4816]: I0311 12:17:03.952833 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=19.521795739 podStartE2EDuration="25.952808831s" podCreationTimestamp="2026-03-11 12:16:38 +0000 UTC" firstStartedPulling="2026-03-11 12:16:57.069613513 +0000 UTC m=+1103.660877490" lastFinishedPulling="2026-03-11 12:17:03.500626615 +0000 UTC m=+1110.091890582" observedRunningTime="2026-03-11 12:17:03.950599567 +0000 UTC m=+1110.541863554" watchObservedRunningTime="2026-03-11 12:17:03.952808831 +0000 UTC m=+1110.544072798" Mar 11 12:17:04 crc kubenswrapper[4816]: I0311 12:17:04.922688 4816 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/ovn-controller-84rn8" event={"ID":"2de58390-335b-40cc-8461-d931d3b22e41","Type":"ContainerStarted","Data":"b67798b7f6eede8770ea6cbb3808f928e4bdbe9cdbf08abe0db324318159dd17"} Mar 11 12:17:04 crc kubenswrapper[4816]: I0311 12:17:04.924789 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-84rn8" Mar 11 12:17:04 crc kubenswrapper[4816]: I0311 12:17:04.929668 4816 generic.go:334] "Generic (PLEG): container finished" podID="edc01aa4-013d-4d10-9f22-e5f319e6c1a3" containerID="ade1b2e704074d979ee946ca1a74e500865d981040caad349410a4164bba988b" exitCode=0 Mar 11 12:17:04 crc kubenswrapper[4816]: I0311 12:17:04.930001 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tnhfq" event={"ID":"edc01aa4-013d-4d10-9f22-e5f319e6c1a3","Type":"ContainerDied","Data":"ade1b2e704074d979ee946ca1a74e500865d981040caad349410a4164bba988b"} Mar 11 12:17:04 crc kubenswrapper[4816]: I0311 12:17:04.963645 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-84rn8" podStartSLOduration=15.483000472 podStartE2EDuration="21.963624075s" podCreationTimestamp="2026-03-11 12:16:43 +0000 UTC" firstStartedPulling="2026-03-11 12:16:57.072503507 +0000 UTC m=+1103.663767484" lastFinishedPulling="2026-03-11 12:17:03.55312708 +0000 UTC m=+1110.144391087" observedRunningTime="2026-03-11 12:17:04.943571443 +0000 UTC m=+1111.534835410" watchObservedRunningTime="2026-03-11 12:17:04.963624075 +0000 UTC m=+1111.554888042" Mar 11 12:17:05 crc kubenswrapper[4816]: I0311 12:17:05.140429 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-54b5dffb47-ngbb2" Mar 11 12:17:05 crc kubenswrapper[4816]: I0311 12:17:05.213272 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-658f55c9f5-wc7mw"] Mar 11 12:17:05 crc kubenswrapper[4816]: I0311 12:17:05.646277 4816 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/dnsmasq-dns-658f55c9f5-wc7mw" Mar 11 12:17:05 crc kubenswrapper[4816]: I0311 12:17:05.750424 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80db2c12-e3f3-4f0e-8201-435f1a0b27c5-config\") pod \"80db2c12-e3f3-4f0e-8201-435f1a0b27c5\" (UID: \"80db2c12-e3f3-4f0e-8201-435f1a0b27c5\") " Mar 11 12:17:05 crc kubenswrapper[4816]: I0311 12:17:05.750597 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfcps\" (UniqueName: \"kubernetes.io/projected/80db2c12-e3f3-4f0e-8201-435f1a0b27c5-kube-api-access-bfcps\") pod \"80db2c12-e3f3-4f0e-8201-435f1a0b27c5\" (UID: \"80db2c12-e3f3-4f0e-8201-435f1a0b27c5\") " Mar 11 12:17:05 crc kubenswrapper[4816]: I0311 12:17:05.750662 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/80db2c12-e3f3-4f0e-8201-435f1a0b27c5-dns-svc\") pod \"80db2c12-e3f3-4f0e-8201-435f1a0b27c5\" (UID: \"80db2c12-e3f3-4f0e-8201-435f1a0b27c5\") " Mar 11 12:17:05 crc kubenswrapper[4816]: I0311 12:17:05.757906 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80db2c12-e3f3-4f0e-8201-435f1a0b27c5-kube-api-access-bfcps" (OuterVolumeSpecName: "kube-api-access-bfcps") pod "80db2c12-e3f3-4f0e-8201-435f1a0b27c5" (UID: "80db2c12-e3f3-4f0e-8201-435f1a0b27c5"). InnerVolumeSpecName "kube-api-access-bfcps". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:17:05 crc kubenswrapper[4816]: I0311 12:17:05.773973 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80db2c12-e3f3-4f0e-8201-435f1a0b27c5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "80db2c12-e3f3-4f0e-8201-435f1a0b27c5" (UID: "80db2c12-e3f3-4f0e-8201-435f1a0b27c5"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:17:05 crc kubenswrapper[4816]: I0311 12:17:05.777471 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80db2c12-e3f3-4f0e-8201-435f1a0b27c5-config" (OuterVolumeSpecName: "config") pod "80db2c12-e3f3-4f0e-8201-435f1a0b27c5" (UID: "80db2c12-e3f3-4f0e-8201-435f1a0b27c5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:17:05 crc kubenswrapper[4816]: I0311 12:17:05.852494 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfcps\" (UniqueName: \"kubernetes.io/projected/80db2c12-e3f3-4f0e-8201-435f1a0b27c5-kube-api-access-bfcps\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:05 crc kubenswrapper[4816]: I0311 12:17:05.852540 4816 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/80db2c12-e3f3-4f0e-8201-435f1a0b27c5-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:05 crc kubenswrapper[4816]: I0311 12:17:05.852550 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80db2c12-e3f3-4f0e-8201-435f1a0b27c5-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:05 crc kubenswrapper[4816]: I0311 12:17:05.950298 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tnhfq" event={"ID":"edc01aa4-013d-4d10-9f22-e5f319e6c1a3","Type":"ContainerStarted","Data":"9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005"} Mar 11 12:17:05 crc kubenswrapper[4816]: I0311 12:17:05.950352 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tnhfq" event={"ID":"edc01aa4-013d-4d10-9f22-e5f319e6c1a3","Type":"ContainerStarted","Data":"e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c"} Mar 11 12:17:05 crc kubenswrapper[4816]: I0311 12:17:05.951950 4816 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/ovn-controller-ovs-tnhfq" Mar 11 12:17:05 crc kubenswrapper[4816]: I0311 12:17:05.951993 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658f55c9f5-wc7mw" event={"ID":"80db2c12-e3f3-4f0e-8201-435f1a0b27c5","Type":"ContainerDied","Data":"d6fb7e37a836f700eb5acfa41a76a70f710c794efd150ca6ee3fb1323c24aa37"} Mar 11 12:17:05 crc kubenswrapper[4816]: I0311 12:17:05.952057 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-658f55c9f5-wc7mw" Mar 11 12:17:06 crc kubenswrapper[4816]: I0311 12:17:05.997403 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-tnhfq" podStartSLOduration=17.469383217 podStartE2EDuration="22.997381636s" podCreationTimestamp="2026-03-11 12:16:43 +0000 UTC" firstStartedPulling="2026-03-11 12:16:57.986475308 +0000 UTC m=+1104.577739315" lastFinishedPulling="2026-03-11 12:17:03.514473767 +0000 UTC m=+1110.105737734" observedRunningTime="2026-03-11 12:17:05.972014699 +0000 UTC m=+1112.563278686" watchObservedRunningTime="2026-03-11 12:17:05.997381636 +0000 UTC m=+1112.588645603" Mar 11 12:17:06 crc kubenswrapper[4816]: I0311 12:17:06.029192 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-658f55c9f5-wc7mw"] Mar 11 12:17:06 crc kubenswrapper[4816]: I0311 12:17:06.036153 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-658f55c9f5-wc7mw"] Mar 11 12:17:06 crc kubenswrapper[4816]: I0311 12:17:06.143146 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80db2c12-e3f3-4f0e-8201-435f1a0b27c5" path="/var/lib/kubelet/pods/80db2c12-e3f3-4f0e-8201-435f1a0b27c5/volumes" Mar 11 12:17:06 crc kubenswrapper[4816]: I0311 12:17:06.972624 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-tnhfq" Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 
12:17:07.471214 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-r8xbm"] Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.472172 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-r8xbm" Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.474574 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.503403 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-r8xbm"] Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.595147 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/91cdfd54-2ee7-490e-bf3f-563406e59cda-ovs-rundir\") pod \"ovn-controller-metrics-r8xbm\" (UID: \"91cdfd54-2ee7-490e-bf3f-563406e59cda\") " pod="openstack/ovn-controller-metrics-r8xbm" Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.596634 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/91cdfd54-2ee7-490e-bf3f-563406e59cda-ovn-rundir\") pod \"ovn-controller-metrics-r8xbm\" (UID: \"91cdfd54-2ee7-490e-bf3f-563406e59cda\") " pod="openstack/ovn-controller-metrics-r8xbm" Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.596775 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lq9x\" (UniqueName: \"kubernetes.io/projected/91cdfd54-2ee7-490e-bf3f-563406e59cda-kube-api-access-7lq9x\") pod \"ovn-controller-metrics-r8xbm\" (UID: \"91cdfd54-2ee7-490e-bf3f-563406e59cda\") " pod="openstack/ovn-controller-metrics-r8xbm" Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.596807 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91cdfd54-2ee7-490e-bf3f-563406e59cda-combined-ca-bundle\") pod \"ovn-controller-metrics-r8xbm\" (UID: \"91cdfd54-2ee7-490e-bf3f-563406e59cda\") " pod="openstack/ovn-controller-metrics-r8xbm" Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.598969 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/91cdfd54-2ee7-490e-bf3f-563406e59cda-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-r8xbm\" (UID: \"91cdfd54-2ee7-490e-bf3f-563406e59cda\") " pod="openstack/ovn-controller-metrics-r8xbm" Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.599054 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91cdfd54-2ee7-490e-bf3f-563406e59cda-config\") pod \"ovn-controller-metrics-r8xbm\" (UID: \"91cdfd54-2ee7-490e-bf3f-563406e59cda\") " pod="openstack/ovn-controller-metrics-r8xbm" Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.612882 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f9f59f7c5-fs99q"] Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.625950 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f9f59f7c5-fs99q" Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.632754 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.652561 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f9f59f7c5-fs99q"] Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.702055 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26c46\" (UniqueName: \"kubernetes.io/projected/bc9d174a-14aa-42e4-bfc0-3b085e725504-kube-api-access-26c46\") pod \"dnsmasq-dns-6f9f59f7c5-fs99q\" (UID: \"bc9d174a-14aa-42e4-bfc0-3b085e725504\") " pod="openstack/dnsmasq-dns-6f9f59f7c5-fs99q" Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.702117 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lq9x\" (UniqueName: \"kubernetes.io/projected/91cdfd54-2ee7-490e-bf3f-563406e59cda-kube-api-access-7lq9x\") pod \"ovn-controller-metrics-r8xbm\" (UID: \"91cdfd54-2ee7-490e-bf3f-563406e59cda\") " pod="openstack/ovn-controller-metrics-r8xbm" Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.702147 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91cdfd54-2ee7-490e-bf3f-563406e59cda-combined-ca-bundle\") pod \"ovn-controller-metrics-r8xbm\" (UID: \"91cdfd54-2ee7-490e-bf3f-563406e59cda\") " pod="openstack/ovn-controller-metrics-r8xbm" Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.702185 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/91cdfd54-2ee7-490e-bf3f-563406e59cda-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-r8xbm\" (UID: \"91cdfd54-2ee7-490e-bf3f-563406e59cda\") " 
pod="openstack/ovn-controller-metrics-r8xbm" Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.702218 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91cdfd54-2ee7-490e-bf3f-563406e59cda-config\") pod \"ovn-controller-metrics-r8xbm\" (UID: \"91cdfd54-2ee7-490e-bf3f-563406e59cda\") " pod="openstack/ovn-controller-metrics-r8xbm" Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.702267 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc9d174a-14aa-42e4-bfc0-3b085e725504-dns-svc\") pod \"dnsmasq-dns-6f9f59f7c5-fs99q\" (UID: \"bc9d174a-14aa-42e4-bfc0-3b085e725504\") " pod="openstack/dnsmasq-dns-6f9f59f7c5-fs99q" Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.702296 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc9d174a-14aa-42e4-bfc0-3b085e725504-ovsdbserver-nb\") pod \"dnsmasq-dns-6f9f59f7c5-fs99q\" (UID: \"bc9d174a-14aa-42e4-bfc0-3b085e725504\") " pod="openstack/dnsmasq-dns-6f9f59f7c5-fs99q" Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.702318 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/91cdfd54-2ee7-490e-bf3f-563406e59cda-ovs-rundir\") pod \"ovn-controller-metrics-r8xbm\" (UID: \"91cdfd54-2ee7-490e-bf3f-563406e59cda\") " pod="openstack/ovn-controller-metrics-r8xbm" Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.702343 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/91cdfd54-2ee7-490e-bf3f-563406e59cda-ovn-rundir\") pod \"ovn-controller-metrics-r8xbm\" (UID: \"91cdfd54-2ee7-490e-bf3f-563406e59cda\") " pod="openstack/ovn-controller-metrics-r8xbm" Mar 11 12:17:07 
crc kubenswrapper[4816]: I0311 12:17:07.702366 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc9d174a-14aa-42e4-bfc0-3b085e725504-config\") pod \"dnsmasq-dns-6f9f59f7c5-fs99q\" (UID: \"bc9d174a-14aa-42e4-bfc0-3b085e725504\") " pod="openstack/dnsmasq-dns-6f9f59f7c5-fs99q" Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.707177 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91cdfd54-2ee7-490e-bf3f-563406e59cda-config\") pod \"ovn-controller-metrics-r8xbm\" (UID: \"91cdfd54-2ee7-490e-bf3f-563406e59cda\") " pod="openstack/ovn-controller-metrics-r8xbm" Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.707521 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/91cdfd54-2ee7-490e-bf3f-563406e59cda-ovs-rundir\") pod \"ovn-controller-metrics-r8xbm\" (UID: \"91cdfd54-2ee7-490e-bf3f-563406e59cda\") " pod="openstack/ovn-controller-metrics-r8xbm" Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.707589 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/91cdfd54-2ee7-490e-bf3f-563406e59cda-ovn-rundir\") pod \"ovn-controller-metrics-r8xbm\" (UID: \"91cdfd54-2ee7-490e-bf3f-563406e59cda\") " pod="openstack/ovn-controller-metrics-r8xbm" Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.717420 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/91cdfd54-2ee7-490e-bf3f-563406e59cda-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-r8xbm\" (UID: \"91cdfd54-2ee7-490e-bf3f-563406e59cda\") " pod="openstack/ovn-controller-metrics-r8xbm" Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.725706 4816 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-7lq9x\" (UniqueName: \"kubernetes.io/projected/91cdfd54-2ee7-490e-bf3f-563406e59cda-kube-api-access-7lq9x\") pod \"ovn-controller-metrics-r8xbm\" (UID: \"91cdfd54-2ee7-490e-bf3f-563406e59cda\") " pod="openstack/ovn-controller-metrics-r8xbm" Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.735652 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91cdfd54-2ee7-490e-bf3f-563406e59cda-combined-ca-bundle\") pod \"ovn-controller-metrics-r8xbm\" (UID: \"91cdfd54-2ee7-490e-bf3f-563406e59cda\") " pod="openstack/ovn-controller-metrics-r8xbm" Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.798608 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-r8xbm" Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.805040 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc9d174a-14aa-42e4-bfc0-3b085e725504-dns-svc\") pod \"dnsmasq-dns-6f9f59f7c5-fs99q\" (UID: \"bc9d174a-14aa-42e4-bfc0-3b085e725504\") " pod="openstack/dnsmasq-dns-6f9f59f7c5-fs99q" Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.805111 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc9d174a-14aa-42e4-bfc0-3b085e725504-ovsdbserver-nb\") pod \"dnsmasq-dns-6f9f59f7c5-fs99q\" (UID: \"bc9d174a-14aa-42e4-bfc0-3b085e725504\") " pod="openstack/dnsmasq-dns-6f9f59f7c5-fs99q" Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.805154 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc9d174a-14aa-42e4-bfc0-3b085e725504-config\") pod \"dnsmasq-dns-6f9f59f7c5-fs99q\" (UID: \"bc9d174a-14aa-42e4-bfc0-3b085e725504\") " pod="openstack/dnsmasq-dns-6f9f59f7c5-fs99q" Mar 11 
12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.805213 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26c46\" (UniqueName: \"kubernetes.io/projected/bc9d174a-14aa-42e4-bfc0-3b085e725504-kube-api-access-26c46\") pod \"dnsmasq-dns-6f9f59f7c5-fs99q\" (UID: \"bc9d174a-14aa-42e4-bfc0-3b085e725504\") " pod="openstack/dnsmasq-dns-6f9f59f7c5-fs99q" Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.805450 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f9f59f7c5-fs99q"] Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.806218 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc9d174a-14aa-42e4-bfc0-3b085e725504-dns-svc\") pod \"dnsmasq-dns-6f9f59f7c5-fs99q\" (UID: \"bc9d174a-14aa-42e4-bfc0-3b085e725504\") " pod="openstack/dnsmasq-dns-6f9f59f7c5-fs99q" Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.806712 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc9d174a-14aa-42e4-bfc0-3b085e725504-config\") pod \"dnsmasq-dns-6f9f59f7c5-fs99q\" (UID: \"bc9d174a-14aa-42e4-bfc0-3b085e725504\") " pod="openstack/dnsmasq-dns-6f9f59f7c5-fs99q" Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.807377 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc9d174a-14aa-42e4-bfc0-3b085e725504-ovsdbserver-nb\") pod \"dnsmasq-dns-6f9f59f7c5-fs99q\" (UID: \"bc9d174a-14aa-42e4-bfc0-3b085e725504\") " pod="openstack/dnsmasq-dns-6f9f59f7c5-fs99q" Mar 11 12:17:07 crc kubenswrapper[4816]: E0311 12:17:07.807845 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-26c46 ovsdbserver-nb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-6f9f59f7c5-fs99q" 
podUID="bc9d174a-14aa-42e4-bfc0-3b085e725504" Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.841007 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26c46\" (UniqueName: \"kubernetes.io/projected/bc9d174a-14aa-42e4-bfc0-3b085e725504-kube-api-access-26c46\") pod \"dnsmasq-dns-6f9f59f7c5-fs99q\" (UID: \"bc9d174a-14aa-42e4-bfc0-3b085e725504\") " pod="openstack/dnsmasq-dns-6f9f59f7c5-fs99q" Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.849281 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d944d7b75-r4jqj"] Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.850605 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.859741 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.870702 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d944d7b75-r4jqj"] Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.975019 4816 generic.go:334] "Generic (PLEG): container finished" podID="da177cde-6332-4562-809a-d4bee453cebf" containerID="933bdb7df24ef397f527e8ac441de2b3a2e82c07c8ab31ea86b61c45f7139f03" exitCode=0 Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.975106 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"da177cde-6332-4562-809a-d4bee453cebf","Type":"ContainerDied","Data":"933bdb7df24ef397f527e8ac441de2b3a2e82c07c8ab31ea86b61c45f7139f03"} Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.980589 4816 generic.go:334] "Generic (PLEG): container finished" podID="9a22173f-147b-46ac-bb01-596fe9f12b10" containerID="90224f5e31cd4408489a5dec30ffa77147f611b179c23e40a3d0104504542a1b" exitCode=0 Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.980664 4816 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f9f59f7c5-fs99q" Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.980641 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9a22173f-147b-46ac-bb01-596fe9f12b10","Type":"ContainerDied","Data":"90224f5e31cd4408489a5dec30ffa77147f611b179c23e40a3d0104504542a1b"} Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.990206 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f9f59f7c5-fs99q" Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.013703 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-dns-svc\") pod \"dnsmasq-dns-5d944d7b75-r4jqj\" (UID: \"3535eec4-3c32-4498-9c38-fbb7a5c77ee8\") " pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.013766 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9npfx\" (UniqueName: \"kubernetes.io/projected/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-kube-api-access-9npfx\") pod \"dnsmasq-dns-5d944d7b75-r4jqj\" (UID: \"3535eec4-3c32-4498-9c38-fbb7a5c77ee8\") " pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.013823 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-ovsdbserver-nb\") pod \"dnsmasq-dns-5d944d7b75-r4jqj\" (UID: \"3535eec4-3c32-4498-9c38-fbb7a5c77ee8\") " pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.014035 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-config\") pod \"dnsmasq-dns-5d944d7b75-r4jqj\" (UID: \"3535eec4-3c32-4498-9c38-fbb7a5c77ee8\") " pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.014238 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-ovsdbserver-sb\") pod \"dnsmasq-dns-5d944d7b75-r4jqj\" (UID: \"3535eec4-3c32-4498-9c38-fbb7a5c77ee8\") " pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.115660 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26c46\" (UniqueName: \"kubernetes.io/projected/bc9d174a-14aa-42e4-bfc0-3b085e725504-kube-api-access-26c46\") pod \"bc9d174a-14aa-42e4-bfc0-3b085e725504\" (UID: \"bc9d174a-14aa-42e4-bfc0-3b085e725504\") " Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.115837 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc9d174a-14aa-42e4-bfc0-3b085e725504-ovsdbserver-nb\") pod \"bc9d174a-14aa-42e4-bfc0-3b085e725504\" (UID: \"bc9d174a-14aa-42e4-bfc0-3b085e725504\") " Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.115941 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc9d174a-14aa-42e4-bfc0-3b085e725504-config\") pod \"bc9d174a-14aa-42e4-bfc0-3b085e725504\" (UID: \"bc9d174a-14aa-42e4-bfc0-3b085e725504\") " Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.115993 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc9d174a-14aa-42e4-bfc0-3b085e725504-dns-svc\") pod \"bc9d174a-14aa-42e4-bfc0-3b085e725504\" (UID: 
\"bc9d174a-14aa-42e4-bfc0-3b085e725504\") " Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.116311 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-dns-svc\") pod \"dnsmasq-dns-5d944d7b75-r4jqj\" (UID: \"3535eec4-3c32-4498-9c38-fbb7a5c77ee8\") " pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.116348 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9npfx\" (UniqueName: \"kubernetes.io/projected/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-kube-api-access-9npfx\") pod \"dnsmasq-dns-5d944d7b75-r4jqj\" (UID: \"3535eec4-3c32-4498-9c38-fbb7a5c77ee8\") " pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.116503 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc9d174a-14aa-42e4-bfc0-3b085e725504-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bc9d174a-14aa-42e4-bfc0-3b085e725504" (UID: "bc9d174a-14aa-42e4-bfc0-3b085e725504"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.116556 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc9d174a-14aa-42e4-bfc0-3b085e725504-config" (OuterVolumeSpecName: "config") pod "bc9d174a-14aa-42e4-bfc0-3b085e725504" (UID: "bc9d174a-14aa-42e4-bfc0-3b085e725504"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.116901 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-ovsdbserver-nb\") pod \"dnsmasq-dns-5d944d7b75-r4jqj\" (UID: \"3535eec4-3c32-4498-9c38-fbb7a5c77ee8\") " pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.117043 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-config\") pod \"dnsmasq-dns-5d944d7b75-r4jqj\" (UID: \"3535eec4-3c32-4498-9c38-fbb7a5c77ee8\") " pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.117176 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-ovsdbserver-sb\") pod \"dnsmasq-dns-5d944d7b75-r4jqj\" (UID: \"3535eec4-3c32-4498-9c38-fbb7a5c77ee8\") " pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.117410 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-dns-svc\") pod \"dnsmasq-dns-5d944d7b75-r4jqj\" (UID: \"3535eec4-3c32-4498-9c38-fbb7a5c77ee8\") " pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.117505 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc9d174a-14aa-42e4-bfc0-3b085e725504-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bc9d174a-14aa-42e4-bfc0-3b085e725504" (UID: "bc9d174a-14aa-42e4-bfc0-3b085e725504"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.117822 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-ovsdbserver-nb\") pod \"dnsmasq-dns-5d944d7b75-r4jqj\" (UID: \"3535eec4-3c32-4498-9c38-fbb7a5c77ee8\") " pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.118017 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-config\") pod \"dnsmasq-dns-5d944d7b75-r4jqj\" (UID: \"3535eec4-3c32-4498-9c38-fbb7a5c77ee8\") " pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.118674 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-ovsdbserver-sb\") pod \"dnsmasq-dns-5d944d7b75-r4jqj\" (UID: \"3535eec4-3c32-4498-9c38-fbb7a5c77ee8\") " pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.130843 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc9d174a-14aa-42e4-bfc0-3b085e725504-kube-api-access-26c46" (OuterVolumeSpecName: "kube-api-access-26c46") pod "bc9d174a-14aa-42e4-bfc0-3b085e725504" (UID: "bc9d174a-14aa-42e4-bfc0-3b085e725504"). InnerVolumeSpecName "kube-api-access-26c46". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.135527 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9npfx\" (UniqueName: \"kubernetes.io/projected/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-kube-api-access-9npfx\") pod \"dnsmasq-dns-5d944d7b75-r4jqj\" (UID: \"3535eec4-3c32-4498-9c38-fbb7a5c77ee8\") " pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.197120 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.218641 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc9d174a-14aa-42e4-bfc0-3b085e725504-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.218672 4816 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc9d174a-14aa-42e4-bfc0-3b085e725504-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.218681 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26c46\" (UniqueName: \"kubernetes.io/projected/bc9d174a-14aa-42e4-bfc0-3b085e725504-kube-api-access-26c46\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.218690 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc9d174a-14aa-42e4-bfc0-3b085e725504-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.696649 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.991308 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f9f59f7c5-fs99q" Mar 11 12:17:09 crc kubenswrapper[4816]: I0311 12:17:09.038860 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f9f59f7c5-fs99q"] Mar 11 12:17:09 crc kubenswrapper[4816]: I0311 12:17:09.044386 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f9f59f7c5-fs99q"] Mar 11 12:17:09 crc kubenswrapper[4816]: I0311 12:17:09.676527 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d944d7b75-r4jqj"] Mar 11 12:17:09 crc kubenswrapper[4816]: W0311 12:17:09.685693 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3535eec4_3c32_4498_9c38_fbb7a5c77ee8.slice/crio-4c134446889cdcca3988e8e7afbf6fdb5eae635e176b54e4c5aea1b608efdbe5 WatchSource:0}: Error finding container 4c134446889cdcca3988e8e7afbf6fdb5eae635e176b54e4c5aea1b608efdbe5: Status 404 returned error can't find the container with id 4c134446889cdcca3988e8e7afbf6fdb5eae635e176b54e4c5aea1b608efdbe5 Mar 11 12:17:09 crc kubenswrapper[4816]: I0311 12:17:09.940492 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-r8xbm"] Mar 11 12:17:09 crc kubenswrapper[4816]: W0311 12:17:09.948589 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91cdfd54_2ee7_490e_bf3f_563406e59cda.slice/crio-f248cb3d03b08e499e3214d91a64d19cf5108c7a76d1c30f73bf2b55bdc66e0a WatchSource:0}: Error finding container f248cb3d03b08e499e3214d91a64d19cf5108c7a76d1c30f73bf2b55bdc66e0a: Status 404 returned error can't find the container with id f248cb3d03b08e499e3214d91a64d19cf5108c7a76d1c30f73bf2b55bdc66e0a Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.002350 4816 generic.go:334] "Generic (PLEG): container finished" podID="3535eec4-3c32-4498-9c38-fbb7a5c77ee8" 
containerID="a566ebb729cdf6c830abc577e85638221c81263efde591e4eeabf9e436cd0922" exitCode=0 Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.002449 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" event={"ID":"3535eec4-3c32-4498-9c38-fbb7a5c77ee8","Type":"ContainerDied","Data":"a566ebb729cdf6c830abc577e85638221c81263efde591e4eeabf9e436cd0922"} Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.002902 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" event={"ID":"3535eec4-3c32-4498-9c38-fbb7a5c77ee8","Type":"ContainerStarted","Data":"4c134446889cdcca3988e8e7afbf6fdb5eae635e176b54e4c5aea1b608efdbe5"} Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.005848 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"fe419fb1-1901-4fd4-9d9c-8884651e3ad9","Type":"ContainerStarted","Data":"4c01622c11d3f3812a2eae31ec2decc063cf1fe9d275e29cfb942cdc480ba8db"} Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.008585 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9a22173f-147b-46ac-bb01-596fe9f12b10","Type":"ContainerStarted","Data":"08358819a244a822957b7c7153f37ef3fa2c0371fe913be221e0cf6e09e89054"} Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.010982 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"da177cde-6332-4562-809a-d4bee453cebf","Type":"ContainerStarted","Data":"c63ed4d8962eaade5fdd56e19833812eb68982f5e9c4239e8a03e5077a42a492"} Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.012910 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-r8xbm" event={"ID":"91cdfd54-2ee7-490e-bf3f-563406e59cda","Type":"ContainerStarted","Data":"f248cb3d03b08e499e3214d91a64d19cf5108c7a76d1c30f73bf2b55bdc66e0a"} Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 
12:17:10.015095 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e16e7d30-3235-44f2-81b4-c0c828071bbb","Type":"ContainerStarted","Data":"5e227ce28f5de77017097c97e0a28037dfd14090da88c0fa20d1f53e10f8268b"} Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.059581 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.066173 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=11.786248141 podStartE2EDuration="24.066136405s" podCreationTimestamp="2026-03-11 12:16:46 +0000 UTC" firstStartedPulling="2026-03-11 12:16:57.094893457 +0000 UTC m=+1103.686157424" lastFinishedPulling="2026-03-11 12:17:09.374781721 +0000 UTC m=+1115.966045688" observedRunningTime="2026-03-11 12:17:10.052435417 +0000 UTC m=+1116.643699384" watchObservedRunningTime="2026-03-11 12:17:10.066136405 +0000 UTC m=+1116.657400372" Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.083045 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=14.839052235 podStartE2EDuration="27.083019656s" podCreationTimestamp="2026-03-11 12:16:43 +0000 UTC" firstStartedPulling="2026-03-11 12:16:57.073695521 +0000 UTC m=+1103.664959498" lastFinishedPulling="2026-03-11 12:17:09.317662952 +0000 UTC m=+1115.908926919" observedRunningTime="2026-03-11 12:17:10.078625328 +0000 UTC m=+1116.669889295" watchObservedRunningTime="2026-03-11 12:17:10.083019656 +0000 UTC m=+1116.674283623" Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.113894 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=28.666554956 podStartE2EDuration="35.113870742s" podCreationTimestamp="2026-03-11 12:16:35 +0000 UTC" firstStartedPulling="2026-03-11 12:16:57.10876452 
+0000 UTC m=+1103.700028487" lastFinishedPulling="2026-03-11 12:17:03.556080306 +0000 UTC m=+1110.147344273" observedRunningTime="2026-03-11 12:17:10.109958398 +0000 UTC m=+1116.701222365" watchObservedRunningTime="2026-03-11 12:17:10.113870742 +0000 UTC m=+1116.705134709" Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.146130 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=26.236173883 podStartE2EDuration="33.146105498s" podCreationTimestamp="2026-03-11 12:16:37 +0000 UTC" firstStartedPulling="2026-03-11 12:16:56.644024669 +0000 UTC m=+1103.235288636" lastFinishedPulling="2026-03-11 12:17:03.553956284 +0000 UTC m=+1110.145220251" observedRunningTime="2026-03-11 12:17:10.142679159 +0000 UTC m=+1116.733943126" watchObservedRunningTime="2026-03-11 12:17:10.146105498 +0000 UTC m=+1116.737369465" Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.153294 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc9d174a-14aa-42e4-bfc0-3b085e725504" path="/var/lib/kubelet/pods/bc9d174a-14aa-42e4-bfc0-3b085e725504/volumes" Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.717242 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d944d7b75-r4jqj"] Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.749333 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.750012 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b9fd7d84c-wqn2t"] Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.751432 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.774832 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b9fd7d84c-wqn2t"] Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.793546 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcc1a78b-c3d2-4c15-81a0-0431da953e51-config\") pod \"dnsmasq-dns-7b9fd7d84c-wqn2t\" (UID: \"bcc1a78b-c3d2-4c15-81a0-0431da953e51\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.793615 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwfrm\" (UniqueName: \"kubernetes.io/projected/bcc1a78b-c3d2-4c15-81a0-0431da953e51-kube-api-access-jwfrm\") pod \"dnsmasq-dns-7b9fd7d84c-wqn2t\" (UID: \"bcc1a78b-c3d2-4c15-81a0-0431da953e51\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.793671 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bcc1a78b-c3d2-4c15-81a0-0431da953e51-ovsdbserver-sb\") pod \"dnsmasq-dns-7b9fd7d84c-wqn2t\" (UID: \"bcc1a78b-c3d2-4c15-81a0-0431da953e51\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.795200 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bcc1a78b-c3d2-4c15-81a0-0431da953e51-ovsdbserver-nb\") pod \"dnsmasq-dns-7b9fd7d84c-wqn2t\" (UID: \"bcc1a78b-c3d2-4c15-81a0-0431da953e51\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.795271 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bcc1a78b-c3d2-4c15-81a0-0431da953e51-dns-svc\") pod \"dnsmasq-dns-7b9fd7d84c-wqn2t\" (UID: \"bcc1a78b-c3d2-4c15-81a0-0431da953e51\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.896822 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bcc1a78b-c3d2-4c15-81a0-0431da953e51-ovsdbserver-nb\") pod \"dnsmasq-dns-7b9fd7d84c-wqn2t\" (UID: \"bcc1a78b-c3d2-4c15-81a0-0431da953e51\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.896878 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bcc1a78b-c3d2-4c15-81a0-0431da953e51-dns-svc\") pod \"dnsmasq-dns-7b9fd7d84c-wqn2t\" (UID: \"bcc1a78b-c3d2-4c15-81a0-0431da953e51\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.896942 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcc1a78b-c3d2-4c15-81a0-0431da953e51-config\") pod \"dnsmasq-dns-7b9fd7d84c-wqn2t\" (UID: \"bcc1a78b-c3d2-4c15-81a0-0431da953e51\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.896969 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwfrm\" (UniqueName: \"kubernetes.io/projected/bcc1a78b-c3d2-4c15-81a0-0431da953e51-kube-api-access-jwfrm\") pod \"dnsmasq-dns-7b9fd7d84c-wqn2t\" (UID: \"bcc1a78b-c3d2-4c15-81a0-0431da953e51\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.896996 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bcc1a78b-c3d2-4c15-81a0-0431da953e51-ovsdbserver-sb\") pod \"dnsmasq-dns-7b9fd7d84c-wqn2t\" (UID: \"bcc1a78b-c3d2-4c15-81a0-0431da953e51\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.900043 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bcc1a78b-c3d2-4c15-81a0-0431da953e51-ovsdbserver-nb\") pod \"dnsmasq-dns-7b9fd7d84c-wqn2t\" (UID: \"bcc1a78b-c3d2-4c15-81a0-0431da953e51\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.900970 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bcc1a78b-c3d2-4c15-81a0-0431da953e51-dns-svc\") pod \"dnsmasq-dns-7b9fd7d84c-wqn2t\" (UID: \"bcc1a78b-c3d2-4c15-81a0-0431da953e51\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.901094 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcc1a78b-c3d2-4c15-81a0-0431da953e51-config\") pod \"dnsmasq-dns-7b9fd7d84c-wqn2t\" (UID: \"bcc1a78b-c3d2-4c15-81a0-0431da953e51\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.901102 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bcc1a78b-c3d2-4c15-81a0-0431da953e51-ovsdbserver-sb\") pod \"dnsmasq-dns-7b9fd7d84c-wqn2t\" (UID: \"bcc1a78b-c3d2-4c15-81a0-0431da953e51\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.920778 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwfrm\" (UniqueName: 
\"kubernetes.io/projected/bcc1a78b-c3d2-4c15-81a0-0431da953e51-kube-api-access-jwfrm\") pod \"dnsmasq-dns-7b9fd7d84c-wqn2t\" (UID: \"bcc1a78b-c3d2-4c15-81a0-0431da953e51\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" Mar 11 12:17:11 crc kubenswrapper[4816]: I0311 12:17:11.029418 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-r8xbm" event={"ID":"91cdfd54-2ee7-490e-bf3f-563406e59cda","Type":"ContainerStarted","Data":"be5c0e05e1987846058e7b0cb0a3139e1568599a10f5067e16f3de74b6995fb8"} Mar 11 12:17:11 crc kubenswrapper[4816]: I0311 12:17:11.034357 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" event={"ID":"3535eec4-3c32-4498-9c38-fbb7a5c77ee8","Type":"ContainerStarted","Data":"38f59cff7cee44ddf93b07b0aa796cce7779bf8ab96c18eea85a54c7390532ca"} Mar 11 12:17:11 crc kubenswrapper[4816]: I0311 12:17:11.034583 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" Mar 11 12:17:11 crc kubenswrapper[4816]: I0311 12:17:11.086927 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-r8xbm" podStartSLOduration=4.086906268 podStartE2EDuration="4.086906268s" podCreationTimestamp="2026-03-11 12:17:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:17:11.063074346 +0000 UTC m=+1117.654338313" watchObservedRunningTime="2026-03-11 12:17:11.086906268 +0000 UTC m=+1117.678170255" Mar 11 12:17:11 crc kubenswrapper[4816]: I0311 12:17:11.089601 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" podStartSLOduration=4.089585536 podStartE2EDuration="4.089585536s" podCreationTimestamp="2026-03-11 12:17:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-03-11 12:17:11.086158926 +0000 UTC m=+1117.677422893" watchObservedRunningTime="2026-03-11 12:17:11.089585536 +0000 UTC m=+1117.680849503" Mar 11 12:17:11 crc kubenswrapper[4816]: I0311 12:17:11.098938 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" Mar 11 12:17:11 crc kubenswrapper[4816]: W0311 12:17:11.585150 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbcc1a78b_c3d2_4c15_81a0_0431da953e51.slice/crio-ba3c97adc7cc798d326f3771649f02fd21d888d95dfd0aad666803c28f5b240b WatchSource:0}: Error finding container ba3c97adc7cc798d326f3771649f02fd21d888d95dfd0aad666803c28f5b240b: Status 404 returned error can't find the container with id ba3c97adc7cc798d326f3771649f02fd21d888d95dfd0aad666803c28f5b240b Mar 11 12:17:11 crc kubenswrapper[4816]: I0311 12:17:11.589058 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b9fd7d84c-wqn2t"] Mar 11 12:17:11 crc kubenswrapper[4816]: I0311 12:17:11.785447 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 11 12:17:11 crc kubenswrapper[4816]: I0311 12:17:11.819152 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 11 12:17:11 crc kubenswrapper[4816]: I0311 12:17:11.821880 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 11 12:17:11 crc kubenswrapper[4816]: I0311 12:17:11.829922 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 11 12:17:11 crc kubenswrapper[4816]: I0311 12:17:11.830426 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 11 12:17:11 crc kubenswrapper[4816]: I0311 12:17:11.831442 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-w2jgz" Mar 11 12:17:11 crc kubenswrapper[4816]: I0311 12:17:11.833416 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 11 12:17:11 crc kubenswrapper[4816]: I0311 12:17:11.848989 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 11 12:17:11 crc kubenswrapper[4816]: I0311 12:17:11.880499 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 11 12:17:11 crc kubenswrapper[4816]: I0311 12:17:11.915869 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/485f9fbd-e0ca-472d-b97c-87c127253a96-lock\") pod \"swift-storage-0\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " pod="openstack/swift-storage-0" Mar 11 12:17:11 crc kubenswrapper[4816]: I0311 12:17:11.915934 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbb5r\" (UniqueName: \"kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-kube-api-access-rbb5r\") pod \"swift-storage-0\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " pod="openstack/swift-storage-0" Mar 11 12:17:11 crc kubenswrapper[4816]: I0311 12:17:11.916014 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: 
\"kubernetes.io/empty-dir/485f9fbd-e0ca-472d-b97c-87c127253a96-cache\") pod \"swift-storage-0\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " pod="openstack/swift-storage-0" Mar 11 12:17:11 crc kubenswrapper[4816]: I0311 12:17:11.916074 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-etc-swift\") pod \"swift-storage-0\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " pod="openstack/swift-storage-0" Mar 11 12:17:11 crc kubenswrapper[4816]: I0311 12:17:11.916093 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/485f9fbd-e0ca-472d-b97c-87c127253a96-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " pod="openstack/swift-storage-0" Mar 11 12:17:11 crc kubenswrapper[4816]: I0311 12:17:11.916123 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " pod="openstack/swift-storage-0" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.017938 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbb5r\" (UniqueName: \"kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-kube-api-access-rbb5r\") pod \"swift-storage-0\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " pod="openstack/swift-storage-0" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.018373 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/485f9fbd-e0ca-472d-b97c-87c127253a96-cache\") pod \"swift-storage-0\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " 
pod="openstack/swift-storage-0" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.018565 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-etc-swift\") pod \"swift-storage-0\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " pod="openstack/swift-storage-0" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.018664 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/485f9fbd-e0ca-472d-b97c-87c127253a96-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " pod="openstack/swift-storage-0" Mar 11 12:17:12 crc kubenswrapper[4816]: E0311 12:17:12.018723 4816 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 11 12:17:12 crc kubenswrapper[4816]: E0311 12:17:12.018758 4816 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.018775 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/485f9fbd-e0ca-472d-b97c-87c127253a96-cache\") pod \"swift-storage-0\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " pod="openstack/swift-storage-0" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.018901 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " pod="openstack/swift-storage-0" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.019000 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: 
\"kubernetes.io/empty-dir/485f9fbd-e0ca-472d-b97c-87c127253a96-lock\") pod \"swift-storage-0\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " pod="openstack/swift-storage-0" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.019371 4816 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/swift-storage-0" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.019606 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/485f9fbd-e0ca-472d-b97c-87c127253a96-lock\") pod \"swift-storage-0\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " pod="openstack/swift-storage-0" Mar 11 12:17:12 crc kubenswrapper[4816]: E0311 12:17:12.019729 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-etc-swift podName:485f9fbd-e0ca-472d-b97c-87c127253a96 nodeName:}" failed. No retries permitted until 2026-03-11 12:17:12.519698906 +0000 UTC m=+1119.110962873 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-etc-swift") pod "swift-storage-0" (UID: "485f9fbd-e0ca-472d-b97c-87c127253a96") : configmap "swift-ring-files" not found Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.028165 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/485f9fbd-e0ca-472d-b97c-87c127253a96-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " pod="openstack/swift-storage-0" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.037191 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbb5r\" (UniqueName: \"kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-kube-api-access-rbb5r\") pod \"swift-storage-0\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " pod="openstack/swift-storage-0" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.044387 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " pod="openstack/swift-storage-0" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.044627 4816 generic.go:334] "Generic (PLEG): container finished" podID="bcc1a78b-c3d2-4c15-81a0-0431da953e51" containerID="035e208fb3e5fc9b968f1db57d46e9bd63d57178d448cbc27d1282a58427f605" exitCode=0 Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.044727 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" event={"ID":"bcc1a78b-c3d2-4c15-81a0-0431da953e51","Type":"ContainerDied","Data":"035e208fb3e5fc9b968f1db57d46e9bd63d57178d448cbc27d1282a58427f605"} Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.044796 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" event={"ID":"bcc1a78b-c3d2-4c15-81a0-0431da953e51","Type":"ContainerStarted","Data":"ba3c97adc7cc798d326f3771649f02fd21d888d95dfd0aad666803c28f5b240b"} Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.045360 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" podUID="3535eec4-3c32-4498-9c38-fbb7a5c77ee8" containerName="dnsmasq-dns" containerID="cri-o://38f59cff7cee44ddf93b07b0aa796cce7779bf8ab96c18eea85a54c7390532ca" gracePeriod=10 Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.045919 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.059429 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.110699 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.114590 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 11 12:17:12 crc kubenswrapper[4816]: E0311 12:17:12.256899 4816 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.94:43420->38.102.83.94:46473: write tcp 38.102.83.94:43420->38.102.83.94:46473: write: broken pipe Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.371374 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-9nggr"] Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.372818 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-9nggr" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.376849 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.377196 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.377389 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.398923 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-9nggr"] Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.527499 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7fca72cd-9caa-4029-8c20-1623a315702d-swiftconf\") pod \"swift-ring-rebalance-9nggr\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " pod="openstack/swift-ring-rebalance-9nggr" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.527569 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7fca72cd-9caa-4029-8c20-1623a315702d-etc-swift\") pod \"swift-ring-rebalance-9nggr\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " pod="openstack/swift-ring-rebalance-9nggr" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.527613 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fca72cd-9caa-4029-8c20-1623a315702d-combined-ca-bundle\") pod \"swift-ring-rebalance-9nggr\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " pod="openstack/swift-ring-rebalance-9nggr" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.527667 4816 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7fca72cd-9caa-4029-8c20-1623a315702d-scripts\") pod \"swift-ring-rebalance-9nggr\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " pod="openstack/swift-ring-rebalance-9nggr" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.527690 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7fca72cd-9caa-4029-8c20-1623a315702d-ring-data-devices\") pod \"swift-ring-rebalance-9nggr\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " pod="openstack/swift-ring-rebalance-9nggr" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.527738 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-etc-swift\") pod \"swift-storage-0\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " pod="openstack/swift-storage-0" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.527761 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvz42\" (UniqueName: \"kubernetes.io/projected/7fca72cd-9caa-4029-8c20-1623a315702d-kube-api-access-kvz42\") pod \"swift-ring-rebalance-9nggr\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " pod="openstack/swift-ring-rebalance-9nggr" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.527785 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7fca72cd-9caa-4029-8c20-1623a315702d-dispersionconf\") pod \"swift-ring-rebalance-9nggr\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " pod="openstack/swift-ring-rebalance-9nggr" Mar 11 12:17:12 crc kubenswrapper[4816]: E0311 12:17:12.528000 4816 projected.go:288] Couldn't 
get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 11 12:17:12 crc kubenswrapper[4816]: E0311 12:17:12.528017 4816 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 11 12:17:12 crc kubenswrapper[4816]: E0311 12:17:12.528058 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-etc-swift podName:485f9fbd-e0ca-472d-b97c-87c127253a96 nodeName:}" failed. No retries permitted until 2026-03-11 12:17:13.528043514 +0000 UTC m=+1120.119307481 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-etc-swift") pod "swift-storage-0" (UID: "485f9fbd-e0ca-472d-b97c-87c127253a96") : configmap "swift-ring-files" not found Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.617735 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.630854 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvz42\" (UniqueName: \"kubernetes.io/projected/7fca72cd-9caa-4029-8c20-1623a315702d-kube-api-access-kvz42\") pod \"swift-ring-rebalance-9nggr\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " pod="openstack/swift-ring-rebalance-9nggr" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.630911 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7fca72cd-9caa-4029-8c20-1623a315702d-dispersionconf\") pod \"swift-ring-rebalance-9nggr\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " pod="openstack/swift-ring-rebalance-9nggr" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.630981 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7fca72cd-9caa-4029-8c20-1623a315702d-swiftconf\") pod \"swift-ring-rebalance-9nggr\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " pod="openstack/swift-ring-rebalance-9nggr" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.631052 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7fca72cd-9caa-4029-8c20-1623a315702d-etc-swift\") pod \"swift-ring-rebalance-9nggr\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " pod="openstack/swift-ring-rebalance-9nggr" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.631097 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fca72cd-9caa-4029-8c20-1623a315702d-combined-ca-bundle\") pod \"swift-ring-rebalance-9nggr\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " pod="openstack/swift-ring-rebalance-9nggr" Mar 11 
12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.631142 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7fca72cd-9caa-4029-8c20-1623a315702d-scripts\") pod \"swift-ring-rebalance-9nggr\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " pod="openstack/swift-ring-rebalance-9nggr" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.631165 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7fca72cd-9caa-4029-8c20-1623a315702d-ring-data-devices\") pod \"swift-ring-rebalance-9nggr\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " pod="openstack/swift-ring-rebalance-9nggr" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.631943 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7fca72cd-9caa-4029-8c20-1623a315702d-ring-data-devices\") pod \"swift-ring-rebalance-9nggr\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " pod="openstack/swift-ring-rebalance-9nggr" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.632430 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7fca72cd-9caa-4029-8c20-1623a315702d-scripts\") pod \"swift-ring-rebalance-9nggr\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " pod="openstack/swift-ring-rebalance-9nggr" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.632949 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7fca72cd-9caa-4029-8c20-1623a315702d-etc-swift\") pod \"swift-ring-rebalance-9nggr\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " pod="openstack/swift-ring-rebalance-9nggr" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.641567 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7fca72cd-9caa-4029-8c20-1623a315702d-dispersionconf\") pod \"swift-ring-rebalance-9nggr\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " pod="openstack/swift-ring-rebalance-9nggr" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.646119 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fca72cd-9caa-4029-8c20-1623a315702d-combined-ca-bundle\") pod \"swift-ring-rebalance-9nggr\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " pod="openstack/swift-ring-rebalance-9nggr" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.657535 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7fca72cd-9caa-4029-8c20-1623a315702d-swiftconf\") pod \"swift-ring-rebalance-9nggr\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " pod="openstack/swift-ring-rebalance-9nggr" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.665367 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvz42\" (UniqueName: \"kubernetes.io/projected/7fca72cd-9caa-4029-8c20-1623a315702d-kube-api-access-kvz42\") pod \"swift-ring-rebalance-9nggr\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " pod="openstack/swift-ring-rebalance-9nggr" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.709783 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-9nggr" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.732363 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-config\") pod \"3535eec4-3c32-4498-9c38-fbb7a5c77ee8\" (UID: \"3535eec4-3c32-4498-9c38-fbb7a5c77ee8\") " Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.732444 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-ovsdbserver-nb\") pod \"3535eec4-3c32-4498-9c38-fbb7a5c77ee8\" (UID: \"3535eec4-3c32-4498-9c38-fbb7a5c77ee8\") " Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.732501 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-dns-svc\") pod \"3535eec4-3c32-4498-9c38-fbb7a5c77ee8\" (UID: \"3535eec4-3c32-4498-9c38-fbb7a5c77ee8\") " Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.732688 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-ovsdbserver-sb\") pod \"3535eec4-3c32-4498-9c38-fbb7a5c77ee8\" (UID: \"3535eec4-3c32-4498-9c38-fbb7a5c77ee8\") " Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.732730 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9npfx\" (UniqueName: \"kubernetes.io/projected/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-kube-api-access-9npfx\") pod \"3535eec4-3c32-4498-9c38-fbb7a5c77ee8\" (UID: \"3535eec4-3c32-4498-9c38-fbb7a5c77ee8\") " Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.741566 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-kube-api-access-9npfx" (OuterVolumeSpecName: "kube-api-access-9npfx") pod "3535eec4-3c32-4498-9c38-fbb7a5c77ee8" (UID: "3535eec4-3c32-4498-9c38-fbb7a5c77ee8"). InnerVolumeSpecName "kube-api-access-9npfx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.826526 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3535eec4-3c32-4498-9c38-fbb7a5c77ee8" (UID: "3535eec4-3c32-4498-9c38-fbb7a5c77ee8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.828557 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3535eec4-3c32-4498-9c38-fbb7a5c77ee8" (UID: "3535eec4-3c32-4498-9c38-fbb7a5c77ee8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.829596 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3535eec4-3c32-4498-9c38-fbb7a5c77ee8" (UID: "3535eec4-3c32-4498-9c38-fbb7a5c77ee8"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.835201 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.835236 4816 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.835279 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.835297 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9npfx\" (UniqueName: \"kubernetes.io/projected/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-kube-api-access-9npfx\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.850081 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-config" (OuterVolumeSpecName: "config") pod "3535eec4-3c32-4498-9c38-fbb7a5c77ee8" (UID: "3535eec4-3c32-4498-9c38-fbb7a5c77ee8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.937434 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.059125 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" event={"ID":"bcc1a78b-c3d2-4c15-81a0-0431da953e51","Type":"ContainerStarted","Data":"f1a234613505f291637cb739619dbef7845308ac22057594b971bae3924f2dc7"} Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.060516 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.064294 4816 generic.go:334] "Generic (PLEG): container finished" podID="3535eec4-3c32-4498-9c38-fbb7a5c77ee8" containerID="38f59cff7cee44ddf93b07b0aa796cce7779bf8ab96c18eea85a54c7390532ca" exitCode=0 Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.064349 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" event={"ID":"3535eec4-3c32-4498-9c38-fbb7a5c77ee8","Type":"ContainerDied","Data":"38f59cff7cee44ddf93b07b0aa796cce7779bf8ab96c18eea85a54c7390532ca"} Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.064416 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.073704 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" event={"ID":"3535eec4-3c32-4498-9c38-fbb7a5c77ee8","Type":"ContainerDied","Data":"4c134446889cdcca3988e8e7afbf6fdb5eae635e176b54e4c5aea1b608efdbe5"} Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.073873 4816 scope.go:117] "RemoveContainer" containerID="38f59cff7cee44ddf93b07b0aa796cce7779bf8ab96c18eea85a54c7390532ca" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.080182 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" podStartSLOduration=3.080158443 podStartE2EDuration="3.080158443s" podCreationTimestamp="2026-03-11 12:17:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:17:13.07798999 +0000 UTC m=+1119.669253957" watchObservedRunningTime="2026-03-11 12:17:13.080158443 +0000 UTC m=+1119.671422410" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.105132 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d944d7b75-r4jqj"] Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.110891 4816 scope.go:117] "RemoveContainer" containerID="a566ebb729cdf6c830abc577e85638221c81263efde591e4eeabf9e436cd0922" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.111308 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d944d7b75-r4jqj"] Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.126330 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.131745 4816 scope.go:117] "RemoveContainer" containerID="38f59cff7cee44ddf93b07b0aa796cce7779bf8ab96c18eea85a54c7390532ca" Mar 11 
12:17:13 crc kubenswrapper[4816]: E0311 12:17:13.132346 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38f59cff7cee44ddf93b07b0aa796cce7779bf8ab96c18eea85a54c7390532ca\": container with ID starting with 38f59cff7cee44ddf93b07b0aa796cce7779bf8ab96c18eea85a54c7390532ca not found: ID does not exist" containerID="38f59cff7cee44ddf93b07b0aa796cce7779bf8ab96c18eea85a54c7390532ca" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.132380 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38f59cff7cee44ddf93b07b0aa796cce7779bf8ab96c18eea85a54c7390532ca"} err="failed to get container status \"38f59cff7cee44ddf93b07b0aa796cce7779bf8ab96c18eea85a54c7390532ca\": rpc error: code = NotFound desc = could not find container \"38f59cff7cee44ddf93b07b0aa796cce7779bf8ab96c18eea85a54c7390532ca\": container with ID starting with 38f59cff7cee44ddf93b07b0aa796cce7779bf8ab96c18eea85a54c7390532ca not found: ID does not exist" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.132404 4816 scope.go:117] "RemoveContainer" containerID="a566ebb729cdf6c830abc577e85638221c81263efde591e4eeabf9e436cd0922" Mar 11 12:17:13 crc kubenswrapper[4816]: E0311 12:17:13.132849 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a566ebb729cdf6c830abc577e85638221c81263efde591e4eeabf9e436cd0922\": container with ID starting with a566ebb729cdf6c830abc577e85638221c81263efde591e4eeabf9e436cd0922 not found: ID does not exist" containerID="a566ebb729cdf6c830abc577e85638221c81263efde591e4eeabf9e436cd0922" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.132892 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a566ebb729cdf6c830abc577e85638221c81263efde591e4eeabf9e436cd0922"} err="failed to get container status 
\"a566ebb729cdf6c830abc577e85638221c81263efde591e4eeabf9e436cd0922\": rpc error: code = NotFound desc = could not find container \"a566ebb729cdf6c830abc577e85638221c81263efde591e4eeabf9e436cd0922\": container with ID starting with a566ebb729cdf6c830abc577e85638221c81263efde591e4eeabf9e436cd0922 not found: ID does not exist" Mar 11 12:17:13 crc kubenswrapper[4816]: W0311 12:17:13.323399 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7fca72cd_9caa_4029_8c20_1623a315702d.slice/crio-33b17ac615d74325a9263091c6d521ccba5681421913cd3808e6a592677fe4c5 WatchSource:0}: Error finding container 33b17ac615d74325a9263091c6d521ccba5681421913cd3808e6a592677fe4c5: Status 404 returned error can't find the container with id 33b17ac615d74325a9263091c6d521ccba5681421913cd3808e6a592677fe4c5 Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.323610 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-9nggr"] Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.337885 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 11 12:17:13 crc kubenswrapper[4816]: E0311 12:17:13.338421 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3535eec4-3c32-4498-9c38-fbb7a5c77ee8" containerName="init" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.338446 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="3535eec4-3c32-4498-9c38-fbb7a5c77ee8" containerName="init" Mar 11 12:17:13 crc kubenswrapper[4816]: E0311 12:17:13.338480 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3535eec4-3c32-4498-9c38-fbb7a5c77ee8" containerName="dnsmasq-dns" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.338487 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="3535eec4-3c32-4498-9c38-fbb7a5c77ee8" containerName="dnsmasq-dns" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 
12:17:13.338700 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="3535eec4-3c32-4498-9c38-fbb7a5c77ee8" containerName="dnsmasq-dns" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.339704 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.345815 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.346147 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.346547 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.346771 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.347497 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-chwg7" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.448218 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " pod="openstack/ovn-northd-0" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.448373 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " pod="openstack/ovn-northd-0" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.448404 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hft4\" (UniqueName: \"kubernetes.io/projected/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-kube-api-access-7hft4\") pod \"ovn-northd-0\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " pod="openstack/ovn-northd-0" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.448429 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-scripts\") pod \"ovn-northd-0\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " pod="openstack/ovn-northd-0" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.448469 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " pod="openstack/ovn-northd-0" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.448494 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " pod="openstack/ovn-northd-0" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.448556 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-config\") pod \"ovn-northd-0\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " pod="openstack/ovn-northd-0" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.550851 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-scripts\") pod \"ovn-northd-0\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " pod="openstack/ovn-northd-0" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.550935 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " pod="openstack/ovn-northd-0" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.550963 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " pod="openstack/ovn-northd-0" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.551001 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-etc-swift\") pod \"swift-storage-0\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " pod="openstack/swift-storage-0" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.551029 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-config\") pod \"ovn-northd-0\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " pod="openstack/ovn-northd-0" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.551090 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " pod="openstack/ovn-northd-0" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 
12:17:13.551182 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " pod="openstack/ovn-northd-0" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.551212 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hft4\" (UniqueName: \"kubernetes.io/projected/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-kube-api-access-7hft4\") pod \"ovn-northd-0\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " pod="openstack/ovn-northd-0" Mar 11 12:17:13 crc kubenswrapper[4816]: E0311 12:17:13.551556 4816 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 11 12:17:13 crc kubenswrapper[4816]: E0311 12:17:13.551633 4816 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 11 12:17:13 crc kubenswrapper[4816]: E0311 12:17:13.551803 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-etc-swift podName:485f9fbd-e0ca-472d-b97c-87c127253a96 nodeName:}" failed. No retries permitted until 2026-03-11 12:17:15.551746543 +0000 UTC m=+1122.143010650 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-etc-swift") pod "swift-storage-0" (UID: "485f9fbd-e0ca-472d-b97c-87c127253a96") : configmap "swift-ring-files" not found Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.552013 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " pod="openstack/ovn-northd-0" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.552075 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-scripts\") pod \"ovn-northd-0\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " pod="openstack/ovn-northd-0" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.552628 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-config\") pod \"ovn-northd-0\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " pod="openstack/ovn-northd-0" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.560099 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " pod="openstack/ovn-northd-0" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.560352 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " pod="openstack/ovn-northd-0" Mar 11 12:17:13 crc 
kubenswrapper[4816]: I0311 12:17:13.560531 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " pod="openstack/ovn-northd-0" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.575083 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hft4\" (UniqueName: \"kubernetes.io/projected/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-kube-api-access-7hft4\") pod \"ovn-northd-0\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " pod="openstack/ovn-northd-0" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.667571 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 11 12:17:14 crc kubenswrapper[4816]: I0311 12:17:14.088705 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-9nggr" event={"ID":"7fca72cd-9caa-4029-8c20-1623a315702d","Type":"ContainerStarted","Data":"33b17ac615d74325a9263091c6d521ccba5681421913cd3808e6a592677fe4c5"} Mar 11 12:17:14 crc kubenswrapper[4816]: I0311 12:17:14.143709 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3535eec4-3c32-4498-9c38-fbb7a5c77ee8" path="/var/lib/kubelet/pods/3535eec4-3c32-4498-9c38-fbb7a5c77ee8/volumes" Mar 11 12:17:14 crc kubenswrapper[4816]: I0311 12:17:14.217519 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 11 12:17:15 crc kubenswrapper[4816]: I0311 12:17:15.099119 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825","Type":"ContainerStarted","Data":"25d4f9ece0205331680bd83d3d312fa201b0497bc9a8a61346652664c99b99e2"} Mar 11 12:17:15 crc kubenswrapper[4816]: I0311 12:17:15.593379 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-etc-swift\") pod \"swift-storage-0\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " pod="openstack/swift-storage-0" Mar 11 12:17:15 crc kubenswrapper[4816]: E0311 12:17:15.593572 4816 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 11 12:17:15 crc kubenswrapper[4816]: E0311 12:17:15.593598 4816 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 11 12:17:15 crc kubenswrapper[4816]: E0311 12:17:15.593652 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-etc-swift podName:485f9fbd-e0ca-472d-b97c-87c127253a96 nodeName:}" failed. No retries permitted until 2026-03-11 12:17:19.59363499 +0000 UTC m=+1126.184898957 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-etc-swift") pod "swift-storage-0" (UID: "485f9fbd-e0ca-472d-b97c-87c127253a96") : configmap "swift-ring-files" not found Mar 11 12:17:16 crc kubenswrapper[4816]: I0311 12:17:16.916566 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 11 12:17:16 crc kubenswrapper[4816]: I0311 12:17:16.916903 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 11 12:17:17 crc kubenswrapper[4816]: I0311 12:17:17.007974 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 11 12:17:17 crc kubenswrapper[4816]: I0311 12:17:17.221786 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 11 12:17:18 crc kubenswrapper[4816]: I0311 12:17:18.129386 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825","Type":"ContainerStarted","Data":"6e5d751e1033e9d4aef5824d4c13d38308132b4b6b9a60ec26d78186a278dab7"} Mar 11 12:17:18 crc kubenswrapper[4816]: I0311 12:17:18.129961 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825","Type":"ContainerStarted","Data":"8a2953b83fad75911a9aa3b9b53086764c650fc4022cbafe1b2e60fde2fe5be7"} Mar 11 12:17:18 crc kubenswrapper[4816]: I0311 12:17:18.144042 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-9nggr" event={"ID":"7fca72cd-9caa-4029-8c20-1623a315702d","Type":"ContainerStarted","Data":"c460fb14090c9d550203cf386e04b06e3563514702df072e42ad5fc80f7e1872"} Mar 11 12:17:18 crc kubenswrapper[4816]: I0311 12:17:18.167193 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/swift-ring-rebalance-9nggr" podStartSLOduration=2.014346273 podStartE2EDuration="6.167153622s" podCreationTimestamp="2026-03-11 12:17:12 +0000 UTC" firstStartedPulling="2026-03-11 12:17:13.329079115 +0000 UTC m=+1119.920343072" lastFinishedPulling="2026-03-11 12:17:17.481886454 +0000 UTC m=+1124.073150421" observedRunningTime="2026-03-11 12:17:18.162065554 +0000 UTC m=+1124.753329531" watchObservedRunningTime="2026-03-11 12:17:18.167153622 +0000 UTC m=+1124.758417589" Mar 11 12:17:18 crc kubenswrapper[4816]: I0311 12:17:18.458533 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 11 12:17:18 crc kubenswrapper[4816]: I0311 12:17:18.458759 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 11 12:17:18 crc kubenswrapper[4816]: I0311 12:17:18.551177 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 11 12:17:18 crc kubenswrapper[4816]: I0311 12:17:18.941218 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-c8d3-account-create-update-85zqd"] Mar 11 12:17:18 crc kubenswrapper[4816]: I0311 12:17:18.945510 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-c8d3-account-create-update-85zqd" Mar 11 12:17:18 crc kubenswrapper[4816]: I0311 12:17:18.949468 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 11 12:17:18 crc kubenswrapper[4816]: I0311 12:17:18.971038 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-c8d3-account-create-update-85zqd"] Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:18.999973 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-l5lds"] Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.001732 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-l5lds" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.013705 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-l5lds"] Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.069610 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/288dd774-6e04-45d2-b786-c7f2be7fbeae-operator-scripts\") pod \"glance-c8d3-account-create-update-85zqd\" (UID: \"288dd774-6e04-45d2-b786-c7f2be7fbeae\") " pod="openstack/glance-c8d3-account-create-update-85zqd" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.069683 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f9qn\" (UniqueName: \"kubernetes.io/projected/288dd774-6e04-45d2-b786-c7f2be7fbeae-kube-api-access-2f9qn\") pod \"glance-c8d3-account-create-update-85zqd\" (UID: \"288dd774-6e04-45d2-b786-c7f2be7fbeae\") " pod="openstack/glance-c8d3-account-create-update-85zqd" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.069769 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9fd32333-bdaa-461b-ac10-324291d1e5d3-operator-scripts\") pod \"glance-db-create-l5lds\" (UID: \"9fd32333-bdaa-461b-ac10-324291d1e5d3\") " pod="openstack/glance-db-create-l5lds" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.069925 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcqxt\" (UniqueName: \"kubernetes.io/projected/9fd32333-bdaa-461b-ac10-324291d1e5d3-kube-api-access-mcqxt\") pod \"glance-db-create-l5lds\" (UID: \"9fd32333-bdaa-461b-ac10-324291d1e5d3\") " pod="openstack/glance-db-create-l5lds" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.167052 4816 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.904639905 podStartE2EDuration="6.167033739s" podCreationTimestamp="2026-03-11 12:17:13 +0000 UTC" firstStartedPulling="2026-03-11 12:17:14.221515569 +0000 UTC m=+1120.812779536" lastFinishedPulling="2026-03-11 12:17:17.483909403 +0000 UTC m=+1124.075173370" observedRunningTime="2026-03-11 12:17:19.160210081 +0000 UTC m=+1125.751474048" watchObservedRunningTime="2026-03-11 12:17:19.167033739 +0000 UTC m=+1125.758297706" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.171951 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/288dd774-6e04-45d2-b786-c7f2be7fbeae-operator-scripts\") pod \"glance-c8d3-account-create-update-85zqd\" (UID: \"288dd774-6e04-45d2-b786-c7f2be7fbeae\") " pod="openstack/glance-c8d3-account-create-update-85zqd" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.172433 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2f9qn\" (UniqueName: \"kubernetes.io/projected/288dd774-6e04-45d2-b786-c7f2be7fbeae-kube-api-access-2f9qn\") pod \"glance-c8d3-account-create-update-85zqd\" (UID: \"288dd774-6e04-45d2-b786-c7f2be7fbeae\") " pod="openstack/glance-c8d3-account-create-update-85zqd" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.172533 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9fd32333-bdaa-461b-ac10-324291d1e5d3-operator-scripts\") pod \"glance-db-create-l5lds\" (UID: \"9fd32333-bdaa-461b-ac10-324291d1e5d3\") " pod="openstack/glance-db-create-l5lds" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.172576 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcqxt\" (UniqueName: 
\"kubernetes.io/projected/9fd32333-bdaa-461b-ac10-324291d1e5d3-kube-api-access-mcqxt\") pod \"glance-db-create-l5lds\" (UID: \"9fd32333-bdaa-461b-ac10-324291d1e5d3\") " pod="openstack/glance-db-create-l5lds" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.172820 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/288dd774-6e04-45d2-b786-c7f2be7fbeae-operator-scripts\") pod \"glance-c8d3-account-create-update-85zqd\" (UID: \"288dd774-6e04-45d2-b786-c7f2be7fbeae\") " pod="openstack/glance-c8d3-account-create-update-85zqd" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.176454 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9fd32333-bdaa-461b-ac10-324291d1e5d3-operator-scripts\") pod \"glance-db-create-l5lds\" (UID: \"9fd32333-bdaa-461b-ac10-324291d1e5d3\") " pod="openstack/glance-db-create-l5lds" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.201160 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcqxt\" (UniqueName: \"kubernetes.io/projected/9fd32333-bdaa-461b-ac10-324291d1e5d3-kube-api-access-mcqxt\") pod \"glance-db-create-l5lds\" (UID: \"9fd32333-bdaa-461b-ac10-324291d1e5d3\") " pod="openstack/glance-db-create-l5lds" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.201693 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f9qn\" (UniqueName: \"kubernetes.io/projected/288dd774-6e04-45d2-b786-c7f2be7fbeae-kube-api-access-2f9qn\") pod \"glance-c8d3-account-create-update-85zqd\" (UID: \"288dd774-6e04-45d2-b786-c7f2be7fbeae\") " pod="openstack/glance-c8d3-account-create-update-85zqd" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.234337 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 11 12:17:19 crc 
kubenswrapper[4816]: I0311 12:17:19.276607 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-c8d3-account-create-update-85zqd" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.324963 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-l5lds" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.503763 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-rmcqp"] Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.506299 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-rmcqp" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.523149 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-rmcqp"] Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.589038 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bccjt\" (UniqueName: \"kubernetes.io/projected/742cfc03-0365-4df8-a7f6-e6eac11ba045-kube-api-access-bccjt\") pod \"keystone-db-create-rmcqp\" (UID: \"742cfc03-0365-4df8-a7f6-e6eac11ba045\") " pod="openstack/keystone-db-create-rmcqp" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.589125 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/742cfc03-0365-4df8-a7f6-e6eac11ba045-operator-scripts\") pod \"keystone-db-create-rmcqp\" (UID: \"742cfc03-0365-4df8-a7f6-e6eac11ba045\") " pod="openstack/keystone-db-create-rmcqp" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.613357 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-9b21-account-create-update-r8vgg"] Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.614539 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-9b21-account-create-update-r8vgg" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.620398 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.631471 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-9b21-account-create-update-r8vgg"] Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.691018 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2mbc\" (UniqueName: \"kubernetes.io/projected/625b367b-084e-4cf8-8c30-5d4df9c696f9-kube-api-access-c2mbc\") pod \"keystone-9b21-account-create-update-r8vgg\" (UID: \"625b367b-084e-4cf8-8c30-5d4df9c696f9\") " pod="openstack/keystone-9b21-account-create-update-r8vgg" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.691085 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bccjt\" (UniqueName: \"kubernetes.io/projected/742cfc03-0365-4df8-a7f6-e6eac11ba045-kube-api-access-bccjt\") pod \"keystone-db-create-rmcqp\" (UID: \"742cfc03-0365-4df8-a7f6-e6eac11ba045\") " pod="openstack/keystone-db-create-rmcqp" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.691153 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/742cfc03-0365-4df8-a7f6-e6eac11ba045-operator-scripts\") pod \"keystone-db-create-rmcqp\" (UID: \"742cfc03-0365-4df8-a7f6-e6eac11ba045\") " pod="openstack/keystone-db-create-rmcqp" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.691420 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-etc-swift\") pod \"swift-storage-0\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " 
pod="openstack/swift-storage-0" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.691468 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/625b367b-084e-4cf8-8c30-5d4df9c696f9-operator-scripts\") pod \"keystone-9b21-account-create-update-r8vgg\" (UID: \"625b367b-084e-4cf8-8c30-5d4df9c696f9\") " pod="openstack/keystone-9b21-account-create-update-r8vgg" Mar 11 12:17:19 crc kubenswrapper[4816]: E0311 12:17:19.691870 4816 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 11 12:17:19 crc kubenswrapper[4816]: E0311 12:17:19.691900 4816 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 11 12:17:19 crc kubenswrapper[4816]: E0311 12:17:19.691970 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-etc-swift podName:485f9fbd-e0ca-472d-b97c-87c127253a96 nodeName:}" failed. No retries permitted until 2026-03-11 12:17:27.691947188 +0000 UTC m=+1134.283211155 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-etc-swift") pod "swift-storage-0" (UID: "485f9fbd-e0ca-472d-b97c-87c127253a96") : configmap "swift-ring-files" not found Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.692504 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/742cfc03-0365-4df8-a7f6-e6eac11ba045-operator-scripts\") pod \"keystone-db-create-rmcqp\" (UID: \"742cfc03-0365-4df8-a7f6-e6eac11ba045\") " pod="openstack/keystone-db-create-rmcqp" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.718941 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bccjt\" (UniqueName: \"kubernetes.io/projected/742cfc03-0365-4df8-a7f6-e6eac11ba045-kube-api-access-bccjt\") pod \"keystone-db-create-rmcqp\" (UID: \"742cfc03-0365-4df8-a7f6-e6eac11ba045\") " pod="openstack/keystone-db-create-rmcqp" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.727750 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-85nd9"] Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.729896 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-85nd9" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.749680 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-85nd9"] Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.793618 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d3e7fa1-3f66-495b-be44-cf97eec043c1-operator-scripts\") pod \"placement-db-create-85nd9\" (UID: \"8d3e7fa1-3f66-495b-be44-cf97eec043c1\") " pod="openstack/placement-db-create-85nd9" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.793682 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2mbc\" (UniqueName: \"kubernetes.io/projected/625b367b-084e-4cf8-8c30-5d4df9c696f9-kube-api-access-c2mbc\") pod \"keystone-9b21-account-create-update-r8vgg\" (UID: \"625b367b-084e-4cf8-8c30-5d4df9c696f9\") " pod="openstack/keystone-9b21-account-create-update-r8vgg" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.793737 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4p6b\" (UniqueName: \"kubernetes.io/projected/8d3e7fa1-3f66-495b-be44-cf97eec043c1-kube-api-access-x4p6b\") pod \"placement-db-create-85nd9\" (UID: \"8d3e7fa1-3f66-495b-be44-cf97eec043c1\") " pod="openstack/placement-db-create-85nd9" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.793859 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/625b367b-084e-4cf8-8c30-5d4df9c696f9-operator-scripts\") pod \"keystone-9b21-account-create-update-r8vgg\" (UID: \"625b367b-084e-4cf8-8c30-5d4df9c696f9\") " pod="openstack/keystone-9b21-account-create-update-r8vgg" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.794686 4816 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/625b367b-084e-4cf8-8c30-5d4df9c696f9-operator-scripts\") pod \"keystone-9b21-account-create-update-r8vgg\" (UID: \"625b367b-084e-4cf8-8c30-5d4df9c696f9\") " pod="openstack/keystone-9b21-account-create-update-r8vgg" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.798675 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-3c3c-account-create-update-2whdq"] Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.799845 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3c3c-account-create-update-2whdq" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.801961 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.813886 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-3c3c-account-create-update-2whdq"] Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.816668 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2mbc\" (UniqueName: \"kubernetes.io/projected/625b367b-084e-4cf8-8c30-5d4df9c696f9-kube-api-access-c2mbc\") pod \"keystone-9b21-account-create-update-r8vgg\" (UID: \"625b367b-084e-4cf8-8c30-5d4df9c696f9\") " pod="openstack/keystone-9b21-account-create-update-r8vgg" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.851539 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-rmcqp" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.858211 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-c8d3-account-create-update-85zqd"] Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.895999 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d3e7fa1-3f66-495b-be44-cf97eec043c1-operator-scripts\") pod \"placement-db-create-85nd9\" (UID: \"8d3e7fa1-3f66-495b-be44-cf97eec043c1\") " pod="openstack/placement-db-create-85nd9" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.896094 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4p6b\" (UniqueName: \"kubernetes.io/projected/8d3e7fa1-3f66-495b-be44-cf97eec043c1-kube-api-access-x4p6b\") pod \"placement-db-create-85nd9\" (UID: \"8d3e7fa1-3f66-495b-be44-cf97eec043c1\") " pod="openstack/placement-db-create-85nd9" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.896146 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/632c5d32-5370-401a-8202-58e0ec70f357-operator-scripts\") pod \"placement-3c3c-account-create-update-2whdq\" (UID: \"632c5d32-5370-401a-8202-58e0ec70f357\") " pod="openstack/placement-3c3c-account-create-update-2whdq" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.896191 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk5p5\" (UniqueName: \"kubernetes.io/projected/632c5d32-5370-401a-8202-58e0ec70f357-kube-api-access-gk5p5\") pod \"placement-3c3c-account-create-update-2whdq\" (UID: \"632c5d32-5370-401a-8202-58e0ec70f357\") " pod="openstack/placement-3c3c-account-create-update-2whdq" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.897360 4816 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d3e7fa1-3f66-495b-be44-cf97eec043c1-operator-scripts\") pod \"placement-db-create-85nd9\" (UID: \"8d3e7fa1-3f66-495b-be44-cf97eec043c1\") " pod="openstack/placement-db-create-85nd9" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.916376 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4p6b\" (UniqueName: \"kubernetes.io/projected/8d3e7fa1-3f66-495b-be44-cf97eec043c1-kube-api-access-x4p6b\") pod \"placement-db-create-85nd9\" (UID: \"8d3e7fa1-3f66-495b-be44-cf97eec043c1\") " pod="openstack/placement-db-create-85nd9" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.945193 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-9b21-account-create-update-r8vgg" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.998585 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/632c5d32-5370-401a-8202-58e0ec70f357-operator-scripts\") pod \"placement-3c3c-account-create-update-2whdq\" (UID: \"632c5d32-5370-401a-8202-58e0ec70f357\") " pod="openstack/placement-3c3c-account-create-update-2whdq" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.998665 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gk5p5\" (UniqueName: \"kubernetes.io/projected/632c5d32-5370-401a-8202-58e0ec70f357-kube-api-access-gk5p5\") pod \"placement-3c3c-account-create-update-2whdq\" (UID: \"632c5d32-5370-401a-8202-58e0ec70f357\") " pod="openstack/placement-3c3c-account-create-update-2whdq" Mar 11 12:17:20 crc kubenswrapper[4816]: I0311 12:17:20.000197 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/632c5d32-5370-401a-8202-58e0ec70f357-operator-scripts\") pod \"placement-3c3c-account-create-update-2whdq\" (UID: \"632c5d32-5370-401a-8202-58e0ec70f357\") " pod="openstack/placement-3c3c-account-create-update-2whdq" Mar 11 12:17:20 crc kubenswrapper[4816]: I0311 12:17:20.022922 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-l5lds"] Mar 11 12:17:20 crc kubenswrapper[4816]: I0311 12:17:20.049952 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk5p5\" (UniqueName: \"kubernetes.io/projected/632c5d32-5370-401a-8202-58e0ec70f357-kube-api-access-gk5p5\") pod \"placement-3c3c-account-create-update-2whdq\" (UID: \"632c5d32-5370-401a-8202-58e0ec70f357\") " pod="openstack/placement-3c3c-account-create-update-2whdq" Mar 11 12:17:20 crc kubenswrapper[4816]: I0311 12:17:20.089764 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-85nd9" Mar 11 12:17:20 crc kubenswrapper[4816]: I0311 12:17:20.119394 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-3c3c-account-create-update-2whdq" Mar 11 12:17:20 crc kubenswrapper[4816]: I0311 12:17:20.162897 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-l5lds" event={"ID":"9fd32333-bdaa-461b-ac10-324291d1e5d3","Type":"ContainerStarted","Data":"3ceb41c8ea2fba7175551e2e2e287690c5e419936c11d2017fa5a383da9d61fd"} Mar 11 12:17:20 crc kubenswrapper[4816]: I0311 12:17:20.170805 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c8d3-account-create-update-85zqd" event={"ID":"288dd774-6e04-45d2-b786-c7f2be7fbeae","Type":"ContainerStarted","Data":"5140353c5c6034db1623dc2f3c189d72ec962703a0a91d22d2e279ead073afac"} Mar 11 12:17:20 crc kubenswrapper[4816]: I0311 12:17:20.170894 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c8d3-account-create-update-85zqd" event={"ID":"288dd774-6e04-45d2-b786-c7f2be7fbeae","Type":"ContainerStarted","Data":"d08403da7f71b924e48ef9d0d5d10621dca1ae09e43b0f9bc4c8d1ae6cf47de1"} Mar 11 12:17:20 crc kubenswrapper[4816]: I0311 12:17:20.196069 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-c8d3-account-create-update-85zqd" podStartSLOduration=2.196010802 podStartE2EDuration="2.196010802s" podCreationTimestamp="2026-03-11 12:17:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:17:20.190592654 +0000 UTC m=+1126.781856631" watchObservedRunningTime="2026-03-11 12:17:20.196010802 +0000 UTC m=+1126.787274769" Mar 11 12:17:20 crc kubenswrapper[4816]: I0311 12:17:20.366320 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-rmcqp"] Mar 11 12:17:20 crc kubenswrapper[4816]: I0311 12:17:20.584802 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-9b21-account-create-update-r8vgg"] Mar 11 12:17:20 crc 
kubenswrapper[4816]: W0311 12:17:20.607344 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod625b367b_084e_4cf8_8c30_5d4df9c696f9.slice/crio-b1d1b3a696ed28345a8a58187f340852c23ee5c0f7a319f65f9096cd2efaa080 WatchSource:0}: Error finding container b1d1b3a696ed28345a8a58187f340852c23ee5c0f7a319f65f9096cd2efaa080: Status 404 returned error can't find the container with id b1d1b3a696ed28345a8a58187f340852c23ee5c0f7a319f65f9096cd2efaa080 Mar 11 12:17:20 crc kubenswrapper[4816]: I0311 12:17:20.686294 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-85nd9"] Mar 11 12:17:20 crc kubenswrapper[4816]: W0311 12:17:20.742633 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d3e7fa1_3f66_495b_be44_cf97eec043c1.slice/crio-325c8645905fb2092da099ab41a3a90f525a5fd495df91580bbe9c8dfd427b77 WatchSource:0}: Error finding container 325c8645905fb2092da099ab41a3a90f525a5fd495df91580bbe9c8dfd427b77: Status 404 returned error can't find the container with id 325c8645905fb2092da099ab41a3a90f525a5fd495df91580bbe9c8dfd427b77 Mar 11 12:17:20 crc kubenswrapper[4816]: I0311 12:17:20.842441 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-3c3c-account-create-update-2whdq"] Mar 11 12:17:20 crc kubenswrapper[4816]: W0311 12:17:20.846279 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod632c5d32_5370_401a_8202_58e0ec70f357.slice/crio-b82ed4a76542786e4b16c41e7a02f7bd83269f45c60c92a6fba6442f53946ac7 WatchSource:0}: Error finding container b82ed4a76542786e4b16c41e7a02f7bd83269f45c60c92a6fba6442f53946ac7: Status 404 returned error can't find the container with id b82ed4a76542786e4b16c41e7a02f7bd83269f45c60c92a6fba6442f53946ac7 Mar 11 12:17:21 crc kubenswrapper[4816]: I0311 
12:17:21.101507 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" Mar 11 12:17:21 crc kubenswrapper[4816]: I0311 12:17:21.186534 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-rmcqp" event={"ID":"742cfc03-0365-4df8-a7f6-e6eac11ba045","Type":"ContainerStarted","Data":"d6e7d3be2f695e55ef6abf84c83d060683eae93e0020c12fe8744829cbcc1d6a"} Mar 11 12:17:21 crc kubenswrapper[4816]: I0311 12:17:21.187085 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-rmcqp" event={"ID":"742cfc03-0365-4df8-a7f6-e6eac11ba045","Type":"ContainerStarted","Data":"72f8ee81a2c1316c277ebe03c1377b981929e7b088dea5cad65f5821f4e7a02b"} Mar 11 12:17:21 crc kubenswrapper[4816]: I0311 12:17:21.189482 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-ngbb2"] Mar 11 12:17:21 crc kubenswrapper[4816]: I0311 12:17:21.189730 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54b5dffb47-ngbb2" podUID="1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5" containerName="dnsmasq-dns" containerID="cri-o://f1aa8d74efb7ddd77f41c0a07405cadc0d7b5b49051c87e006b24963f08b09cd" gracePeriod=10 Mar 11 12:17:21 crc kubenswrapper[4816]: I0311 12:17:21.197702 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3c3c-account-create-update-2whdq" event={"ID":"632c5d32-5370-401a-8202-58e0ec70f357","Type":"ContainerStarted","Data":"8c240088bce92d648a44cfc826778c591f6601fbc70cdbc9325a1348704e1a92"} Mar 11 12:17:21 crc kubenswrapper[4816]: I0311 12:17:21.197781 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3c3c-account-create-update-2whdq" event={"ID":"632c5d32-5370-401a-8202-58e0ec70f357","Type":"ContainerStarted","Data":"b82ed4a76542786e4b16c41e7a02f7bd83269f45c60c92a6fba6442f53946ac7"} Mar 11 12:17:21 crc kubenswrapper[4816]: I0311 
12:17:21.216799 4816 generic.go:334] "Generic (PLEG): container finished" podID="9fd32333-bdaa-461b-ac10-324291d1e5d3" containerID="a8f8ba02ac608528a8da635158a48ff55377bd4734bbd746e513b637d5d907d3" exitCode=0 Mar 11 12:17:21 crc kubenswrapper[4816]: I0311 12:17:21.216918 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-l5lds" event={"ID":"9fd32333-bdaa-461b-ac10-324291d1e5d3","Type":"ContainerDied","Data":"a8f8ba02ac608528a8da635158a48ff55377bd4734bbd746e513b637d5d907d3"} Mar 11 12:17:21 crc kubenswrapper[4816]: I0311 12:17:21.229621 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-85nd9" event={"ID":"8d3e7fa1-3f66-495b-be44-cf97eec043c1","Type":"ContainerStarted","Data":"ab6525891e160f8b83901124157238a30564c85220f9440c25fb3222634839c7"} Mar 11 12:17:21 crc kubenswrapper[4816]: I0311 12:17:21.229719 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-85nd9" event={"ID":"8d3e7fa1-3f66-495b-be44-cf97eec043c1","Type":"ContainerStarted","Data":"325c8645905fb2092da099ab41a3a90f525a5fd495df91580bbe9c8dfd427b77"} Mar 11 12:17:21 crc kubenswrapper[4816]: I0311 12:17:21.241665 4816 generic.go:334] "Generic (PLEG): container finished" podID="288dd774-6e04-45d2-b786-c7f2be7fbeae" containerID="5140353c5c6034db1623dc2f3c189d72ec962703a0a91d22d2e279ead073afac" exitCode=0 Mar 11 12:17:21 crc kubenswrapper[4816]: I0311 12:17:21.241746 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c8d3-account-create-update-85zqd" event={"ID":"288dd774-6e04-45d2-b786-c7f2be7fbeae","Type":"ContainerDied","Data":"5140353c5c6034db1623dc2f3c189d72ec962703a0a91d22d2e279ead073afac"} Mar 11 12:17:21 crc kubenswrapper[4816]: I0311 12:17:21.246925 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-rmcqp" podStartSLOduration=2.246893749 podStartE2EDuration="2.246893749s" podCreationTimestamp="2026-03-11 
12:17:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:17:21.232723508 +0000 UTC m=+1127.823987475" watchObservedRunningTime="2026-03-11 12:17:21.246893749 +0000 UTC m=+1127.838157716" Mar 11 12:17:21 crc kubenswrapper[4816]: I0311 12:17:21.263104 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9b21-account-create-update-r8vgg" event={"ID":"625b367b-084e-4cf8-8c30-5d4df9c696f9","Type":"ContainerStarted","Data":"d63f60636f6e53982a24004e405c34ed67500a9193f04b98e8d29856c8e89ee2"} Mar 11 12:17:21 crc kubenswrapper[4816]: I0311 12:17:21.263165 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9b21-account-create-update-r8vgg" event={"ID":"625b367b-084e-4cf8-8c30-5d4df9c696f9","Type":"ContainerStarted","Data":"b1d1b3a696ed28345a8a58187f340852c23ee5c0f7a319f65f9096cd2efaa080"} Mar 11 12:17:21 crc kubenswrapper[4816]: I0311 12:17:21.271396 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-85nd9" podStartSLOduration=2.271383421 podStartE2EDuration="2.271383421s" podCreationTimestamp="2026-03-11 12:17:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:17:21.268677272 +0000 UTC m=+1127.859941239" watchObservedRunningTime="2026-03-11 12:17:21.271383421 +0000 UTC m=+1127.862647388" Mar 11 12:17:21 crc kubenswrapper[4816]: I0311 12:17:21.342959 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-3c3c-account-create-update-2whdq" podStartSLOduration=2.34293672 podStartE2EDuration="2.34293672s" podCreationTimestamp="2026-03-11 12:17:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:17:21.334476084 +0000 UTC 
m=+1127.925740041" watchObservedRunningTime="2026-03-11 12:17:21.34293672 +0000 UTC m=+1127.934200687" Mar 11 12:17:21 crc kubenswrapper[4816]: I0311 12:17:21.374236 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-9b21-account-create-update-r8vgg" podStartSLOduration=2.374210588 podStartE2EDuration="2.374210588s" podCreationTimestamp="2026-03-11 12:17:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:17:21.369636575 +0000 UTC m=+1127.960900552" watchObservedRunningTime="2026-03-11 12:17:21.374210588 +0000 UTC m=+1127.965474555" Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.197200 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54b5dffb47-ngbb2" Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.256815 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hw7f2\" (UniqueName: \"kubernetes.io/projected/1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5-kube-api-access-hw7f2\") pod \"1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5\" (UID: \"1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5\") " Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.257007 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5-dns-svc\") pod \"1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5\" (UID: \"1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5\") " Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.257035 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5-config\") pod \"1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5\" (UID: \"1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5\") " Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.267367 4816 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5-kube-api-access-hw7f2" (OuterVolumeSpecName: "kube-api-access-hw7f2") pod "1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5" (UID: "1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5"). InnerVolumeSpecName "kube-api-access-hw7f2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.274441 4816 generic.go:334] "Generic (PLEG): container finished" podID="742cfc03-0365-4df8-a7f6-e6eac11ba045" containerID="d6e7d3be2f695e55ef6abf84c83d060683eae93e0020c12fe8744829cbcc1d6a" exitCode=0 Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.274853 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-rmcqp" event={"ID":"742cfc03-0365-4df8-a7f6-e6eac11ba045","Type":"ContainerDied","Data":"d6e7d3be2f695e55ef6abf84c83d060683eae93e0020c12fe8744829cbcc1d6a"} Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.279379 4816 generic.go:334] "Generic (PLEG): container finished" podID="632c5d32-5370-401a-8202-58e0ec70f357" containerID="8c240088bce92d648a44cfc826778c591f6601fbc70cdbc9325a1348704e1a92" exitCode=0 Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.279429 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3c3c-account-create-update-2whdq" event={"ID":"632c5d32-5370-401a-8202-58e0ec70f357","Type":"ContainerDied","Data":"8c240088bce92d648a44cfc826778c591f6601fbc70cdbc9325a1348704e1a92"} Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.281160 4816 generic.go:334] "Generic (PLEG): container finished" podID="1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5" containerID="f1aa8d74efb7ddd77f41c0a07405cadc0d7b5b49051c87e006b24963f08b09cd" exitCode=0 Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.281290 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54b5dffb47-ngbb2" Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.281519 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b5dffb47-ngbb2" event={"ID":"1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5","Type":"ContainerDied","Data":"f1aa8d74efb7ddd77f41c0a07405cadc0d7b5b49051c87e006b24963f08b09cd"} Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.281681 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b5dffb47-ngbb2" event={"ID":"1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5","Type":"ContainerDied","Data":"c77801a330291e43dbd716f0eaa0018246a0046f3623f314fd24c49712949d82"} Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.281813 4816 scope.go:117] "RemoveContainer" containerID="f1aa8d74efb7ddd77f41c0a07405cadc0d7b5b49051c87e006b24963f08b09cd" Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.292118 4816 generic.go:334] "Generic (PLEG): container finished" podID="8d3e7fa1-3f66-495b-be44-cf97eec043c1" containerID="ab6525891e160f8b83901124157238a30564c85220f9440c25fb3222634839c7" exitCode=0 Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.292210 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-85nd9" event={"ID":"8d3e7fa1-3f66-495b-be44-cf97eec043c1","Type":"ContainerDied","Data":"ab6525891e160f8b83901124157238a30564c85220f9440c25fb3222634839c7"} Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.311423 4816 generic.go:334] "Generic (PLEG): container finished" podID="625b367b-084e-4cf8-8c30-5d4df9c696f9" containerID="d63f60636f6e53982a24004e405c34ed67500a9193f04b98e8d29856c8e89ee2" exitCode=0 Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.311760 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9b21-account-create-update-r8vgg" 
event={"ID":"625b367b-084e-4cf8-8c30-5d4df9c696f9","Type":"ContainerDied","Data":"d63f60636f6e53982a24004e405c34ed67500a9193f04b98e8d29856c8e89ee2"} Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.328060 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5-config" (OuterVolumeSpecName: "config") pod "1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5" (UID: "1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.328076 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5" (UID: "1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.359558 4816 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.359595 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.359608 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hw7f2\" (UniqueName: \"kubernetes.io/projected/1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5-kube-api-access-hw7f2\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.383002 4816 scope.go:117] "RemoveContainer" containerID="c533534709dab6e1ab18ea2fc02a9a4aef8083ea39ecbf0e3914dc859849d48e" Mar 11 12:17:22 crc kubenswrapper[4816]: 
I0311 12:17:22.404863 4816 scope.go:117] "RemoveContainer" containerID="f1aa8d74efb7ddd77f41c0a07405cadc0d7b5b49051c87e006b24963f08b09cd" Mar 11 12:17:22 crc kubenswrapper[4816]: E0311 12:17:22.405367 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1aa8d74efb7ddd77f41c0a07405cadc0d7b5b49051c87e006b24963f08b09cd\": container with ID starting with f1aa8d74efb7ddd77f41c0a07405cadc0d7b5b49051c87e006b24963f08b09cd not found: ID does not exist" containerID="f1aa8d74efb7ddd77f41c0a07405cadc0d7b5b49051c87e006b24963f08b09cd" Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.405401 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1aa8d74efb7ddd77f41c0a07405cadc0d7b5b49051c87e006b24963f08b09cd"} err="failed to get container status \"f1aa8d74efb7ddd77f41c0a07405cadc0d7b5b49051c87e006b24963f08b09cd\": rpc error: code = NotFound desc = could not find container \"f1aa8d74efb7ddd77f41c0a07405cadc0d7b5b49051c87e006b24963f08b09cd\": container with ID starting with f1aa8d74efb7ddd77f41c0a07405cadc0d7b5b49051c87e006b24963f08b09cd not found: ID does not exist" Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.405464 4816 scope.go:117] "RemoveContainer" containerID="c533534709dab6e1ab18ea2fc02a9a4aef8083ea39ecbf0e3914dc859849d48e" Mar 11 12:17:22 crc kubenswrapper[4816]: E0311 12:17:22.406489 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c533534709dab6e1ab18ea2fc02a9a4aef8083ea39ecbf0e3914dc859849d48e\": container with ID starting with c533534709dab6e1ab18ea2fc02a9a4aef8083ea39ecbf0e3914dc859849d48e not found: ID does not exist" containerID="c533534709dab6e1ab18ea2fc02a9a4aef8083ea39ecbf0e3914dc859849d48e" Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.406532 4816 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c533534709dab6e1ab18ea2fc02a9a4aef8083ea39ecbf0e3914dc859849d48e"} err="failed to get container status \"c533534709dab6e1ab18ea2fc02a9a4aef8083ea39ecbf0e3914dc859849d48e\": rpc error: code = NotFound desc = could not find container \"c533534709dab6e1ab18ea2fc02a9a4aef8083ea39ecbf0e3914dc859849d48e\": container with ID starting with c533534709dab6e1ab18ea2fc02a9a4aef8083ea39ecbf0e3914dc859849d48e not found: ID does not exist" Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.664033 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-l5lds" Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.679300 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-ngbb2"] Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.694352 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-ngbb2"] Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.768215 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcqxt\" (UniqueName: \"kubernetes.io/projected/9fd32333-bdaa-461b-ac10-324291d1e5d3-kube-api-access-mcqxt\") pod \"9fd32333-bdaa-461b-ac10-324291d1e5d3\" (UID: \"9fd32333-bdaa-461b-ac10-324291d1e5d3\") " Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.768623 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9fd32333-bdaa-461b-ac10-324291d1e5d3-operator-scripts\") pod \"9fd32333-bdaa-461b-ac10-324291d1e5d3\" (UID: \"9fd32333-bdaa-461b-ac10-324291d1e5d3\") " Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.772077 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fd32333-bdaa-461b-ac10-324291d1e5d3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"9fd32333-bdaa-461b-ac10-324291d1e5d3" (UID: "9fd32333-bdaa-461b-ac10-324291d1e5d3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.778794 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fd32333-bdaa-461b-ac10-324291d1e5d3-kube-api-access-mcqxt" (OuterVolumeSpecName: "kube-api-access-mcqxt") pod "9fd32333-bdaa-461b-ac10-324291d1e5d3" (UID: "9fd32333-bdaa-461b-ac10-324291d1e5d3"). InnerVolumeSpecName "kube-api-access-mcqxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.820113 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-c8d3-account-create-update-85zqd" Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.872748 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/288dd774-6e04-45d2-b786-c7f2be7fbeae-operator-scripts\") pod \"288dd774-6e04-45d2-b786-c7f2be7fbeae\" (UID: \"288dd774-6e04-45d2-b786-c7f2be7fbeae\") " Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.872935 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2f9qn\" (UniqueName: \"kubernetes.io/projected/288dd774-6e04-45d2-b786-c7f2be7fbeae-kube-api-access-2f9qn\") pod \"288dd774-6e04-45d2-b786-c7f2be7fbeae\" (UID: \"288dd774-6e04-45d2-b786-c7f2be7fbeae\") " Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.873438 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcqxt\" (UniqueName: \"kubernetes.io/projected/9fd32333-bdaa-461b-ac10-324291d1e5d3-kube-api-access-mcqxt\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.873458 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/9fd32333-bdaa-461b-ac10-324291d1e5d3-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.874178 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/288dd774-6e04-45d2-b786-c7f2be7fbeae-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "288dd774-6e04-45d2-b786-c7f2be7fbeae" (UID: "288dd774-6e04-45d2-b786-c7f2be7fbeae"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.878381 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/288dd774-6e04-45d2-b786-c7f2be7fbeae-kube-api-access-2f9qn" (OuterVolumeSpecName: "kube-api-access-2f9qn") pod "288dd774-6e04-45d2-b786-c7f2be7fbeae" (UID: "288dd774-6e04-45d2-b786-c7f2be7fbeae"). InnerVolumeSpecName "kube-api-access-2f9qn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.976129 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2f9qn\" (UniqueName: \"kubernetes.io/projected/288dd774-6e04-45d2-b786-c7f2be7fbeae-kube-api-access-2f9qn\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.976188 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/288dd774-6e04-45d2-b786-c7f2be7fbeae-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:23 crc kubenswrapper[4816]: I0311 12:17:23.324865 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-l5lds" event={"ID":"9fd32333-bdaa-461b-ac10-324291d1e5d3","Type":"ContainerDied","Data":"3ceb41c8ea2fba7175551e2e2e287690c5e419936c11d2017fa5a383da9d61fd"} Mar 11 12:17:23 crc kubenswrapper[4816]: I0311 12:17:23.324936 4816 pod_container_deletor.go:80] 
"Container not found in pod's containers" containerID="3ceb41c8ea2fba7175551e2e2e287690c5e419936c11d2017fa5a383da9d61fd" Mar 11 12:17:23 crc kubenswrapper[4816]: I0311 12:17:23.325030 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-l5lds" Mar 11 12:17:23 crc kubenswrapper[4816]: I0311 12:17:23.329828 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-c8d3-account-create-update-85zqd" Mar 11 12:17:23 crc kubenswrapper[4816]: I0311 12:17:23.329879 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c8d3-account-create-update-85zqd" event={"ID":"288dd774-6e04-45d2-b786-c7f2be7fbeae","Type":"ContainerDied","Data":"d08403da7f71b924e48ef9d0d5d10621dca1ae09e43b0f9bc4c8d1ae6cf47de1"} Mar 11 12:17:23 crc kubenswrapper[4816]: I0311 12:17:23.329963 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d08403da7f71b924e48ef9d0d5d10621dca1ae09e43b0f9bc4c8d1ae6cf47de1" Mar 11 12:17:23 crc kubenswrapper[4816]: I0311 12:17:23.671503 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 11 12:17:23 crc kubenswrapper[4816]: I0311 12:17:23.822002 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-rmcqp" Mar 11 12:17:23 crc kubenswrapper[4816]: I0311 12:17:23.884729 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-9b21-account-create-update-r8vgg" Mar 11 12:17:23 crc kubenswrapper[4816]: I0311 12:17:23.906711 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bccjt\" (UniqueName: \"kubernetes.io/projected/742cfc03-0365-4df8-a7f6-e6eac11ba045-kube-api-access-bccjt\") pod \"742cfc03-0365-4df8-a7f6-e6eac11ba045\" (UID: \"742cfc03-0365-4df8-a7f6-e6eac11ba045\") " Mar 11 12:17:23 crc kubenswrapper[4816]: I0311 12:17:23.907047 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/742cfc03-0365-4df8-a7f6-e6eac11ba045-operator-scripts\") pod \"742cfc03-0365-4df8-a7f6-e6eac11ba045\" (UID: \"742cfc03-0365-4df8-a7f6-e6eac11ba045\") " Mar 11 12:17:23 crc kubenswrapper[4816]: I0311 12:17:23.907692 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/742cfc03-0365-4df8-a7f6-e6eac11ba045-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "742cfc03-0365-4df8-a7f6-e6eac11ba045" (UID: "742cfc03-0365-4df8-a7f6-e6eac11ba045"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:17:23 crc kubenswrapper[4816]: I0311 12:17:23.908735 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/742cfc03-0365-4df8-a7f6-e6eac11ba045-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:23 crc kubenswrapper[4816]: I0311 12:17:23.913498 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/742cfc03-0365-4df8-a7f6-e6eac11ba045-kube-api-access-bccjt" (OuterVolumeSpecName: "kube-api-access-bccjt") pod "742cfc03-0365-4df8-a7f6-e6eac11ba045" (UID: "742cfc03-0365-4df8-a7f6-e6eac11ba045"). InnerVolumeSpecName "kube-api-access-bccjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:17:23 crc kubenswrapper[4816]: I0311 12:17:23.928688 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-85nd9" Mar 11 12:17:23 crc kubenswrapper[4816]: I0311 12:17:23.932696 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3c3c-account-create-update-2whdq" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.010002 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gk5p5\" (UniqueName: \"kubernetes.io/projected/632c5d32-5370-401a-8202-58e0ec70f357-kube-api-access-gk5p5\") pod \"632c5d32-5370-401a-8202-58e0ec70f357\" (UID: \"632c5d32-5370-401a-8202-58e0ec70f357\") " Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.010176 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2mbc\" (UniqueName: \"kubernetes.io/projected/625b367b-084e-4cf8-8c30-5d4df9c696f9-kube-api-access-c2mbc\") pod \"625b367b-084e-4cf8-8c30-5d4df9c696f9\" (UID: \"625b367b-084e-4cf8-8c30-5d4df9c696f9\") " Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.010292 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/625b367b-084e-4cf8-8c30-5d4df9c696f9-operator-scripts\") pod \"625b367b-084e-4cf8-8c30-5d4df9c696f9\" (UID: \"625b367b-084e-4cf8-8c30-5d4df9c696f9\") " Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.010343 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4p6b\" (UniqueName: \"kubernetes.io/projected/8d3e7fa1-3f66-495b-be44-cf97eec043c1-kube-api-access-x4p6b\") pod \"8d3e7fa1-3f66-495b-be44-cf97eec043c1\" (UID: \"8d3e7fa1-3f66-495b-be44-cf97eec043c1\") " Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.010378 4816 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d3e7fa1-3f66-495b-be44-cf97eec043c1-operator-scripts\") pod \"8d3e7fa1-3f66-495b-be44-cf97eec043c1\" (UID: \"8d3e7fa1-3f66-495b-be44-cf97eec043c1\") " Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.010412 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/632c5d32-5370-401a-8202-58e0ec70f357-operator-scripts\") pod \"632c5d32-5370-401a-8202-58e0ec70f357\" (UID: \"632c5d32-5370-401a-8202-58e0ec70f357\") " Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.010774 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bccjt\" (UniqueName: \"kubernetes.io/projected/742cfc03-0365-4df8-a7f6-e6eac11ba045-kube-api-access-bccjt\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.011309 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/625b367b-084e-4cf8-8c30-5d4df9c696f9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "625b367b-084e-4cf8-8c30-5d4df9c696f9" (UID: "625b367b-084e-4cf8-8c30-5d4df9c696f9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.011515 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/632c5d32-5370-401a-8202-58e0ec70f357-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "632c5d32-5370-401a-8202-58e0ec70f357" (UID: "632c5d32-5370-401a-8202-58e0ec70f357"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.011728 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d3e7fa1-3f66-495b-be44-cf97eec043c1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8d3e7fa1-3f66-495b-be44-cf97eec043c1" (UID: "8d3e7fa1-3f66-495b-be44-cf97eec043c1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.014598 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/625b367b-084e-4cf8-8c30-5d4df9c696f9-kube-api-access-c2mbc" (OuterVolumeSpecName: "kube-api-access-c2mbc") pod "625b367b-084e-4cf8-8c30-5d4df9c696f9" (UID: "625b367b-084e-4cf8-8c30-5d4df9c696f9"). InnerVolumeSpecName "kube-api-access-c2mbc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.014671 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/632c5d32-5370-401a-8202-58e0ec70f357-kube-api-access-gk5p5" (OuterVolumeSpecName: "kube-api-access-gk5p5") pod "632c5d32-5370-401a-8202-58e0ec70f357" (UID: "632c5d32-5370-401a-8202-58e0ec70f357"). InnerVolumeSpecName "kube-api-access-gk5p5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.015121 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d3e7fa1-3f66-495b-be44-cf97eec043c1-kube-api-access-x4p6b" (OuterVolumeSpecName: "kube-api-access-x4p6b") pod "8d3e7fa1-3f66-495b-be44-cf97eec043c1" (UID: "8d3e7fa1-3f66-495b-be44-cf97eec043c1"). InnerVolumeSpecName "kube-api-access-x4p6b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.112865 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/625b367b-084e-4cf8-8c30-5d4df9c696f9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.112928 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4p6b\" (UniqueName: \"kubernetes.io/projected/8d3e7fa1-3f66-495b-be44-cf97eec043c1-kube-api-access-x4p6b\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.112946 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d3e7fa1-3f66-495b-be44-cf97eec043c1-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.112959 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/632c5d32-5370-401a-8202-58e0ec70f357-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.112973 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gk5p5\" (UniqueName: \"kubernetes.io/projected/632c5d32-5370-401a-8202-58e0ec70f357-kube-api-access-gk5p5\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.112986 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2mbc\" (UniqueName: \"kubernetes.io/projected/625b367b-084e-4cf8-8c30-5d4df9c696f9-kube-api-access-c2mbc\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.180664 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5" path="/var/lib/kubelet/pods/1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5/volumes" Mar 11 12:17:24 crc 
kubenswrapper[4816]: I0311 12:17:24.272921 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-n98v5"] Mar 11 12:17:24 crc kubenswrapper[4816]: E0311 12:17:24.273532 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fd32333-bdaa-461b-ac10-324291d1e5d3" containerName="mariadb-database-create" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.273562 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fd32333-bdaa-461b-ac10-324291d1e5d3" containerName="mariadb-database-create" Mar 11 12:17:24 crc kubenswrapper[4816]: E0311 12:17:24.273591 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="625b367b-084e-4cf8-8c30-5d4df9c696f9" containerName="mariadb-account-create-update" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.273602 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="625b367b-084e-4cf8-8c30-5d4df9c696f9" containerName="mariadb-account-create-update" Mar 11 12:17:24 crc kubenswrapper[4816]: E0311 12:17:24.273617 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5" containerName="init" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.273629 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5" containerName="init" Mar 11 12:17:24 crc kubenswrapper[4816]: E0311 12:17:24.273655 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d3e7fa1-3f66-495b-be44-cf97eec043c1" containerName="mariadb-database-create" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.273663 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d3e7fa1-3f66-495b-be44-cf97eec043c1" containerName="mariadb-database-create" Mar 11 12:17:24 crc kubenswrapper[4816]: E0311 12:17:24.273675 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5" containerName="dnsmasq-dns" Mar 11 12:17:24 crc 
kubenswrapper[4816]: I0311 12:17:24.273685 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5" containerName="dnsmasq-dns" Mar 11 12:17:24 crc kubenswrapper[4816]: E0311 12:17:24.273702 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="632c5d32-5370-401a-8202-58e0ec70f357" containerName="mariadb-account-create-update" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.273710 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="632c5d32-5370-401a-8202-58e0ec70f357" containerName="mariadb-account-create-update" Mar 11 12:17:24 crc kubenswrapper[4816]: E0311 12:17:24.273721 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="742cfc03-0365-4df8-a7f6-e6eac11ba045" containerName="mariadb-database-create" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.273731 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="742cfc03-0365-4df8-a7f6-e6eac11ba045" containerName="mariadb-database-create" Mar 11 12:17:24 crc kubenswrapper[4816]: E0311 12:17:24.273750 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="288dd774-6e04-45d2-b786-c7f2be7fbeae" containerName="mariadb-account-create-update" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.273759 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="288dd774-6e04-45d2-b786-c7f2be7fbeae" containerName="mariadb-account-create-update" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.273986 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d3e7fa1-3f66-495b-be44-cf97eec043c1" containerName="mariadb-database-create" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.274014 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="625b367b-084e-4cf8-8c30-5d4df9c696f9" containerName="mariadb-account-create-update" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.274028 4816 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="632c5d32-5370-401a-8202-58e0ec70f357" containerName="mariadb-account-create-update" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.274039 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="288dd774-6e04-45d2-b786-c7f2be7fbeae" containerName="mariadb-account-create-update" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.274051 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fd32333-bdaa-461b-ac10-324291d1e5d3" containerName="mariadb-database-create" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.274063 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5" containerName="dnsmasq-dns" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.274074 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="742cfc03-0365-4df8-a7f6-e6eac11ba045" containerName="mariadb-database-create" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.275379 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-n98v5" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.278768 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-22dm7" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.279893 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.295957 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-n98v5"] Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.338832 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-rmcqp" event={"ID":"742cfc03-0365-4df8-a7f6-e6eac11ba045","Type":"ContainerDied","Data":"72f8ee81a2c1316c277ebe03c1377b981929e7b088dea5cad65f5821f4e7a02b"} Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.338891 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72f8ee81a2c1316c277ebe03c1377b981929e7b088dea5cad65f5821f4e7a02b" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.338889 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-rmcqp" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.340746 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3c3c-account-create-update-2whdq" event={"ID":"632c5d32-5370-401a-8202-58e0ec70f357","Type":"ContainerDied","Data":"b82ed4a76542786e4b16c41e7a02f7bd83269f45c60c92a6fba6442f53946ac7"} Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.340791 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-3c3c-account-create-update-2whdq" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.340808 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b82ed4a76542786e4b16c41e7a02f7bd83269f45c60c92a6fba6442f53946ac7" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.342364 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-85nd9" event={"ID":"8d3e7fa1-3f66-495b-be44-cf97eec043c1","Type":"ContainerDied","Data":"325c8645905fb2092da099ab41a3a90f525a5fd495df91580bbe9c8dfd427b77"} Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.342418 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="325c8645905fb2092da099ab41a3a90f525a5fd495df91580bbe9c8dfd427b77" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.342530 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-85nd9" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.348371 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9b21-account-create-update-r8vgg" event={"ID":"625b367b-084e-4cf8-8c30-5d4df9c696f9","Type":"ContainerDied","Data":"b1d1b3a696ed28345a8a58187f340852c23ee5c0f7a319f65f9096cd2efaa080"} Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.348424 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1d1b3a696ed28345a8a58187f340852c23ee5c0f7a319f65f9096cd2efaa080" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.348475 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-9b21-account-create-update-r8vgg" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.419955 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pghjx\" (UniqueName: \"kubernetes.io/projected/b6745bae-b403-4a86-9148-8baecc00f8b1-kube-api-access-pghjx\") pod \"glance-db-sync-n98v5\" (UID: \"b6745bae-b403-4a86-9148-8baecc00f8b1\") " pod="openstack/glance-db-sync-n98v5" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.420103 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b6745bae-b403-4a86-9148-8baecc00f8b1-db-sync-config-data\") pod \"glance-db-sync-n98v5\" (UID: \"b6745bae-b403-4a86-9148-8baecc00f8b1\") " pod="openstack/glance-db-sync-n98v5" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.420138 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6745bae-b403-4a86-9148-8baecc00f8b1-config-data\") pod \"glance-db-sync-n98v5\" (UID: \"b6745bae-b403-4a86-9148-8baecc00f8b1\") " pod="openstack/glance-db-sync-n98v5" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.420166 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6745bae-b403-4a86-9148-8baecc00f8b1-combined-ca-bundle\") pod \"glance-db-sync-n98v5\" (UID: \"b6745bae-b403-4a86-9148-8baecc00f8b1\") " pod="openstack/glance-db-sync-n98v5" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.521527 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pghjx\" (UniqueName: \"kubernetes.io/projected/b6745bae-b403-4a86-9148-8baecc00f8b1-kube-api-access-pghjx\") pod \"glance-db-sync-n98v5\" (UID: 
\"b6745bae-b403-4a86-9148-8baecc00f8b1\") " pod="openstack/glance-db-sync-n98v5" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.521678 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b6745bae-b403-4a86-9148-8baecc00f8b1-db-sync-config-data\") pod \"glance-db-sync-n98v5\" (UID: \"b6745bae-b403-4a86-9148-8baecc00f8b1\") " pod="openstack/glance-db-sync-n98v5" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.521714 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6745bae-b403-4a86-9148-8baecc00f8b1-config-data\") pod \"glance-db-sync-n98v5\" (UID: \"b6745bae-b403-4a86-9148-8baecc00f8b1\") " pod="openstack/glance-db-sync-n98v5" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.521747 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6745bae-b403-4a86-9148-8baecc00f8b1-combined-ca-bundle\") pod \"glance-db-sync-n98v5\" (UID: \"b6745bae-b403-4a86-9148-8baecc00f8b1\") " pod="openstack/glance-db-sync-n98v5" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.527212 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b6745bae-b403-4a86-9148-8baecc00f8b1-db-sync-config-data\") pod \"glance-db-sync-n98v5\" (UID: \"b6745bae-b403-4a86-9148-8baecc00f8b1\") " pod="openstack/glance-db-sync-n98v5" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.527557 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6745bae-b403-4a86-9148-8baecc00f8b1-combined-ca-bundle\") pod \"glance-db-sync-n98v5\" (UID: \"b6745bae-b403-4a86-9148-8baecc00f8b1\") " pod="openstack/glance-db-sync-n98v5" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 
12:17:24.528232 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6745bae-b403-4a86-9148-8baecc00f8b1-config-data\") pod \"glance-db-sync-n98v5\" (UID: \"b6745bae-b403-4a86-9148-8baecc00f8b1\") " pod="openstack/glance-db-sync-n98v5" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.539478 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pghjx\" (UniqueName: \"kubernetes.io/projected/b6745bae-b403-4a86-9148-8baecc00f8b1-kube-api-access-pghjx\") pod \"glance-db-sync-n98v5\" (UID: \"b6745bae-b403-4a86-9148-8baecc00f8b1\") " pod="openstack/glance-db-sync-n98v5" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.596881 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-n98v5" Mar 11 12:17:25 crc kubenswrapper[4816]: I0311 12:17:25.024290 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-n98v5"] Mar 11 12:17:25 crc kubenswrapper[4816]: I0311 12:17:25.358579 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-n98v5" event={"ID":"b6745bae-b403-4a86-9148-8baecc00f8b1","Type":"ContainerStarted","Data":"7ade065f6f708de586323f677e56810ada0b99da337e5a079b57da2cc0b0b5a0"} Mar 11 12:17:25 crc kubenswrapper[4816]: I0311 12:17:25.571570 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-9mq9q"] Mar 11 12:17:25 crc kubenswrapper[4816]: I0311 12:17:25.572789 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-9mq9q" Mar 11 12:17:25 crc kubenswrapper[4816]: I0311 12:17:25.575435 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 11 12:17:25 crc kubenswrapper[4816]: I0311 12:17:25.585307 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-9mq9q"] Mar 11 12:17:25 crc kubenswrapper[4816]: I0311 12:17:25.642860 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2da861f5-2cc3-402f-aca5-afbce135baaa-operator-scripts\") pod \"root-account-create-update-9mq9q\" (UID: \"2da861f5-2cc3-402f-aca5-afbce135baaa\") " pod="openstack/root-account-create-update-9mq9q" Mar 11 12:17:25 crc kubenswrapper[4816]: I0311 12:17:25.643076 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnkj8\" (UniqueName: \"kubernetes.io/projected/2da861f5-2cc3-402f-aca5-afbce135baaa-kube-api-access-qnkj8\") pod \"root-account-create-update-9mq9q\" (UID: \"2da861f5-2cc3-402f-aca5-afbce135baaa\") " pod="openstack/root-account-create-update-9mq9q" Mar 11 12:17:25 crc kubenswrapper[4816]: I0311 12:17:25.744422 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnkj8\" (UniqueName: \"kubernetes.io/projected/2da861f5-2cc3-402f-aca5-afbce135baaa-kube-api-access-qnkj8\") pod \"root-account-create-update-9mq9q\" (UID: \"2da861f5-2cc3-402f-aca5-afbce135baaa\") " pod="openstack/root-account-create-update-9mq9q" Mar 11 12:17:25 crc kubenswrapper[4816]: I0311 12:17:25.744587 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2da861f5-2cc3-402f-aca5-afbce135baaa-operator-scripts\") pod \"root-account-create-update-9mq9q\" (UID: 
\"2da861f5-2cc3-402f-aca5-afbce135baaa\") " pod="openstack/root-account-create-update-9mq9q" Mar 11 12:17:25 crc kubenswrapper[4816]: I0311 12:17:25.745734 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2da861f5-2cc3-402f-aca5-afbce135baaa-operator-scripts\") pod \"root-account-create-update-9mq9q\" (UID: \"2da861f5-2cc3-402f-aca5-afbce135baaa\") " pod="openstack/root-account-create-update-9mq9q" Mar 11 12:17:25 crc kubenswrapper[4816]: I0311 12:17:25.774225 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnkj8\" (UniqueName: \"kubernetes.io/projected/2da861f5-2cc3-402f-aca5-afbce135baaa-kube-api-access-qnkj8\") pod \"root-account-create-update-9mq9q\" (UID: \"2da861f5-2cc3-402f-aca5-afbce135baaa\") " pod="openstack/root-account-create-update-9mq9q" Mar 11 12:17:25 crc kubenswrapper[4816]: I0311 12:17:25.898476 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-9mq9q" Mar 11 12:17:26 crc kubenswrapper[4816]: I0311 12:17:26.385017 4816 generic.go:334] "Generic (PLEG): container finished" podID="7fca72cd-9caa-4029-8c20-1623a315702d" containerID="c460fb14090c9d550203cf386e04b06e3563514702df072e42ad5fc80f7e1872" exitCode=0 Mar 11 12:17:26 crc kubenswrapper[4816]: I0311 12:17:26.385085 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-9nggr" event={"ID":"7fca72cd-9caa-4029-8c20-1623a315702d","Type":"ContainerDied","Data":"c460fb14090c9d550203cf386e04b06e3563514702df072e42ad5fc80f7e1872"} Mar 11 12:17:26 crc kubenswrapper[4816]: I0311 12:17:26.436461 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-9mq9q"] Mar 11 12:17:26 crc kubenswrapper[4816]: W0311 12:17:26.442078 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2da861f5_2cc3_402f_aca5_afbce135baaa.slice/crio-4becd7c4791fdd33ec56c96adc3b97cb604d2b1821c1e588fbfa3b9a0ae5597b WatchSource:0}: Error finding container 4becd7c4791fdd33ec56c96adc3b97cb604d2b1821c1e588fbfa3b9a0ae5597b: Status 404 returned error can't find the container with id 4becd7c4791fdd33ec56c96adc3b97cb604d2b1821c1e588fbfa3b9a0ae5597b Mar 11 12:17:27 crc kubenswrapper[4816]: I0311 12:17:27.397071 4816 generic.go:334] "Generic (PLEG): container finished" podID="2da861f5-2cc3-402f-aca5-afbce135baaa" containerID="0de73c3da519dc3d23fdd410a58406f0ff5aec8f4b5e6483b5c4a546f3b60ef0" exitCode=0 Mar 11 12:17:27 crc kubenswrapper[4816]: I0311 12:17:27.397626 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9mq9q" event={"ID":"2da861f5-2cc3-402f-aca5-afbce135baaa","Type":"ContainerDied","Data":"0de73c3da519dc3d23fdd410a58406f0ff5aec8f4b5e6483b5c4a546f3b60ef0"} Mar 11 12:17:27 crc kubenswrapper[4816]: I0311 12:17:27.397667 4816 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9mq9q" event={"ID":"2da861f5-2cc3-402f-aca5-afbce135baaa","Type":"ContainerStarted","Data":"4becd7c4791fdd33ec56c96adc3b97cb604d2b1821c1e588fbfa3b9a0ae5597b"} Mar 11 12:17:27 crc kubenswrapper[4816]: I0311 12:17:27.703424 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-etc-swift\") pod \"swift-storage-0\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " pod="openstack/swift-storage-0" Mar 11 12:17:27 crc kubenswrapper[4816]: I0311 12:17:27.712271 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-etc-swift\") pod \"swift-storage-0\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " pod="openstack/swift-storage-0" Mar 11 12:17:27 crc kubenswrapper[4816]: I0311 12:17:27.766112 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-9nggr" Mar 11 12:17:27 crc kubenswrapper[4816]: I0311 12:17:27.820958 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 11 12:17:27 crc kubenswrapper[4816]: I0311 12:17:27.906591 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7fca72cd-9caa-4029-8c20-1623a315702d-etc-swift\") pod \"7fca72cd-9caa-4029-8c20-1623a315702d\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " Mar 11 12:17:27 crc kubenswrapper[4816]: I0311 12:17:27.906670 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7fca72cd-9caa-4029-8c20-1623a315702d-scripts\") pod \"7fca72cd-9caa-4029-8c20-1623a315702d\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " Mar 11 12:17:27 crc kubenswrapper[4816]: I0311 12:17:27.907004 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fca72cd-9caa-4029-8c20-1623a315702d-combined-ca-bundle\") pod \"7fca72cd-9caa-4029-8c20-1623a315702d\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " Mar 11 12:17:27 crc kubenswrapper[4816]: I0311 12:17:27.907081 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7fca72cd-9caa-4029-8c20-1623a315702d-swiftconf\") pod \"7fca72cd-9caa-4029-8c20-1623a315702d\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " Mar 11 12:17:27 crc kubenswrapper[4816]: I0311 12:17:27.907129 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7fca72cd-9caa-4029-8c20-1623a315702d-dispersionconf\") pod \"7fca72cd-9caa-4029-8c20-1623a315702d\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " Mar 11 12:17:27 crc kubenswrapper[4816]: I0311 12:17:27.908365 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvz42\" (UniqueName: 
\"kubernetes.io/projected/7fca72cd-9caa-4029-8c20-1623a315702d-kube-api-access-kvz42\") pod \"7fca72cd-9caa-4029-8c20-1623a315702d\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " Mar 11 12:17:27 crc kubenswrapper[4816]: I0311 12:17:27.908412 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7fca72cd-9caa-4029-8c20-1623a315702d-ring-data-devices\") pod \"7fca72cd-9caa-4029-8c20-1623a315702d\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " Mar 11 12:17:27 crc kubenswrapper[4816]: I0311 12:17:27.908914 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fca72cd-9caa-4029-8c20-1623a315702d-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "7fca72cd-9caa-4029-8c20-1623a315702d" (UID: "7fca72cd-9caa-4029-8c20-1623a315702d"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:17:27 crc kubenswrapper[4816]: I0311 12:17:27.910490 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fca72cd-9caa-4029-8c20-1623a315702d-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "7fca72cd-9caa-4029-8c20-1623a315702d" (UID: "7fca72cd-9caa-4029-8c20-1623a315702d"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:17:27 crc kubenswrapper[4816]: I0311 12:17:27.919045 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fca72cd-9caa-4029-8c20-1623a315702d-kube-api-access-kvz42" (OuterVolumeSpecName: "kube-api-access-kvz42") pod "7fca72cd-9caa-4029-8c20-1623a315702d" (UID: "7fca72cd-9caa-4029-8c20-1623a315702d"). InnerVolumeSpecName "kube-api-access-kvz42". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:17:27 crc kubenswrapper[4816]: I0311 12:17:27.922087 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fca72cd-9caa-4029-8c20-1623a315702d-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "7fca72cd-9caa-4029-8c20-1623a315702d" (UID: "7fca72cd-9caa-4029-8c20-1623a315702d"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:17:27 crc kubenswrapper[4816]: I0311 12:17:27.942795 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fca72cd-9caa-4029-8c20-1623a315702d-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "7fca72cd-9caa-4029-8c20-1623a315702d" (UID: "7fca72cd-9caa-4029-8c20-1623a315702d"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:17:27 crc kubenswrapper[4816]: I0311 12:17:27.944004 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fca72cd-9caa-4029-8c20-1623a315702d-scripts" (OuterVolumeSpecName: "scripts") pod "7fca72cd-9caa-4029-8c20-1623a315702d" (UID: "7fca72cd-9caa-4029-8c20-1623a315702d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:17:27 crc kubenswrapper[4816]: I0311 12:17:27.945544 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fca72cd-9caa-4029-8c20-1623a315702d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7fca72cd-9caa-4029-8c20-1623a315702d" (UID: "7fca72cd-9caa-4029-8c20-1623a315702d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:17:28 crc kubenswrapper[4816]: I0311 12:17:28.010592 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvz42\" (UniqueName: \"kubernetes.io/projected/7fca72cd-9caa-4029-8c20-1623a315702d-kube-api-access-kvz42\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:28 crc kubenswrapper[4816]: I0311 12:17:28.010632 4816 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7fca72cd-9caa-4029-8c20-1623a315702d-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:28 crc kubenswrapper[4816]: I0311 12:17:28.010642 4816 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7fca72cd-9caa-4029-8c20-1623a315702d-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:28 crc kubenswrapper[4816]: I0311 12:17:28.010651 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7fca72cd-9caa-4029-8c20-1623a315702d-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:28 crc kubenswrapper[4816]: I0311 12:17:28.010661 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fca72cd-9caa-4029-8c20-1623a315702d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:28 crc kubenswrapper[4816]: I0311 12:17:28.010669 4816 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7fca72cd-9caa-4029-8c20-1623a315702d-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:28 crc kubenswrapper[4816]: I0311 12:17:28.010677 4816 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7fca72cd-9caa-4029-8c20-1623a315702d-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:28 crc kubenswrapper[4816]: I0311 12:17:28.368238 4816 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 11 12:17:28 crc kubenswrapper[4816]: W0311 12:17:28.379461 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod485f9fbd_e0ca_472d_b97c_87c127253a96.slice/crio-bc1ccba63ef105a914d68b8eed3c206cfda92b47e6236ce5828d528e3ceb9770 WatchSource:0}: Error finding container bc1ccba63ef105a914d68b8eed3c206cfda92b47e6236ce5828d528e3ceb9770: Status 404 returned error can't find the container with id bc1ccba63ef105a914d68b8eed3c206cfda92b47e6236ce5828d528e3ceb9770 Mar 11 12:17:28 crc kubenswrapper[4816]: I0311 12:17:28.411521 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-9nggr" event={"ID":"7fca72cd-9caa-4029-8c20-1623a315702d","Type":"ContainerDied","Data":"33b17ac615d74325a9263091c6d521ccba5681421913cd3808e6a592677fe4c5"} Mar 11 12:17:28 crc kubenswrapper[4816]: I0311 12:17:28.411563 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33b17ac615d74325a9263091c6d521ccba5681421913cd3808e6a592677fe4c5" Mar 11 12:17:28 crc kubenswrapper[4816]: I0311 12:17:28.411574 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-9nggr" Mar 11 12:17:28 crc kubenswrapper[4816]: I0311 12:17:28.413894 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerStarted","Data":"bc1ccba63ef105a914d68b8eed3c206cfda92b47e6236ce5828d528e3ceb9770"} Mar 11 12:17:28 crc kubenswrapper[4816]: I0311 12:17:28.747577 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-9mq9q" Mar 11 12:17:28 crc kubenswrapper[4816]: I0311 12:17:28.830462 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2da861f5-2cc3-402f-aca5-afbce135baaa-operator-scripts\") pod \"2da861f5-2cc3-402f-aca5-afbce135baaa\" (UID: \"2da861f5-2cc3-402f-aca5-afbce135baaa\") " Mar 11 12:17:28 crc kubenswrapper[4816]: I0311 12:17:28.830771 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnkj8\" (UniqueName: \"kubernetes.io/projected/2da861f5-2cc3-402f-aca5-afbce135baaa-kube-api-access-qnkj8\") pod \"2da861f5-2cc3-402f-aca5-afbce135baaa\" (UID: \"2da861f5-2cc3-402f-aca5-afbce135baaa\") " Mar 11 12:17:28 crc kubenswrapper[4816]: I0311 12:17:28.831302 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2da861f5-2cc3-402f-aca5-afbce135baaa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2da861f5-2cc3-402f-aca5-afbce135baaa" (UID: "2da861f5-2cc3-402f-aca5-afbce135baaa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:17:28 crc kubenswrapper[4816]: I0311 12:17:28.836565 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2da861f5-2cc3-402f-aca5-afbce135baaa-kube-api-access-qnkj8" (OuterVolumeSpecName: "kube-api-access-qnkj8") pod "2da861f5-2cc3-402f-aca5-afbce135baaa" (UID: "2da861f5-2cc3-402f-aca5-afbce135baaa"). InnerVolumeSpecName "kube-api-access-qnkj8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:17:28 crc kubenswrapper[4816]: I0311 12:17:28.935341 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnkj8\" (UniqueName: \"kubernetes.io/projected/2da861f5-2cc3-402f-aca5-afbce135baaa-kube-api-access-qnkj8\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:28 crc kubenswrapper[4816]: I0311 12:17:28.935384 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2da861f5-2cc3-402f-aca5-afbce135baaa-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:29 crc kubenswrapper[4816]: I0311 12:17:29.426774 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9mq9q" event={"ID":"2da861f5-2cc3-402f-aca5-afbce135baaa","Type":"ContainerDied","Data":"4becd7c4791fdd33ec56c96adc3b97cb604d2b1821c1e588fbfa3b9a0ae5597b"} Mar 11 12:17:29 crc kubenswrapper[4816]: I0311 12:17:29.426828 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4becd7c4791fdd33ec56c96adc3b97cb604d2b1821c1e588fbfa3b9a0ae5597b" Mar 11 12:17:29 crc kubenswrapper[4816]: I0311 12:17:29.426858 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-9mq9q" Mar 11 12:17:30 crc kubenswrapper[4816]: I0311 12:17:30.441855 4816 generic.go:334] "Generic (PLEG): container finished" podID="3779c0f5-9084-4c07-83d9-fe2017559f7b" containerID="522cea9d64bd20f40ebb73c1f30df7c2a7a511a9ee7536ce5452bc061096e21e" exitCode=0 Mar 11 12:17:30 crc kubenswrapper[4816]: I0311 12:17:30.441931 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3779c0f5-9084-4c07-83d9-fe2017559f7b","Type":"ContainerDied","Data":"522cea9d64bd20f40ebb73c1f30df7c2a7a511a9ee7536ce5452bc061096e21e"} Mar 11 12:17:32 crc kubenswrapper[4816]: I0311 12:17:32.039599 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-9mq9q"] Mar 11 12:17:32 crc kubenswrapper[4816]: I0311 12:17:32.048110 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-9mq9q"] Mar 11 12:17:32 crc kubenswrapper[4816]: I0311 12:17:32.140029 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2da861f5-2cc3-402f-aca5-afbce135baaa" path="/var/lib/kubelet/pods/2da861f5-2cc3-402f-aca5-afbce135baaa/volumes" Mar 11 12:17:32 crc kubenswrapper[4816]: I0311 12:17:32.458077 4816 generic.go:334] "Generic (PLEG): container finished" podID="26aea2df-f497-478d-b953-060189ef2569" containerID="47287b2bd213321105c729d451b069f02c0e309af3b5c9c84b7b9c24acc1a5f3" exitCode=0 Mar 11 12:17:32 crc kubenswrapper[4816]: I0311 12:17:32.458132 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"26aea2df-f497-478d-b953-060189ef2569","Type":"ContainerDied","Data":"47287b2bd213321105c729d451b069f02c0e309af3b5c9c84b7b9c24acc1a5f3"} Mar 11 12:17:33 crc kubenswrapper[4816]: I0311 12:17:33.748343 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 11 12:17:34 crc kubenswrapper[4816]: I0311 
12:17:34.562644 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-84rn8" podUID="2de58390-335b-40cc-8461-d931d3b22e41" containerName="ovn-controller" probeResult="failure" output=< Mar 11 12:17:34 crc kubenswrapper[4816]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 11 12:17:34 crc kubenswrapper[4816]: > Mar 11 12:17:37 crc kubenswrapper[4816]: I0311 12:17:37.058414 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-7l6hp"] Mar 11 12:17:37 crc kubenswrapper[4816]: E0311 12:17:37.059031 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2da861f5-2cc3-402f-aca5-afbce135baaa" containerName="mariadb-account-create-update" Mar 11 12:17:37 crc kubenswrapper[4816]: I0311 12:17:37.059047 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2da861f5-2cc3-402f-aca5-afbce135baaa" containerName="mariadb-account-create-update" Mar 11 12:17:37 crc kubenswrapper[4816]: E0311 12:17:37.059077 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fca72cd-9caa-4029-8c20-1623a315702d" containerName="swift-ring-rebalance" Mar 11 12:17:37 crc kubenswrapper[4816]: I0311 12:17:37.059086 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fca72cd-9caa-4029-8c20-1623a315702d" containerName="swift-ring-rebalance" Mar 11 12:17:37 crc kubenswrapper[4816]: I0311 12:17:37.059305 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fca72cd-9caa-4029-8c20-1623a315702d" containerName="swift-ring-rebalance" Mar 11 12:17:37 crc kubenswrapper[4816]: I0311 12:17:37.059323 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="2da861f5-2cc3-402f-aca5-afbce135baaa" containerName="mariadb-account-create-update" Mar 11 12:17:37 crc kubenswrapper[4816]: I0311 12:17:37.060045 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-7l6hp" Mar 11 12:17:37 crc kubenswrapper[4816]: I0311 12:17:37.065138 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 11 12:17:37 crc kubenswrapper[4816]: I0311 12:17:37.070931 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-7l6hp"] Mar 11 12:17:37 crc kubenswrapper[4816]: I0311 12:17:37.120906 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee11077d-39aa-44c4-9cf3-a8a80647bc50-operator-scripts\") pod \"root-account-create-update-7l6hp\" (UID: \"ee11077d-39aa-44c4-9cf3-a8a80647bc50\") " pod="openstack/root-account-create-update-7l6hp" Mar 11 12:17:37 crc kubenswrapper[4816]: I0311 12:17:37.121046 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw62s\" (UniqueName: \"kubernetes.io/projected/ee11077d-39aa-44c4-9cf3-a8a80647bc50-kube-api-access-mw62s\") pod \"root-account-create-update-7l6hp\" (UID: \"ee11077d-39aa-44c4-9cf3-a8a80647bc50\") " pod="openstack/root-account-create-update-7l6hp" Mar 11 12:17:37 crc kubenswrapper[4816]: I0311 12:17:37.224873 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee11077d-39aa-44c4-9cf3-a8a80647bc50-operator-scripts\") pod \"root-account-create-update-7l6hp\" (UID: \"ee11077d-39aa-44c4-9cf3-a8a80647bc50\") " pod="openstack/root-account-create-update-7l6hp" Mar 11 12:17:37 crc kubenswrapper[4816]: I0311 12:17:37.224926 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mw62s\" (UniqueName: \"kubernetes.io/projected/ee11077d-39aa-44c4-9cf3-a8a80647bc50-kube-api-access-mw62s\") pod \"root-account-create-update-7l6hp\" (UID: 
\"ee11077d-39aa-44c4-9cf3-a8a80647bc50\") " pod="openstack/root-account-create-update-7l6hp" Mar 11 12:17:37 crc kubenswrapper[4816]: I0311 12:17:37.225944 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee11077d-39aa-44c4-9cf3-a8a80647bc50-operator-scripts\") pod \"root-account-create-update-7l6hp\" (UID: \"ee11077d-39aa-44c4-9cf3-a8a80647bc50\") " pod="openstack/root-account-create-update-7l6hp" Mar 11 12:17:37 crc kubenswrapper[4816]: I0311 12:17:37.259985 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw62s\" (UniqueName: \"kubernetes.io/projected/ee11077d-39aa-44c4-9cf3-a8a80647bc50-kube-api-access-mw62s\") pod \"root-account-create-update-7l6hp\" (UID: \"ee11077d-39aa-44c4-9cf3-a8a80647bc50\") " pod="openstack/root-account-create-update-7l6hp" Mar 11 12:17:37 crc kubenswrapper[4816]: I0311 12:17:37.385227 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-7l6hp" Mar 11 12:17:39 crc kubenswrapper[4816]: I0311 12:17:39.368560 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-tnhfq" Mar 11 12:17:39 crc kubenswrapper[4816]: I0311 12:17:39.370517 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-tnhfq" Mar 11 12:17:39 crc kubenswrapper[4816]: I0311 12:17:39.556158 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-84rn8" podUID="2de58390-335b-40cc-8461-d931d3b22e41" containerName="ovn-controller" probeResult="failure" output=< Mar 11 12:17:39 crc kubenswrapper[4816]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 11 12:17:39 crc kubenswrapper[4816]: > Mar 11 12:17:39 crc kubenswrapper[4816]: I0311 12:17:39.620103 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-84rn8-config-gh5kb"] Mar 11 12:17:39 crc kubenswrapper[4816]: I0311 12:17:39.621303 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-84rn8-config-gh5kb" Mar 11 12:17:39 crc kubenswrapper[4816]: I0311 12:17:39.623055 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 11 12:17:39 crc kubenswrapper[4816]: I0311 12:17:39.648983 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-84rn8-config-gh5kb"] Mar 11 12:17:39 crc kubenswrapper[4816]: I0311 12:17:39.670123 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/eb02a85f-cc39-4119-a962-4b4fd66c015d-var-run\") pod \"ovn-controller-84rn8-config-gh5kb\" (UID: \"eb02a85f-cc39-4119-a962-4b4fd66c015d\") " pod="openstack/ovn-controller-84rn8-config-gh5kb" Mar 11 12:17:39 crc kubenswrapper[4816]: I0311 12:17:39.670212 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/eb02a85f-cc39-4119-a962-4b4fd66c015d-var-log-ovn\") pod \"ovn-controller-84rn8-config-gh5kb\" (UID: \"eb02a85f-cc39-4119-a962-4b4fd66c015d\") " pod="openstack/ovn-controller-84rn8-config-gh5kb" Mar 11 12:17:39 crc kubenswrapper[4816]: I0311 12:17:39.670270 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eb02a85f-cc39-4119-a962-4b4fd66c015d-scripts\") pod \"ovn-controller-84rn8-config-gh5kb\" (UID: \"eb02a85f-cc39-4119-a962-4b4fd66c015d\") " pod="openstack/ovn-controller-84rn8-config-gh5kb" Mar 11 12:17:39 crc kubenswrapper[4816]: I0311 12:17:39.670343 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/eb02a85f-cc39-4119-a962-4b4fd66c015d-var-run-ovn\") pod \"ovn-controller-84rn8-config-gh5kb\" (UID: \"eb02a85f-cc39-4119-a962-4b4fd66c015d\") 
" pod="openstack/ovn-controller-84rn8-config-gh5kb" Mar 11 12:17:39 crc kubenswrapper[4816]: I0311 12:17:39.670369 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppkst\" (UniqueName: \"kubernetes.io/projected/eb02a85f-cc39-4119-a962-4b4fd66c015d-kube-api-access-ppkst\") pod \"ovn-controller-84rn8-config-gh5kb\" (UID: \"eb02a85f-cc39-4119-a962-4b4fd66c015d\") " pod="openstack/ovn-controller-84rn8-config-gh5kb" Mar 11 12:17:39 crc kubenswrapper[4816]: I0311 12:17:39.670460 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/eb02a85f-cc39-4119-a962-4b4fd66c015d-additional-scripts\") pod \"ovn-controller-84rn8-config-gh5kb\" (UID: \"eb02a85f-cc39-4119-a962-4b4fd66c015d\") " pod="openstack/ovn-controller-84rn8-config-gh5kb" Mar 11 12:17:39 crc kubenswrapper[4816]: I0311 12:17:39.772029 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/eb02a85f-cc39-4119-a962-4b4fd66c015d-var-run\") pod \"ovn-controller-84rn8-config-gh5kb\" (UID: \"eb02a85f-cc39-4119-a962-4b4fd66c015d\") " pod="openstack/ovn-controller-84rn8-config-gh5kb" Mar 11 12:17:39 crc kubenswrapper[4816]: I0311 12:17:39.772100 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/eb02a85f-cc39-4119-a962-4b4fd66c015d-var-log-ovn\") pod \"ovn-controller-84rn8-config-gh5kb\" (UID: \"eb02a85f-cc39-4119-a962-4b4fd66c015d\") " pod="openstack/ovn-controller-84rn8-config-gh5kb" Mar 11 12:17:39 crc kubenswrapper[4816]: I0311 12:17:39.772130 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eb02a85f-cc39-4119-a962-4b4fd66c015d-scripts\") pod \"ovn-controller-84rn8-config-gh5kb\" (UID: 
\"eb02a85f-cc39-4119-a962-4b4fd66c015d\") " pod="openstack/ovn-controller-84rn8-config-gh5kb" Mar 11 12:17:39 crc kubenswrapper[4816]: I0311 12:17:39.772225 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/eb02a85f-cc39-4119-a962-4b4fd66c015d-var-run-ovn\") pod \"ovn-controller-84rn8-config-gh5kb\" (UID: \"eb02a85f-cc39-4119-a962-4b4fd66c015d\") " pod="openstack/ovn-controller-84rn8-config-gh5kb" Mar 11 12:17:39 crc kubenswrapper[4816]: I0311 12:17:39.772279 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppkst\" (UniqueName: \"kubernetes.io/projected/eb02a85f-cc39-4119-a962-4b4fd66c015d-kube-api-access-ppkst\") pod \"ovn-controller-84rn8-config-gh5kb\" (UID: \"eb02a85f-cc39-4119-a962-4b4fd66c015d\") " pod="openstack/ovn-controller-84rn8-config-gh5kb" Mar 11 12:17:39 crc kubenswrapper[4816]: I0311 12:17:39.772666 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/eb02a85f-cc39-4119-a962-4b4fd66c015d-var-run\") pod \"ovn-controller-84rn8-config-gh5kb\" (UID: \"eb02a85f-cc39-4119-a962-4b4fd66c015d\") " pod="openstack/ovn-controller-84rn8-config-gh5kb" Mar 11 12:17:39 crc kubenswrapper[4816]: I0311 12:17:39.772728 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/eb02a85f-cc39-4119-a962-4b4fd66c015d-var-log-ovn\") pod \"ovn-controller-84rn8-config-gh5kb\" (UID: \"eb02a85f-cc39-4119-a962-4b4fd66c015d\") " pod="openstack/ovn-controller-84rn8-config-gh5kb" Mar 11 12:17:39 crc kubenswrapper[4816]: I0311 12:17:39.772737 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/eb02a85f-cc39-4119-a962-4b4fd66c015d-var-run-ovn\") pod \"ovn-controller-84rn8-config-gh5kb\" (UID: \"eb02a85f-cc39-4119-a962-4b4fd66c015d\") " 
pod="openstack/ovn-controller-84rn8-config-gh5kb" Mar 11 12:17:39 crc kubenswrapper[4816]: I0311 12:17:39.772941 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/eb02a85f-cc39-4119-a962-4b4fd66c015d-additional-scripts\") pod \"ovn-controller-84rn8-config-gh5kb\" (UID: \"eb02a85f-cc39-4119-a962-4b4fd66c015d\") " pod="openstack/ovn-controller-84rn8-config-gh5kb" Mar 11 12:17:39 crc kubenswrapper[4816]: I0311 12:17:39.773778 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/eb02a85f-cc39-4119-a962-4b4fd66c015d-additional-scripts\") pod \"ovn-controller-84rn8-config-gh5kb\" (UID: \"eb02a85f-cc39-4119-a962-4b4fd66c015d\") " pod="openstack/ovn-controller-84rn8-config-gh5kb" Mar 11 12:17:39 crc kubenswrapper[4816]: I0311 12:17:39.774212 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eb02a85f-cc39-4119-a962-4b4fd66c015d-scripts\") pod \"ovn-controller-84rn8-config-gh5kb\" (UID: \"eb02a85f-cc39-4119-a962-4b4fd66c015d\") " pod="openstack/ovn-controller-84rn8-config-gh5kb" Mar 11 12:17:39 crc kubenswrapper[4816]: I0311 12:17:39.794328 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppkst\" (UniqueName: \"kubernetes.io/projected/eb02a85f-cc39-4119-a962-4b4fd66c015d-kube-api-access-ppkst\") pod \"ovn-controller-84rn8-config-gh5kb\" (UID: \"eb02a85f-cc39-4119-a962-4b4fd66c015d\") " pod="openstack/ovn-controller-84rn8-config-gh5kb" Mar 11 12:17:39 crc kubenswrapper[4816]: I0311 12:17:39.968283 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-84rn8-config-gh5kb" Mar 11 12:17:43 crc kubenswrapper[4816]: E0311 12:17:43.241523 4816 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api@sha256:dae5e39780d5a15eed030c7009f8e5317139d447558ac83f038497be594be120" Mar 11 12:17:43 crc kubenswrapper[4816]: E0311 12:17:43.242118 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api@sha256:dae5e39780d5a15eed030c7009f8e5317139d447558ac83f038497be594be120,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pghjx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImageP
ullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-n98v5_openstack(b6745bae-b403-4a86-9148-8baecc00f8b1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 11 12:17:43 crc kubenswrapper[4816]: E0311 12:17:43.243778 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-n98v5" podUID="b6745bae-b403-4a86-9148-8baecc00f8b1" Mar 11 12:17:43 crc kubenswrapper[4816]: I0311 12:17:43.554791 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3779c0f5-9084-4c07-83d9-fe2017559f7b","Type":"ContainerStarted","Data":"18565c99cadc85b3c1924a92e447c85ed3ed29fe96a7b6c6961caaecc2e1cf9f"} Mar 11 12:17:43 crc kubenswrapper[4816]: I0311 12:17:43.555584 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 11 12:17:43 crc kubenswrapper[4816]: E0311 12:17:43.556791 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api@sha256:dae5e39780d5a15eed030c7009f8e5317139d447558ac83f038497be594be120\\\"\"" 
pod="openstack/glance-db-sync-n98v5" podUID="b6745bae-b403-4a86-9148-8baecc00f8b1" Mar 11 12:17:43 crc kubenswrapper[4816]: I0311 12:17:43.592419 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=50.005836596 podStartE2EDuration="1m9.592405472s" podCreationTimestamp="2026-03-11 12:16:34 +0000 UTC" firstStartedPulling="2026-03-11 12:16:36.16715988 +0000 UTC m=+1082.758423847" lastFinishedPulling="2026-03-11 12:16:55.753728756 +0000 UTC m=+1102.344992723" observedRunningTime="2026-03-11 12:17:43.590555118 +0000 UTC m=+1150.181819085" watchObservedRunningTime="2026-03-11 12:17:43.592405472 +0000 UTC m=+1150.183669429" Mar 11 12:17:43 crc kubenswrapper[4816]: I0311 12:17:43.819115 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-84rn8-config-gh5kb"] Mar 11 12:17:43 crc kubenswrapper[4816]: W0311 12:17:43.825937 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb02a85f_cc39_4119_a962_4b4fd66c015d.slice/crio-442c4ad71e60ee1fc52e7396b8d83c71aa66e6183cf09a9aee4eec94596dee35 WatchSource:0}: Error finding container 442c4ad71e60ee1fc52e7396b8d83c71aa66e6183cf09a9aee4eec94596dee35: Status 404 returned error can't find the container with id 442c4ad71e60ee1fc52e7396b8d83c71aa66e6183cf09a9aee4eec94596dee35 Mar 11 12:17:43 crc kubenswrapper[4816]: I0311 12:17:43.963777 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-7l6hp"] Mar 11 12:17:44 crc kubenswrapper[4816]: I0311 12:17:44.552781 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-84rn8" Mar 11 12:17:44 crc kubenswrapper[4816]: I0311 12:17:44.564715 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-84rn8-config-gh5kb" 
event={"ID":"eb02a85f-cc39-4119-a962-4b4fd66c015d","Type":"ContainerStarted","Data":"c9628d19e1e9c78361e9677b8afa40ad86295ad47aa0110e4b51ead3233c90ca"} Mar 11 12:17:44 crc kubenswrapper[4816]: I0311 12:17:44.564760 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-84rn8-config-gh5kb" event={"ID":"eb02a85f-cc39-4119-a962-4b4fd66c015d","Type":"ContainerStarted","Data":"442c4ad71e60ee1fc52e7396b8d83c71aa66e6183cf09a9aee4eec94596dee35"} Mar 11 12:17:44 crc kubenswrapper[4816]: I0311 12:17:44.572474 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerStarted","Data":"712d42df455f320b81d9b4c5385e08e78c8fffd9af1f0f1a30be961c52606280"} Mar 11 12:17:44 crc kubenswrapper[4816]: I0311 12:17:44.572550 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerStarted","Data":"acad9fd17d268a24643ea62be228693020bbd2da3c63a2bc6d162877b0366898"} Mar 11 12:17:44 crc kubenswrapper[4816]: I0311 12:17:44.572566 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerStarted","Data":"d0ccfde3e8badc0e6b92993021ad07fe9ae8e33939c137e6eb3bcf22e04b1ea6"} Mar 11 12:17:44 crc kubenswrapper[4816]: I0311 12:17:44.572577 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerStarted","Data":"e84af5bcfa14831e3963b52fe73c49f1f89ea652b5b69cd65dfb4008756c4c2d"} Mar 11 12:17:44 crc kubenswrapper[4816]: I0311 12:17:44.574163 4816 generic.go:334] "Generic (PLEG): container finished" podID="ee11077d-39aa-44c4-9cf3-a8a80647bc50" containerID="3dfe4dd28e66c33830345db1226180f842f3ae3d59f4fa3a4c553af39dd07c67" exitCode=0 Mar 11 12:17:44 crc kubenswrapper[4816]: I0311 12:17:44.574235 
4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-7l6hp" event={"ID":"ee11077d-39aa-44c4-9cf3-a8a80647bc50","Type":"ContainerDied","Data":"3dfe4dd28e66c33830345db1226180f842f3ae3d59f4fa3a4c553af39dd07c67"} Mar 11 12:17:44 crc kubenswrapper[4816]: I0311 12:17:44.574369 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-7l6hp" event={"ID":"ee11077d-39aa-44c4-9cf3-a8a80647bc50","Type":"ContainerStarted","Data":"8a5daaf8f9d67a350db46e8d3c4baf6ef0a1efc9cf1a945115cf84cf82ad6093"} Mar 11 12:17:44 crc kubenswrapper[4816]: I0311 12:17:44.581098 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"26aea2df-f497-478d-b953-060189ef2569","Type":"ContainerStarted","Data":"0735cf7e4268f5297289dcfc433ce805028b2098230211ba63ceb121fac25ec7"} Mar 11 12:17:44 crc kubenswrapper[4816]: I0311 12:17:44.581576 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 11 12:17:44 crc kubenswrapper[4816]: I0311 12:17:44.603542 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-84rn8-config-gh5kb" podStartSLOduration=5.603525045 podStartE2EDuration="5.603525045s" podCreationTimestamp="2026-03-11 12:17:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:17:44.597330325 +0000 UTC m=+1151.188594292" watchObservedRunningTime="2026-03-11 12:17:44.603525045 +0000 UTC m=+1151.194789012" Mar 11 12:17:44 crc kubenswrapper[4816]: I0311 12:17:44.645580 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=70.645559727 podStartE2EDuration="1m10.645559727s" podCreationTimestamp="2026-03-11 12:16:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:17:44.640578092 +0000 UTC m=+1151.231842049" watchObservedRunningTime="2026-03-11 12:17:44.645559727 +0000 UTC m=+1151.236823694" Mar 11 12:17:45 crc kubenswrapper[4816]: I0311 12:17:45.596028 4816 generic.go:334] "Generic (PLEG): container finished" podID="eb02a85f-cc39-4119-a962-4b4fd66c015d" containerID="c9628d19e1e9c78361e9677b8afa40ad86295ad47aa0110e4b51ead3233c90ca" exitCode=0 Mar 11 12:17:45 crc kubenswrapper[4816]: I0311 12:17:45.596311 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-84rn8-config-gh5kb" event={"ID":"eb02a85f-cc39-4119-a962-4b4fd66c015d","Type":"ContainerDied","Data":"c9628d19e1e9c78361e9677b8afa40ad86295ad47aa0110e4b51ead3233c90ca"} Mar 11 12:17:45 crc kubenswrapper[4816]: I0311 12:17:45.953865 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-7l6hp" Mar 11 12:17:46 crc kubenswrapper[4816]: I0311 12:17:46.130046 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee11077d-39aa-44c4-9cf3-a8a80647bc50-operator-scripts\") pod \"ee11077d-39aa-44c4-9cf3-a8a80647bc50\" (UID: \"ee11077d-39aa-44c4-9cf3-a8a80647bc50\") " Mar 11 12:17:46 crc kubenswrapper[4816]: I0311 12:17:46.130657 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee11077d-39aa-44c4-9cf3-a8a80647bc50-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ee11077d-39aa-44c4-9cf3-a8a80647bc50" (UID: "ee11077d-39aa-44c4-9cf3-a8a80647bc50"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:17:46 crc kubenswrapper[4816]: I0311 12:17:46.130861 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mw62s\" (UniqueName: \"kubernetes.io/projected/ee11077d-39aa-44c4-9cf3-a8a80647bc50-kube-api-access-mw62s\") pod \"ee11077d-39aa-44c4-9cf3-a8a80647bc50\" (UID: \"ee11077d-39aa-44c4-9cf3-a8a80647bc50\") " Mar 11 12:17:46 crc kubenswrapper[4816]: I0311 12:17:46.131899 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee11077d-39aa-44c4-9cf3-a8a80647bc50-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:46 crc kubenswrapper[4816]: I0311 12:17:46.136110 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee11077d-39aa-44c4-9cf3-a8a80647bc50-kube-api-access-mw62s" (OuterVolumeSpecName: "kube-api-access-mw62s") pod "ee11077d-39aa-44c4-9cf3-a8a80647bc50" (UID: "ee11077d-39aa-44c4-9cf3-a8a80647bc50"). InnerVolumeSpecName "kube-api-access-mw62s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:17:46 crc kubenswrapper[4816]: I0311 12:17:46.235545 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mw62s\" (UniqueName: \"kubernetes.io/projected/ee11077d-39aa-44c4-9cf3-a8a80647bc50-kube-api-access-mw62s\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:46 crc kubenswrapper[4816]: I0311 12:17:46.646556 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerStarted","Data":"bed6744a1fb9636a9fc4ea915948476f2eb984fea2bdb9d698c12e5780346190"} Mar 11 12:17:46 crc kubenswrapper[4816]: I0311 12:17:46.646871 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerStarted","Data":"a0947e58e27e62d7256e48c3a5ba6d36f58462add34b2f1281e8c3da0f4574e1"} Mar 11 12:17:46 crc kubenswrapper[4816]: I0311 12:17:46.646883 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerStarted","Data":"424b40cca785fdb6cef5ca70bab8c7fb8928ab75e5bb80b8b1faf2c2da22fdaf"} Mar 11 12:17:46 crc kubenswrapper[4816]: I0311 12:17:46.646894 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerStarted","Data":"9aa4725cabfa8b52948323edfacbce1db8fbe4349baf7e60df04631c4c07e000"} Mar 11 12:17:46 crc kubenswrapper[4816]: I0311 12:17:46.653000 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-7l6hp" event={"ID":"ee11077d-39aa-44c4-9cf3-a8a80647bc50","Type":"ContainerDied","Data":"8a5daaf8f9d67a350db46e8d3c4baf6ef0a1efc9cf1a945115cf84cf82ad6093"} Mar 11 12:17:46 crc kubenswrapper[4816]: I0311 12:17:46.653077 4816 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="8a5daaf8f9d67a350db46e8d3c4baf6ef0a1efc9cf1a945115cf84cf82ad6093" Mar 11 12:17:46 crc kubenswrapper[4816]: I0311 12:17:46.653180 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-7l6hp" Mar 11 12:17:47 crc kubenswrapper[4816]: I0311 12:17:47.018979 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-84rn8-config-gh5kb" Mar 11 12:17:47 crc kubenswrapper[4816]: I0311 12:17:47.152370 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eb02a85f-cc39-4119-a962-4b4fd66c015d-scripts\") pod \"eb02a85f-cc39-4119-a962-4b4fd66c015d\" (UID: \"eb02a85f-cc39-4119-a962-4b4fd66c015d\") " Mar 11 12:17:47 crc kubenswrapper[4816]: I0311 12:17:47.152917 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/eb02a85f-cc39-4119-a962-4b4fd66c015d-var-run\") pod \"eb02a85f-cc39-4119-a962-4b4fd66c015d\" (UID: \"eb02a85f-cc39-4119-a962-4b4fd66c015d\") " Mar 11 12:17:47 crc kubenswrapper[4816]: I0311 12:17:47.152980 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/eb02a85f-cc39-4119-a962-4b4fd66c015d-var-log-ovn\") pod \"eb02a85f-cc39-4119-a962-4b4fd66c015d\" (UID: \"eb02a85f-cc39-4119-a962-4b4fd66c015d\") " Mar 11 12:17:47 crc kubenswrapper[4816]: I0311 12:17:47.153010 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/eb02a85f-cc39-4119-a962-4b4fd66c015d-var-run-ovn\") pod \"eb02a85f-cc39-4119-a962-4b4fd66c015d\" (UID: \"eb02a85f-cc39-4119-a962-4b4fd66c015d\") " Mar 11 12:17:47 crc kubenswrapper[4816]: I0311 12:17:47.153016 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/eb02a85f-cc39-4119-a962-4b4fd66c015d-var-run" (OuterVolumeSpecName: "var-run") pod "eb02a85f-cc39-4119-a962-4b4fd66c015d" (UID: "eb02a85f-cc39-4119-a962-4b4fd66c015d"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:17:47 crc kubenswrapper[4816]: I0311 12:17:47.153067 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppkst\" (UniqueName: \"kubernetes.io/projected/eb02a85f-cc39-4119-a962-4b4fd66c015d-kube-api-access-ppkst\") pod \"eb02a85f-cc39-4119-a962-4b4fd66c015d\" (UID: \"eb02a85f-cc39-4119-a962-4b4fd66c015d\") " Mar 11 12:17:47 crc kubenswrapper[4816]: I0311 12:17:47.153079 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb02a85f-cc39-4119-a962-4b4fd66c015d-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "eb02a85f-cc39-4119-a962-4b4fd66c015d" (UID: "eb02a85f-cc39-4119-a962-4b4fd66c015d"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:17:47 crc kubenswrapper[4816]: I0311 12:17:47.153106 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/eb02a85f-cc39-4119-a962-4b4fd66c015d-additional-scripts\") pod \"eb02a85f-cc39-4119-a962-4b4fd66c015d\" (UID: \"eb02a85f-cc39-4119-a962-4b4fd66c015d\") " Mar 11 12:17:47 crc kubenswrapper[4816]: I0311 12:17:47.153092 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb02a85f-cc39-4119-a962-4b4fd66c015d-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "eb02a85f-cc39-4119-a962-4b4fd66c015d" (UID: "eb02a85f-cc39-4119-a962-4b4fd66c015d"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:17:47 crc kubenswrapper[4816]: I0311 12:17:47.153632 4816 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/eb02a85f-cc39-4119-a962-4b4fd66c015d-var-run\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:47 crc kubenswrapper[4816]: I0311 12:17:47.153657 4816 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/eb02a85f-cc39-4119-a962-4b4fd66c015d-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:47 crc kubenswrapper[4816]: I0311 12:17:47.153672 4816 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/eb02a85f-cc39-4119-a962-4b4fd66c015d-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:47 crc kubenswrapper[4816]: I0311 12:17:47.153769 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb02a85f-cc39-4119-a962-4b4fd66c015d-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "eb02a85f-cc39-4119-a962-4b4fd66c015d" (UID: "eb02a85f-cc39-4119-a962-4b4fd66c015d"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:17:47 crc kubenswrapper[4816]: I0311 12:17:47.153909 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb02a85f-cc39-4119-a962-4b4fd66c015d-scripts" (OuterVolumeSpecName: "scripts") pod "eb02a85f-cc39-4119-a962-4b4fd66c015d" (UID: "eb02a85f-cc39-4119-a962-4b4fd66c015d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:17:47 crc kubenswrapper[4816]: I0311 12:17:47.159652 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb02a85f-cc39-4119-a962-4b4fd66c015d-kube-api-access-ppkst" (OuterVolumeSpecName: "kube-api-access-ppkst") pod "eb02a85f-cc39-4119-a962-4b4fd66c015d" (UID: "eb02a85f-cc39-4119-a962-4b4fd66c015d"). InnerVolumeSpecName "kube-api-access-ppkst". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:17:47 crc kubenswrapper[4816]: I0311 12:17:47.254920 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppkst\" (UniqueName: \"kubernetes.io/projected/eb02a85f-cc39-4119-a962-4b4fd66c015d-kube-api-access-ppkst\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:47 crc kubenswrapper[4816]: I0311 12:17:47.254958 4816 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/eb02a85f-cc39-4119-a962-4b4fd66c015d-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:47 crc kubenswrapper[4816]: I0311 12:17:47.254967 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eb02a85f-cc39-4119-a962-4b4fd66c015d-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:47 crc kubenswrapper[4816]: I0311 12:17:47.662229 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-84rn8-config-gh5kb" event={"ID":"eb02a85f-cc39-4119-a962-4b4fd66c015d","Type":"ContainerDied","Data":"442c4ad71e60ee1fc52e7396b8d83c71aa66e6183cf09a9aee4eec94596dee35"} Mar 11 12:17:47 crc kubenswrapper[4816]: I0311 12:17:47.662289 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="442c4ad71e60ee1fc52e7396b8d83c71aa66e6183cf09a9aee4eec94596dee35" Mar 11 12:17:47 crc kubenswrapper[4816]: I0311 12:17:47.662333 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-84rn8-config-gh5kb" Mar 11 12:17:48 crc kubenswrapper[4816]: I0311 12:17:48.194094 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-84rn8-config-gh5kb"] Mar 11 12:17:48 crc kubenswrapper[4816]: I0311 12:17:48.204960 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-84rn8-config-gh5kb"] Mar 11 12:17:48 crc kubenswrapper[4816]: I0311 12:17:48.687871 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerStarted","Data":"25b01234ff673f68a2a7d9f83db659ac9f58778b1b6460a3b9e17bc11c9e8477"} Mar 11 12:17:48 crc kubenswrapper[4816]: I0311 12:17:48.687911 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerStarted","Data":"3b5ce1950c94241c7d8db075f74a9e25d16f22897d67828fc597eed2fd2ba2d4"} Mar 11 12:17:48 crc kubenswrapper[4816]: I0311 12:17:48.687921 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerStarted","Data":"fbc40b5edb4819684be613e55b321d899bc5b2698e897cf6eda8f15eae8281db"} Mar 11 12:17:49 crc kubenswrapper[4816]: I0311 12:17:49.704323 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerStarted","Data":"1a5cccec1988de28bd2809ac4b5b0048b290948debe0553adf7fdb6a721fdf61"} Mar 11 12:17:49 crc kubenswrapper[4816]: I0311 12:17:49.704683 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerStarted","Data":"4ca46893f461e4cae0bfdd754a912325d0a4b5274975f49336f5fe227e8b6f7e"} Mar 11 12:17:49 crc kubenswrapper[4816]: I0311 12:17:49.704705 4816 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerStarted","Data":"68827d94971fb4946739508c0c2229d08412fbe98f89ce92ce344232eb5179c2"} Mar 11 12:17:49 crc kubenswrapper[4816]: I0311 12:17:49.704720 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerStarted","Data":"cfd9d9bff16dc1b372451554525cfb877c302a88a1df111a3dec64d0abe2d5dd"} Mar 11 12:17:49 crc kubenswrapper[4816]: I0311 12:17:49.749102 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.290262902 podStartE2EDuration="39.749083585s" podCreationTimestamp="2026-03-11 12:17:10 +0000 UTC" firstStartedPulling="2026-03-11 12:17:28.382708007 +0000 UTC m=+1134.973971974" lastFinishedPulling="2026-03-11 12:17:47.84152869 +0000 UTC m=+1154.432792657" observedRunningTime="2026-03-11 12:17:49.74789168 +0000 UTC m=+1156.339155647" watchObservedRunningTime="2026-03-11 12:17:49.749083585 +0000 UTC m=+1156.340347552" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.055735 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67754df655-hhz24"] Mar 11 12:17:50 crc kubenswrapper[4816]: E0311 12:17:50.056657 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb02a85f-cc39-4119-a962-4b4fd66c015d" containerName="ovn-config" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.056738 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb02a85f-cc39-4119-a962-4b4fd66c015d" containerName="ovn-config" Mar 11 12:17:50 crc kubenswrapper[4816]: E0311 12:17:50.056816 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee11077d-39aa-44c4-9cf3-a8a80647bc50" containerName="mariadb-account-create-update" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.056874 4816 
state_mem.go:107] "Deleted CPUSet assignment" podUID="ee11077d-39aa-44c4-9cf3-a8a80647bc50" containerName="mariadb-account-create-update" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.057129 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee11077d-39aa-44c4-9cf3-a8a80647bc50" containerName="mariadb-account-create-update" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.057214 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb02a85f-cc39-4119-a962-4b4fd66c015d" containerName="ovn-config" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.058622 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67754df655-hhz24" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.067372 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.084010 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67754df655-hhz24"] Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.141986 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb02a85f-cc39-4119-a962-4b4fd66c015d" path="/var/lib/kubelet/pods/eb02a85f-cc39-4119-a962-4b4fd66c015d/volumes" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.208234 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-ovsdbserver-sb\") pod \"dnsmasq-dns-67754df655-hhz24\" (UID: \"79c46d79-aa47-428c-abec-a6f94c66e9ab\") " pod="openstack/dnsmasq-dns-67754df655-hhz24" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.208310 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-dns-swift-storage-0\") pod \"dnsmasq-dns-67754df655-hhz24\" (UID: \"79c46d79-aa47-428c-abec-a6f94c66e9ab\") " pod="openstack/dnsmasq-dns-67754df655-hhz24" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.208351 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-config\") pod \"dnsmasq-dns-67754df655-hhz24\" (UID: \"79c46d79-aa47-428c-abec-a6f94c66e9ab\") " pod="openstack/dnsmasq-dns-67754df655-hhz24" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.208374 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-dns-svc\") pod \"dnsmasq-dns-67754df655-hhz24\" (UID: \"79c46d79-aa47-428c-abec-a6f94c66e9ab\") " pod="openstack/dnsmasq-dns-67754df655-hhz24" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.208394 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9csz\" (UniqueName: \"kubernetes.io/projected/79c46d79-aa47-428c-abec-a6f94c66e9ab-kube-api-access-g9csz\") pod \"dnsmasq-dns-67754df655-hhz24\" (UID: \"79c46d79-aa47-428c-abec-a6f94c66e9ab\") " pod="openstack/dnsmasq-dns-67754df655-hhz24" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.208489 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-ovsdbserver-nb\") pod \"dnsmasq-dns-67754df655-hhz24\" (UID: \"79c46d79-aa47-428c-abec-a6f94c66e9ab\") " pod="openstack/dnsmasq-dns-67754df655-hhz24" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.310362 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-ovsdbserver-sb\") pod \"dnsmasq-dns-67754df655-hhz24\" (UID: \"79c46d79-aa47-428c-abec-a6f94c66e9ab\") " pod="openstack/dnsmasq-dns-67754df655-hhz24" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.310760 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-dns-swift-storage-0\") pod \"dnsmasq-dns-67754df655-hhz24\" (UID: \"79c46d79-aa47-428c-abec-a6f94c66e9ab\") " pod="openstack/dnsmasq-dns-67754df655-hhz24" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.310890 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-config\") pod \"dnsmasq-dns-67754df655-hhz24\" (UID: \"79c46d79-aa47-428c-abec-a6f94c66e9ab\") " pod="openstack/dnsmasq-dns-67754df655-hhz24" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.311100 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-dns-svc\") pod \"dnsmasq-dns-67754df655-hhz24\" (UID: \"79c46d79-aa47-428c-abec-a6f94c66e9ab\") " pod="openstack/dnsmasq-dns-67754df655-hhz24" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.311211 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9csz\" (UniqueName: \"kubernetes.io/projected/79c46d79-aa47-428c-abec-a6f94c66e9ab-kube-api-access-g9csz\") pod \"dnsmasq-dns-67754df655-hhz24\" (UID: \"79c46d79-aa47-428c-abec-a6f94c66e9ab\") " pod="openstack/dnsmasq-dns-67754df655-hhz24" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.311476 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-ovsdbserver-nb\") pod \"dnsmasq-dns-67754df655-hhz24\" (UID: \"79c46d79-aa47-428c-abec-a6f94c66e9ab\") " pod="openstack/dnsmasq-dns-67754df655-hhz24" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.312624 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-ovsdbserver-nb\") pod \"dnsmasq-dns-67754df655-hhz24\" (UID: \"79c46d79-aa47-428c-abec-a6f94c66e9ab\") " pod="openstack/dnsmasq-dns-67754df655-hhz24" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.313742 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-ovsdbserver-sb\") pod \"dnsmasq-dns-67754df655-hhz24\" (UID: \"79c46d79-aa47-428c-abec-a6f94c66e9ab\") " pod="openstack/dnsmasq-dns-67754df655-hhz24" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.314543 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-dns-swift-storage-0\") pod \"dnsmasq-dns-67754df655-hhz24\" (UID: \"79c46d79-aa47-428c-abec-a6f94c66e9ab\") " pod="openstack/dnsmasq-dns-67754df655-hhz24" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.315129 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-config\") pod \"dnsmasq-dns-67754df655-hhz24\" (UID: \"79c46d79-aa47-428c-abec-a6f94c66e9ab\") " pod="openstack/dnsmasq-dns-67754df655-hhz24" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.315750 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-dns-svc\") pod 
\"dnsmasq-dns-67754df655-hhz24\" (UID: \"79c46d79-aa47-428c-abec-a6f94c66e9ab\") " pod="openstack/dnsmasq-dns-67754df655-hhz24" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.336902 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9csz\" (UniqueName: \"kubernetes.io/projected/79c46d79-aa47-428c-abec-a6f94c66e9ab-kube-api-access-g9csz\") pod \"dnsmasq-dns-67754df655-hhz24\" (UID: \"79c46d79-aa47-428c-abec-a6f94c66e9ab\") " pod="openstack/dnsmasq-dns-67754df655-hhz24" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.436983 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67754df655-hhz24" Mar 11 12:17:51 crc kubenswrapper[4816]: I0311 12:17:51.432514 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67754df655-hhz24"] Mar 11 12:17:51 crc kubenswrapper[4816]: W0311 12:17:51.437161 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79c46d79_aa47_428c_abec_a6f94c66e9ab.slice/crio-81fe766129ca8c4e2e7c56bfe354248ee38599d3499418773992e5736bf81df2 WatchSource:0}: Error finding container 81fe766129ca8c4e2e7c56bfe354248ee38599d3499418773992e5736bf81df2: Status 404 returned error can't find the container with id 81fe766129ca8c4e2e7c56bfe354248ee38599d3499418773992e5736bf81df2 Mar 11 12:17:51 crc kubenswrapper[4816]: I0311 12:17:51.721712 4816 generic.go:334] "Generic (PLEG): container finished" podID="79c46d79-aa47-428c-abec-a6f94c66e9ab" containerID="7cfd948e58ca0b33af11396daaf98403ef86af1a5fd0724d0ce0200e144ab4fe" exitCode=0 Mar 11 12:17:51 crc kubenswrapper[4816]: I0311 12:17:51.721763 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67754df655-hhz24" event={"ID":"79c46d79-aa47-428c-abec-a6f94c66e9ab","Type":"ContainerDied","Data":"7cfd948e58ca0b33af11396daaf98403ef86af1a5fd0724d0ce0200e144ab4fe"} Mar 11 12:17:51 crc 
kubenswrapper[4816]: I0311 12:17:51.721889 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67754df655-hhz24" event={"ID":"79c46d79-aa47-428c-abec-a6f94c66e9ab","Type":"ContainerStarted","Data":"81fe766129ca8c4e2e7c56bfe354248ee38599d3499418773992e5736bf81df2"} Mar 11 12:17:52 crc kubenswrapper[4816]: I0311 12:17:52.730569 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67754df655-hhz24" event={"ID":"79c46d79-aa47-428c-abec-a6f94c66e9ab","Type":"ContainerStarted","Data":"e929e8e02a375fef6457bbafa642c02c68d821bc19103c5cffa50d761cc569e2"} Mar 11 12:17:52 crc kubenswrapper[4816]: I0311 12:17:52.731084 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67754df655-hhz24" Mar 11 12:17:52 crc kubenswrapper[4816]: I0311 12:17:52.754591 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67754df655-hhz24" podStartSLOduration=2.7545667160000002 podStartE2EDuration="2.754566716s" podCreationTimestamp="2026-03-11 12:17:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:17:52.751366923 +0000 UTC m=+1159.342630890" watchObservedRunningTime="2026-03-11 12:17:52.754566716 +0000 UTC m=+1159.345830703" Mar 11 12:17:55 crc kubenswrapper[4816]: I0311 12:17:55.826479 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 11 12:17:56 crc kubenswrapper[4816]: I0311 12:17:56.309597 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 11 12:17:57 crc kubenswrapper[4816]: I0311 12:17:57.964245 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-xm9d9"] Mar 11 12:17:57 crc kubenswrapper[4816]: I0311 12:17:57.972232 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-xm9d9" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.016194 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-xm9d9"] Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.049796 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwfp8\" (UniqueName: \"kubernetes.io/projected/b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac-kube-api-access-dwfp8\") pod \"cinder-db-create-xm9d9\" (UID: \"b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac\") " pod="openstack/cinder-db-create-xm9d9" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.049897 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac-operator-scripts\") pod \"cinder-db-create-xm9d9\" (UID: \"b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac\") " pod="openstack/cinder-db-create-xm9d9" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.074155 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-4bcf-account-create-update-gkcsc"] Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.077792 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-4bcf-account-create-update-gkcsc" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.145738 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.151949 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac-operator-scripts\") pod \"cinder-db-create-xm9d9\" (UID: \"b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac\") " pod="openstack/cinder-db-create-xm9d9" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.152075 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwfp8\" (UniqueName: \"kubernetes.io/projected/b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac-kube-api-access-dwfp8\") pod \"cinder-db-create-xm9d9\" (UID: \"b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac\") " pod="openstack/cinder-db-create-xm9d9" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.153003 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac-operator-scripts\") pod \"cinder-db-create-xm9d9\" (UID: \"b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac\") " pod="openstack/cinder-db-create-xm9d9" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.166585 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-4bcf-account-create-update-gkcsc"] Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.185010 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwfp8\" (UniqueName: \"kubernetes.io/projected/b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac-kube-api-access-dwfp8\") pod \"cinder-db-create-xm9d9\" (UID: \"b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac\") " pod="openstack/cinder-db-create-xm9d9" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 
12:17:58.253573 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16-operator-scripts\") pod \"cinder-4bcf-account-create-update-gkcsc\" (UID: \"f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16\") " pod="openstack/cinder-4bcf-account-create-update-gkcsc" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.253640 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6fcs\" (UniqueName: \"kubernetes.io/projected/f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16-kube-api-access-k6fcs\") pod \"cinder-4bcf-account-create-update-gkcsc\" (UID: \"f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16\") " pod="openstack/cinder-4bcf-account-create-update-gkcsc" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.281178 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-cnlpc"] Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.282372 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-cnlpc" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.295468 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-cnlpc"] Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.326748 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-xm9d9" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.334938 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-kbmsk"] Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.336048 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-kbmsk" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.341170 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-kbmsk"] Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.343091 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-8k5jj" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.343306 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.343429 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.343568 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.355963 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16-operator-scripts\") pod \"cinder-4bcf-account-create-update-gkcsc\" (UID: \"f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16\") " pod="openstack/cinder-4bcf-account-create-update-gkcsc" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.356057 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6fcs\" (UniqueName: \"kubernetes.io/projected/f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16-kube-api-access-k6fcs\") pod \"cinder-4bcf-account-create-update-gkcsc\" (UID: \"f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16\") " pod="openstack/cinder-4bcf-account-create-update-gkcsc" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.357190 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16-operator-scripts\") pod 
\"cinder-4bcf-account-create-update-gkcsc\" (UID: \"f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16\") " pod="openstack/cinder-4bcf-account-create-update-gkcsc" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.383511 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6fcs\" (UniqueName: \"kubernetes.io/projected/f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16-kube-api-access-k6fcs\") pod \"cinder-4bcf-account-create-update-gkcsc\" (UID: \"f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16\") " pod="openstack/cinder-4bcf-account-create-update-gkcsc" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.410169 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-963f-account-create-update-w2lrf"] Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.411240 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-963f-account-create-update-w2lrf" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.414780 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.429001 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-963f-account-create-update-w2lrf"] Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.460826 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e82cb42a-5dbf-43d1-a71c-18b3e6d252d6-operator-scripts\") pod \"neutron-963f-account-create-update-w2lrf\" (UID: \"e82cb42a-5dbf-43d1-a71c-18b3e6d252d6\") " pod="openstack/neutron-963f-account-create-update-w2lrf" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.460928 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56-config-data\") pod 
\"keystone-db-sync-kbmsk\" (UID: \"45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56\") " pod="openstack/keystone-db-sync-kbmsk" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.461004 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a25abdc0-8516-4747-a589-78db9bc64ca3-operator-scripts\") pod \"barbican-db-create-cnlpc\" (UID: \"a25abdc0-8516-4747-a589-78db9bc64ca3\") " pod="openstack/barbican-db-create-cnlpc" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.461043 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk4zq\" (UniqueName: \"kubernetes.io/projected/e82cb42a-5dbf-43d1-a71c-18b3e6d252d6-kube-api-access-zk4zq\") pod \"neutron-963f-account-create-update-w2lrf\" (UID: \"e82cb42a-5dbf-43d1-a71c-18b3e6d252d6\") " pod="openstack/neutron-963f-account-create-update-w2lrf" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.461077 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9c6m\" (UniqueName: \"kubernetes.io/projected/45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56-kube-api-access-x9c6m\") pod \"keystone-db-sync-kbmsk\" (UID: \"45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56\") " pod="openstack/keystone-db-sync-kbmsk" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.461223 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56-combined-ca-bundle\") pod \"keystone-db-sync-kbmsk\" (UID: \"45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56\") " pod="openstack/keystone-db-sync-kbmsk" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.461416 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbxfx\" (UniqueName: 
\"kubernetes.io/projected/a25abdc0-8516-4747-a589-78db9bc64ca3-kube-api-access-pbxfx\") pod \"barbican-db-create-cnlpc\" (UID: \"a25abdc0-8516-4747-a589-78db9bc64ca3\") " pod="openstack/barbican-db-create-cnlpc" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.476979 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-4bcf-account-create-update-gkcsc" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.501926 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-4kpfn"] Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.503322 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-4kpfn" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.548765 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-4kpfn"] Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.563718 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e82cb42a-5dbf-43d1-a71c-18b3e6d252d6-operator-scripts\") pod \"neutron-963f-account-create-update-w2lrf\" (UID: \"e82cb42a-5dbf-43d1-a71c-18b3e6d252d6\") " pod="openstack/neutron-963f-account-create-update-w2lrf" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.563808 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56-config-data\") pod \"keystone-db-sync-kbmsk\" (UID: \"45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56\") " pod="openstack/keystone-db-sync-kbmsk" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.563846 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a25abdc0-8516-4747-a589-78db9bc64ca3-operator-scripts\") pod \"barbican-db-create-cnlpc\" (UID: 
\"a25abdc0-8516-4747-a589-78db9bc64ca3\") " pod="openstack/barbican-db-create-cnlpc" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.563871 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk4zq\" (UniqueName: \"kubernetes.io/projected/e82cb42a-5dbf-43d1-a71c-18b3e6d252d6-kube-api-access-zk4zq\") pod \"neutron-963f-account-create-update-w2lrf\" (UID: \"e82cb42a-5dbf-43d1-a71c-18b3e6d252d6\") " pod="openstack/neutron-963f-account-create-update-w2lrf" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.563895 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9c6m\" (UniqueName: \"kubernetes.io/projected/45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56-kube-api-access-x9c6m\") pod \"keystone-db-sync-kbmsk\" (UID: \"45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56\") " pod="openstack/keystone-db-sync-kbmsk" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.563930 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56-combined-ca-bundle\") pod \"keystone-db-sync-kbmsk\" (UID: \"45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56\") " pod="openstack/keystone-db-sync-kbmsk" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.563964 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbxfx\" (UniqueName: \"kubernetes.io/projected/a25abdc0-8516-4747-a589-78db9bc64ca3-kube-api-access-pbxfx\") pod \"barbican-db-create-cnlpc\" (UID: \"a25abdc0-8516-4747-a589-78db9bc64ca3\") " pod="openstack/barbican-db-create-cnlpc" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.565376 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a25abdc0-8516-4747-a589-78db9bc64ca3-operator-scripts\") pod \"barbican-db-create-cnlpc\" (UID: 
\"a25abdc0-8516-4747-a589-78db9bc64ca3\") " pod="openstack/barbican-db-create-cnlpc" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.566044 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e82cb42a-5dbf-43d1-a71c-18b3e6d252d6-operator-scripts\") pod \"neutron-963f-account-create-update-w2lrf\" (UID: \"e82cb42a-5dbf-43d1-a71c-18b3e6d252d6\") " pod="openstack/neutron-963f-account-create-update-w2lrf" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.576416 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56-config-data\") pod \"keystone-db-sync-kbmsk\" (UID: \"45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56\") " pod="openstack/keystone-db-sync-kbmsk" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.586146 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56-combined-ca-bundle\") pod \"keystone-db-sync-kbmsk\" (UID: \"45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56\") " pod="openstack/keystone-db-sync-kbmsk" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.594628 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-4a8d-account-create-update-gxhxz"] Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.596035 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-4a8d-account-create-update-gxhxz" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.596750 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9c6m\" (UniqueName: \"kubernetes.io/projected/45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56-kube-api-access-x9c6m\") pod \"keystone-db-sync-kbmsk\" (UID: \"45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56\") " pod="openstack/keystone-db-sync-kbmsk" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.597270 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk4zq\" (UniqueName: \"kubernetes.io/projected/e82cb42a-5dbf-43d1-a71c-18b3e6d252d6-kube-api-access-zk4zq\") pod \"neutron-963f-account-create-update-w2lrf\" (UID: \"e82cb42a-5dbf-43d1-a71c-18b3e6d252d6\") " pod="openstack/neutron-963f-account-create-update-w2lrf" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.601212 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.601514 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbxfx\" (UniqueName: \"kubernetes.io/projected/a25abdc0-8516-4747-a589-78db9bc64ca3-kube-api-access-pbxfx\") pod \"barbican-db-create-cnlpc\" (UID: \"a25abdc0-8516-4747-a589-78db9bc64ca3\") " pod="openstack/barbican-db-create-cnlpc" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.618410 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-cnlpc" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.627636 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-4a8d-account-create-update-gxhxz"] Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.666500 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm2m6\" (UniqueName: \"kubernetes.io/projected/66951176-170f-4d49-9a92-aeeb66f4a79c-kube-api-access-zm2m6\") pod \"neutron-db-create-4kpfn\" (UID: \"66951176-170f-4d49-9a92-aeeb66f4a79c\") " pod="openstack/neutron-db-create-4kpfn" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.666545 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66951176-170f-4d49-9a92-aeeb66f4a79c-operator-scripts\") pod \"neutron-db-create-4kpfn\" (UID: \"66951176-170f-4d49-9a92-aeeb66f4a79c\") " pod="openstack/neutron-db-create-4kpfn" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.737913 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-kbmsk" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.748994 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-963f-account-create-update-w2lrf" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.770760 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2dkm\" (UniqueName: \"kubernetes.io/projected/27a1317c-41a6-4589-949b-e422d7fe8837-kube-api-access-w2dkm\") pod \"barbican-4a8d-account-create-update-gxhxz\" (UID: \"27a1317c-41a6-4589-949b-e422d7fe8837\") " pod="openstack/barbican-4a8d-account-create-update-gxhxz" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.771135 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zm2m6\" (UniqueName: \"kubernetes.io/projected/66951176-170f-4d49-9a92-aeeb66f4a79c-kube-api-access-zm2m6\") pod \"neutron-db-create-4kpfn\" (UID: \"66951176-170f-4d49-9a92-aeeb66f4a79c\") " pod="openstack/neutron-db-create-4kpfn" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.771163 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66951176-170f-4d49-9a92-aeeb66f4a79c-operator-scripts\") pod \"neutron-db-create-4kpfn\" (UID: \"66951176-170f-4d49-9a92-aeeb66f4a79c\") " pod="openstack/neutron-db-create-4kpfn" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.771201 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27a1317c-41a6-4589-949b-e422d7fe8837-operator-scripts\") pod \"barbican-4a8d-account-create-update-gxhxz\" (UID: \"27a1317c-41a6-4589-949b-e422d7fe8837\") " pod="openstack/barbican-4a8d-account-create-update-gxhxz" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.771980 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66951176-170f-4d49-9a92-aeeb66f4a79c-operator-scripts\") 
pod \"neutron-db-create-4kpfn\" (UID: \"66951176-170f-4d49-9a92-aeeb66f4a79c\") " pod="openstack/neutron-db-create-4kpfn" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.788329 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm2m6\" (UniqueName: \"kubernetes.io/projected/66951176-170f-4d49-9a92-aeeb66f4a79c-kube-api-access-zm2m6\") pod \"neutron-db-create-4kpfn\" (UID: \"66951176-170f-4d49-9a92-aeeb66f4a79c\") " pod="openstack/neutron-db-create-4kpfn" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.874407 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2dkm\" (UniqueName: \"kubernetes.io/projected/27a1317c-41a6-4589-949b-e422d7fe8837-kube-api-access-w2dkm\") pod \"barbican-4a8d-account-create-update-gxhxz\" (UID: \"27a1317c-41a6-4589-949b-e422d7fe8837\") " pod="openstack/barbican-4a8d-account-create-update-gxhxz" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.874529 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27a1317c-41a6-4589-949b-e422d7fe8837-operator-scripts\") pod \"barbican-4a8d-account-create-update-gxhxz\" (UID: \"27a1317c-41a6-4589-949b-e422d7fe8837\") " pod="openstack/barbican-4a8d-account-create-update-gxhxz" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.875224 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27a1317c-41a6-4589-949b-e422d7fe8837-operator-scripts\") pod \"barbican-4a8d-account-create-update-gxhxz\" (UID: \"27a1317c-41a6-4589-949b-e422d7fe8837\") " pod="openstack/barbican-4a8d-account-create-update-gxhxz" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.909923 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2dkm\" (UniqueName: 
\"kubernetes.io/projected/27a1317c-41a6-4589-949b-e422d7fe8837-kube-api-access-w2dkm\") pod \"barbican-4a8d-account-create-update-gxhxz\" (UID: \"27a1317c-41a6-4589-949b-e422d7fe8837\") " pod="openstack/barbican-4a8d-account-create-update-gxhxz" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.925793 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-4kpfn" Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.947166 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-4a8d-account-create-update-gxhxz" Mar 11 12:17:59 crc kubenswrapper[4816]: I0311 12:17:59.037707 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-xm9d9"] Mar 11 12:17:59 crc kubenswrapper[4816]: I0311 12:17:59.147901 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-4bcf-account-create-update-gkcsc"] Mar 11 12:17:59 crc kubenswrapper[4816]: I0311 12:17:59.259091 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-cnlpc"] Mar 11 12:17:59 crc kubenswrapper[4816]: I0311 12:17:59.329266 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-963f-account-create-update-w2lrf"] Mar 11 12:17:59 crc kubenswrapper[4816]: W0311 12:17:59.478689 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45e6e9e0_bfd4_4e8d_823b_9e2bfdfe6d56.slice/crio-0f22b75107e8cd42721b969d6ddcf352087537aee228dbbf09d44200f875243e WatchSource:0}: Error finding container 0f22b75107e8cd42721b969d6ddcf352087537aee228dbbf09d44200f875243e: Status 404 returned error can't find the container with id 0f22b75107e8cd42721b969d6ddcf352087537aee228dbbf09d44200f875243e Mar 11 12:17:59 crc kubenswrapper[4816]: I0311 12:17:59.502612 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-kbmsk"] Mar 11 
12:17:59 crc kubenswrapper[4816]: I0311 12:17:59.629214 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-4kpfn"] Mar 11 12:17:59 crc kubenswrapper[4816]: W0311 12:17:59.630044 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66951176_170f_4d49_9a92_aeeb66f4a79c.slice/crio-f1a99dc130938800f8f6978214fdd8b7e688157675e3e3050193a9527e3f2bb1 WatchSource:0}: Error finding container f1a99dc130938800f8f6978214fdd8b7e688157675e3e3050193a9527e3f2bb1: Status 404 returned error can't find the container with id f1a99dc130938800f8f6978214fdd8b7e688157675e3e3050193a9527e3f2bb1 Mar 11 12:17:59 crc kubenswrapper[4816]: I0311 12:17:59.646179 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-4a8d-account-create-update-gxhxz"] Mar 11 12:17:59 crc kubenswrapper[4816]: I0311 12:17:59.796053 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4bcf-account-create-update-gkcsc" event={"ID":"f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16","Type":"ContainerStarted","Data":"369058cbb8fa5b6fcca641d0c6bacd8fb984decb4576950458c2fae4a2d14692"} Mar 11 12:17:59 crc kubenswrapper[4816]: I0311 12:17:59.796102 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4bcf-account-create-update-gkcsc" event={"ID":"f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16","Type":"ContainerStarted","Data":"e8ca11c82565ae3f35fb6628a625997e61984edf2584bf5b6f01f77d24b2ea45"} Mar 11 12:17:59 crc kubenswrapper[4816]: I0311 12:17:59.801088 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-4a8d-account-create-update-gxhxz" event={"ID":"27a1317c-41a6-4589-949b-e422d7fe8837","Type":"ContainerStarted","Data":"04ba277b83a2833a8a1e2667f16776d5e0146b956d56c20da2b18a82224445d9"} Mar 11 12:17:59 crc kubenswrapper[4816]: I0311 12:17:59.819944 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-db-create-4kpfn" event={"ID":"66951176-170f-4d49-9a92-aeeb66f4a79c","Type":"ContainerStarted","Data":"f1a99dc130938800f8f6978214fdd8b7e688157675e3e3050193a9527e3f2bb1"} Mar 11 12:17:59 crc kubenswrapper[4816]: I0311 12:17:59.824806 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-4bcf-account-create-update-gkcsc" podStartSLOduration=1.824786458 podStartE2EDuration="1.824786458s" podCreationTimestamp="2026-03-11 12:17:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:17:59.817676592 +0000 UTC m=+1166.408940559" watchObservedRunningTime="2026-03-11 12:17:59.824786458 +0000 UTC m=+1166.416050415" Mar 11 12:17:59 crc kubenswrapper[4816]: I0311 12:17:59.826735 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-xm9d9" event={"ID":"b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac","Type":"ContainerStarted","Data":"5d5e0febf80ed4282e61f8380eff77836a90544898e5ff129bf2d82dd15449ea"} Mar 11 12:17:59 crc kubenswrapper[4816]: I0311 12:17:59.826762 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-xm9d9" event={"ID":"b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac","Type":"ContainerStarted","Data":"29a2d35d034352fc106b2d5ac30a149571b3d9b03ee18c6197d6c9b89fe24636"} Mar 11 12:17:59 crc kubenswrapper[4816]: I0311 12:17:59.831102 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-cnlpc" event={"ID":"a25abdc0-8516-4747-a589-78db9bc64ca3","Type":"ContainerStarted","Data":"19aee09219637e7a5ab326ab09421619dc187f94f0843e938baaa3e47920a542"} Mar 11 12:17:59 crc kubenswrapper[4816]: I0311 12:17:59.831130 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-cnlpc" 
event={"ID":"a25abdc0-8516-4747-a589-78db9bc64ca3","Type":"ContainerStarted","Data":"dbe8001a30913aacdd0f84f1edb48ad5e6ee0a7f33e24c58a948426697a06086"} Mar 11 12:17:59 crc kubenswrapper[4816]: I0311 12:17:59.834599 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-963f-account-create-update-w2lrf" event={"ID":"e82cb42a-5dbf-43d1-a71c-18b3e6d252d6","Type":"ContainerStarted","Data":"16af7949a711342a3610523f5b8fbb074d336f04c7a6eb010f9128a10368ad76"} Mar 11 12:17:59 crc kubenswrapper[4816]: I0311 12:17:59.834628 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-963f-account-create-update-w2lrf" event={"ID":"e82cb42a-5dbf-43d1-a71c-18b3e6d252d6","Type":"ContainerStarted","Data":"b6cfc328bc6a5ec19bbd297d5021170486b6ed13ec54ef1b30a52f9abba8f3f9"} Mar 11 12:17:59 crc kubenswrapper[4816]: I0311 12:17:59.846764 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kbmsk" event={"ID":"45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56","Type":"ContainerStarted","Data":"0f22b75107e8cd42721b969d6ddcf352087537aee228dbbf09d44200f875243e"} Mar 11 12:17:59 crc kubenswrapper[4816]: I0311 12:17:59.855367 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-n98v5" event={"ID":"b6745bae-b403-4a86-9148-8baecc00f8b1","Type":"ContainerStarted","Data":"e5f24ad51eefb627e15014c3582b64a13468c820d9ad9ccfa53acd2f0fb30054"} Mar 11 12:17:59 crc kubenswrapper[4816]: I0311 12:17:59.877242 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-963f-account-create-update-w2lrf" podStartSLOduration=1.877208901 podStartE2EDuration="1.877208901s" podCreationTimestamp="2026-03-11 12:17:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:17:59.86993715 +0000 UTC m=+1166.461201117" watchObservedRunningTime="2026-03-11 12:17:59.877208901 +0000 UTC 
m=+1166.468472868" Mar 11 12:17:59 crc kubenswrapper[4816]: I0311 12:17:59.931616 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-n98v5" podStartSLOduration=2.134032273 podStartE2EDuration="35.931590961s" podCreationTimestamp="2026-03-11 12:17:24 +0000 UTC" firstStartedPulling="2026-03-11 12:17:25.030210096 +0000 UTC m=+1131.621474063" lastFinishedPulling="2026-03-11 12:17:58.827768784 +0000 UTC m=+1165.419032751" observedRunningTime="2026-03-11 12:17:59.925739991 +0000 UTC m=+1166.517003958" watchObservedRunningTime="2026-03-11 12:17:59.931590961 +0000 UTC m=+1166.522854928" Mar 11 12:17:59 crc kubenswrapper[4816]: I0311 12:17:59.940639 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-cnlpc" podStartSLOduration=1.940619343 podStartE2EDuration="1.940619343s" podCreationTimestamp="2026-03-11 12:17:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:17:59.905854673 +0000 UTC m=+1166.497118640" watchObservedRunningTime="2026-03-11 12:17:59.940619343 +0000 UTC m=+1166.531883310" Mar 11 12:18:00 crc kubenswrapper[4816]: E0311 12:18:00.074616 4816 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda25abdc0_8516_4747_a589_78db9bc64ca3.slice/crio-conmon-19aee09219637e7a5ab326ab09421619dc187f94f0843e938baaa3e47920a542.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode82cb42a_5dbf_43d1_a71c_18b3e6d252d6.slice/crio-16af7949a711342a3610523f5b8fbb074d336f04c7a6eb010f9128a10368ad76.scope\": RecentStats: unable to find data in memory cache]" Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.186330 4816 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29553858-brk44"] Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.187338 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553858-brk44"] Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.187420 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553858-brk44" Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.192502 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.192673 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.192569 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.316585 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6ppf\" (UniqueName: \"kubernetes.io/projected/99d8cc8e-8af3-41b3-bb8c-6e4e10f00193-kube-api-access-q6ppf\") pod \"auto-csr-approver-29553858-brk44\" (UID: \"99d8cc8e-8af3-41b3-bb8c-6e4e10f00193\") " pod="openshift-infra/auto-csr-approver-29553858-brk44" Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.418615 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6ppf\" (UniqueName: \"kubernetes.io/projected/99d8cc8e-8af3-41b3-bb8c-6e4e10f00193-kube-api-access-q6ppf\") pod \"auto-csr-approver-29553858-brk44\" (UID: \"99d8cc8e-8af3-41b3-bb8c-6e4e10f00193\") " pod="openshift-infra/auto-csr-approver-29553858-brk44" Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.443357 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67754df655-hhz24" Mar 11 
12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.458007 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6ppf\" (UniqueName: \"kubernetes.io/projected/99d8cc8e-8af3-41b3-bb8c-6e4e10f00193-kube-api-access-q6ppf\") pod \"auto-csr-approver-29553858-brk44\" (UID: \"99d8cc8e-8af3-41b3-bb8c-6e4e10f00193\") " pod="openshift-infra/auto-csr-approver-29553858-brk44" Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.550737 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b9fd7d84c-wqn2t"] Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.551585 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" podUID="bcc1a78b-c3d2-4c15-81a0-0431da953e51" containerName="dnsmasq-dns" containerID="cri-o://f1a234613505f291637cb739619dbef7845308ac22057594b971bae3924f2dc7" gracePeriod=10 Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.612105 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553858-brk44" Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.884695 4816 generic.go:334] "Generic (PLEG): container finished" podID="b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac" containerID="5d5e0febf80ed4282e61f8380eff77836a90544898e5ff129bf2d82dd15449ea" exitCode=0 Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.884843 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-xm9d9" event={"ID":"b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac","Type":"ContainerDied","Data":"5d5e0febf80ed4282e61f8380eff77836a90544898e5ff129bf2d82dd15449ea"} Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.891975 4816 generic.go:334] "Generic (PLEG): container finished" podID="a25abdc0-8516-4747-a589-78db9bc64ca3" containerID="19aee09219637e7a5ab326ab09421619dc187f94f0843e938baaa3e47920a542" exitCode=0 Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.892083 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-cnlpc" event={"ID":"a25abdc0-8516-4747-a589-78db9bc64ca3","Type":"ContainerDied","Data":"19aee09219637e7a5ab326ab09421619dc187f94f0843e938baaa3e47920a542"} Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.907791 4816 generic.go:334] "Generic (PLEG): container finished" podID="e82cb42a-5dbf-43d1-a71c-18b3e6d252d6" containerID="16af7949a711342a3610523f5b8fbb074d336f04c7a6eb010f9128a10368ad76" exitCode=0 Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.907894 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-963f-account-create-update-w2lrf" event={"ID":"e82cb42a-5dbf-43d1-a71c-18b3e6d252d6","Type":"ContainerDied","Data":"16af7949a711342a3610523f5b8fbb074d336f04c7a6eb010f9128a10368ad76"} Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.910597 4816 generic.go:334] "Generic (PLEG): container finished" podID="bcc1a78b-c3d2-4c15-81a0-0431da953e51" 
containerID="f1a234613505f291637cb739619dbef7845308ac22057594b971bae3924f2dc7" exitCode=0 Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.910702 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" event={"ID":"bcc1a78b-c3d2-4c15-81a0-0431da953e51","Type":"ContainerDied","Data":"f1a234613505f291637cb739619dbef7845308ac22057594b971bae3924f2dc7"} Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.917125 4816 generic.go:334] "Generic (PLEG): container finished" podID="f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16" containerID="369058cbb8fa5b6fcca641d0c6bacd8fb984decb4576950458c2fae4a2d14692" exitCode=0 Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.917272 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4bcf-account-create-update-gkcsc" event={"ID":"f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16","Type":"ContainerDied","Data":"369058cbb8fa5b6fcca641d0c6bacd8fb984decb4576950458c2fae4a2d14692"} Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.919037 4816 generic.go:334] "Generic (PLEG): container finished" podID="27a1317c-41a6-4589-949b-e422d7fe8837" containerID="ae9a5cdf2df1a6846c30df048ff752db89454e8f6330fe73c2c82145d550960b" exitCode=0 Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.919104 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-4a8d-account-create-update-gxhxz" event={"ID":"27a1317c-41a6-4589-949b-e422d7fe8837","Type":"ContainerDied","Data":"ae9a5cdf2df1a6846c30df048ff752db89454e8f6330fe73c2c82145d550960b"} Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.925146 4816 generic.go:334] "Generic (PLEG): container finished" podID="66951176-170f-4d49-9a92-aeeb66f4a79c" containerID="234b74962788658b9515670058c8f55bb2409a552461ddec719b37310c8f7e0d" exitCode=0 Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.925200 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-4kpfn" 
event={"ID":"66951176-170f-4d49-9a92-aeeb66f4a79c","Type":"ContainerDied","Data":"234b74962788658b9515670058c8f55bb2409a552461ddec719b37310c8f7e0d"} Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.082579 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.188628 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553858-brk44"] Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.254524 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bcc1a78b-c3d2-4c15-81a0-0431da953e51-ovsdbserver-sb\") pod \"bcc1a78b-c3d2-4c15-81a0-0431da953e51\" (UID: \"bcc1a78b-c3d2-4c15-81a0-0431da953e51\") " Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.254730 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bcc1a78b-c3d2-4c15-81a0-0431da953e51-dns-svc\") pod \"bcc1a78b-c3d2-4c15-81a0-0431da953e51\" (UID: \"bcc1a78b-c3d2-4c15-81a0-0431da953e51\") " Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.254816 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcc1a78b-c3d2-4c15-81a0-0431da953e51-config\") pod \"bcc1a78b-c3d2-4c15-81a0-0431da953e51\" (UID: \"bcc1a78b-c3d2-4c15-81a0-0431da953e51\") " Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.254899 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bcc1a78b-c3d2-4c15-81a0-0431da953e51-ovsdbserver-nb\") pod \"bcc1a78b-c3d2-4c15-81a0-0431da953e51\" (UID: \"bcc1a78b-c3d2-4c15-81a0-0431da953e51\") " Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.255010 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-jwfrm\" (UniqueName: \"kubernetes.io/projected/bcc1a78b-c3d2-4c15-81a0-0431da953e51-kube-api-access-jwfrm\") pod \"bcc1a78b-c3d2-4c15-81a0-0431da953e51\" (UID: \"bcc1a78b-c3d2-4c15-81a0-0431da953e51\") " Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.266305 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcc1a78b-c3d2-4c15-81a0-0431da953e51-kube-api-access-jwfrm" (OuterVolumeSpecName: "kube-api-access-jwfrm") pod "bcc1a78b-c3d2-4c15-81a0-0431da953e51" (UID: "bcc1a78b-c3d2-4c15-81a0-0431da953e51"). InnerVolumeSpecName "kube-api-access-jwfrm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.320240 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcc1a78b-c3d2-4c15-81a0-0431da953e51-config" (OuterVolumeSpecName: "config") pod "bcc1a78b-c3d2-4c15-81a0-0431da953e51" (UID: "bcc1a78b-c3d2-4c15-81a0-0431da953e51"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.320731 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcc1a78b-c3d2-4c15-81a0-0431da953e51-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bcc1a78b-c3d2-4c15-81a0-0431da953e51" (UID: "bcc1a78b-c3d2-4c15-81a0-0431da953e51"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.325148 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcc1a78b-c3d2-4c15-81a0-0431da953e51-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bcc1a78b-c3d2-4c15-81a0-0431da953e51" (UID: "bcc1a78b-c3d2-4c15-81a0-0431da953e51"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.335174 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcc1a78b-c3d2-4c15-81a0-0431da953e51-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bcc1a78b-c3d2-4c15-81a0-0431da953e51" (UID: "bcc1a78b-c3d2-4c15-81a0-0431da953e51"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.358512 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcc1a78b-c3d2-4c15-81a0-0431da953e51-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.358549 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bcc1a78b-c3d2-4c15-81a0-0431da953e51-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.358563 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwfrm\" (UniqueName: \"kubernetes.io/projected/bcc1a78b-c3d2-4c15-81a0-0431da953e51-kube-api-access-jwfrm\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.358574 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bcc1a78b-c3d2-4c15-81a0-0431da953e51-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.358583 4816 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bcc1a78b-c3d2-4c15-81a0-0431da953e51-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.402028 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-xm9d9" Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.562019 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwfp8\" (UniqueName: \"kubernetes.io/projected/b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac-kube-api-access-dwfp8\") pod \"b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac\" (UID: \"b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac\") " Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.562232 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac-operator-scripts\") pod \"b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac\" (UID: \"b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac\") " Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.563104 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac" (UID: "b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.568024 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac-kube-api-access-dwfp8" (OuterVolumeSpecName: "kube-api-access-dwfp8") pod "b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac" (UID: "b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac"). InnerVolumeSpecName "kube-api-access-dwfp8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.664909 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.664946 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwfp8\" (UniqueName: \"kubernetes.io/projected/b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac-kube-api-access-dwfp8\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.935790 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553858-brk44" event={"ID":"99d8cc8e-8af3-41b3-bb8c-6e4e10f00193","Type":"ContainerStarted","Data":"eca476c37d977dc52f8c2a3d4b650eb7ecd907ca671efa005aedc89c638e9ac8"} Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.937544 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-xm9d9" event={"ID":"b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac","Type":"ContainerDied","Data":"29a2d35d034352fc106b2d5ac30a149571b3d9b03ee18c6197d6c9b89fe24636"} Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.937730 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29a2d35d034352fc106b2d5ac30a149571b3d9b03ee18c6197d6c9b89fe24636" Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.937954 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-xm9d9" Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.946802 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" event={"ID":"bcc1a78b-c3d2-4c15-81a0-0431da953e51","Type":"ContainerDied","Data":"ba3c97adc7cc798d326f3771649f02fd21d888d95dfd0aad666803c28f5b240b"} Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.946872 4816 scope.go:117] "RemoveContainer" containerID="f1a234613505f291637cb739619dbef7845308ac22057594b971bae3924f2dc7" Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.947307 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.006324 4816 scope.go:117] "RemoveContainer" containerID="035e208fb3e5fc9b968f1db57d46e9bd63d57178d448cbc27d1282a58427f605" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.010353 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b9fd7d84c-wqn2t"] Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.021092 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b9fd7d84c-wqn2t"] Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.148064 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcc1a78b-c3d2-4c15-81a0-0431da953e51" path="/var/lib/kubelet/pods/bcc1a78b-c3d2-4c15-81a0-0431da953e51/volumes" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.414725 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-4bcf-account-create-update-gkcsc" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.557305 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-963f-account-create-update-w2lrf" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.569031 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-cnlpc" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.571406 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-4a8d-account-create-update-gxhxz" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.582440 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6fcs\" (UniqueName: \"kubernetes.io/projected/f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16-kube-api-access-k6fcs\") pod \"f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16\" (UID: \"f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16\") " Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.582584 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16-operator-scripts\") pod \"f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16\" (UID: \"f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16\") " Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.584149 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16" (UID: "f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.598544 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-4kpfn" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.603591 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16-kube-api-access-k6fcs" (OuterVolumeSpecName: "kube-api-access-k6fcs") pod "f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16" (UID: "f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16"). InnerVolumeSpecName "kube-api-access-k6fcs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.683910 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e82cb42a-5dbf-43d1-a71c-18b3e6d252d6-operator-scripts\") pod \"e82cb42a-5dbf-43d1-a71c-18b3e6d252d6\" (UID: \"e82cb42a-5dbf-43d1-a71c-18b3e6d252d6\") " Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.684014 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zk4zq\" (UniqueName: \"kubernetes.io/projected/e82cb42a-5dbf-43d1-a71c-18b3e6d252d6-kube-api-access-zk4zq\") pod \"e82cb42a-5dbf-43d1-a71c-18b3e6d252d6\" (UID: \"e82cb42a-5dbf-43d1-a71c-18b3e6d252d6\") " Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.684086 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27a1317c-41a6-4589-949b-e422d7fe8837-operator-scripts\") pod \"27a1317c-41a6-4589-949b-e422d7fe8837\" (UID: \"27a1317c-41a6-4589-949b-e422d7fe8837\") " Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.684123 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a25abdc0-8516-4747-a589-78db9bc64ca3-operator-scripts\") pod \"a25abdc0-8516-4747-a589-78db9bc64ca3\" (UID: \"a25abdc0-8516-4747-a589-78db9bc64ca3\") " Mar 11 12:18:02 crc 
kubenswrapper[4816]: I0311 12:18:02.684454 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e82cb42a-5dbf-43d1-a71c-18b3e6d252d6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e82cb42a-5dbf-43d1-a71c-18b3e6d252d6" (UID: "e82cb42a-5dbf-43d1-a71c-18b3e6d252d6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.684585 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2dkm\" (UniqueName: \"kubernetes.io/projected/27a1317c-41a6-4589-949b-e422d7fe8837-kube-api-access-w2dkm\") pod \"27a1317c-41a6-4589-949b-e422d7fe8837\" (UID: \"27a1317c-41a6-4589-949b-e422d7fe8837\") " Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.684630 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbxfx\" (UniqueName: \"kubernetes.io/projected/a25abdc0-8516-4747-a589-78db9bc64ca3-kube-api-access-pbxfx\") pod \"a25abdc0-8516-4747-a589-78db9bc64ca3\" (UID: \"a25abdc0-8516-4747-a589-78db9bc64ca3\") " Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.685084 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6fcs\" (UniqueName: \"kubernetes.io/projected/f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16-kube-api-access-k6fcs\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.685167 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.685175 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e82cb42a-5dbf-43d1-a71c-18b3e6d252d6-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 
12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.685620 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a25abdc0-8516-4747-a589-78db9bc64ca3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a25abdc0-8516-4747-a589-78db9bc64ca3" (UID: "a25abdc0-8516-4747-a589-78db9bc64ca3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.686228 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27a1317c-41a6-4589-949b-e422d7fe8837-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "27a1317c-41a6-4589-949b-e422d7fe8837" (UID: "27a1317c-41a6-4589-949b-e422d7fe8837"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.688356 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a25abdc0-8516-4747-a589-78db9bc64ca3-kube-api-access-pbxfx" (OuterVolumeSpecName: "kube-api-access-pbxfx") pod "a25abdc0-8516-4747-a589-78db9bc64ca3" (UID: "a25abdc0-8516-4747-a589-78db9bc64ca3"). InnerVolumeSpecName "kube-api-access-pbxfx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.688896 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e82cb42a-5dbf-43d1-a71c-18b3e6d252d6-kube-api-access-zk4zq" (OuterVolumeSpecName: "kube-api-access-zk4zq") pod "e82cb42a-5dbf-43d1-a71c-18b3e6d252d6" (UID: "e82cb42a-5dbf-43d1-a71c-18b3e6d252d6"). InnerVolumeSpecName "kube-api-access-zk4zq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.689383 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27a1317c-41a6-4589-949b-e422d7fe8837-kube-api-access-w2dkm" (OuterVolumeSpecName: "kube-api-access-w2dkm") pod "27a1317c-41a6-4589-949b-e422d7fe8837" (UID: "27a1317c-41a6-4589-949b-e422d7fe8837"). InnerVolumeSpecName "kube-api-access-w2dkm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.787754 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66951176-170f-4d49-9a92-aeeb66f4a79c-operator-scripts\") pod \"66951176-170f-4d49-9a92-aeeb66f4a79c\" (UID: \"66951176-170f-4d49-9a92-aeeb66f4a79c\") " Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.788006 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zm2m6\" (UniqueName: \"kubernetes.io/projected/66951176-170f-4d49-9a92-aeeb66f4a79c-kube-api-access-zm2m6\") pod \"66951176-170f-4d49-9a92-aeeb66f4a79c\" (UID: \"66951176-170f-4d49-9a92-aeeb66f4a79c\") " Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.788585 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a25abdc0-8516-4747-a589-78db9bc64ca3-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.788617 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2dkm\" (UniqueName: \"kubernetes.io/projected/27a1317c-41a6-4589-949b-e422d7fe8837-kube-api-access-w2dkm\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.788633 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbxfx\" (UniqueName: 
\"kubernetes.io/projected/a25abdc0-8516-4747-a589-78db9bc64ca3-kube-api-access-pbxfx\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.788647 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zk4zq\" (UniqueName: \"kubernetes.io/projected/e82cb42a-5dbf-43d1-a71c-18b3e6d252d6-kube-api-access-zk4zq\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.788661 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27a1317c-41a6-4589-949b-e422d7fe8837-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.788648 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66951176-170f-4d49-9a92-aeeb66f4a79c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "66951176-170f-4d49-9a92-aeeb66f4a79c" (UID: "66951176-170f-4d49-9a92-aeeb66f4a79c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.792367 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66951176-170f-4d49-9a92-aeeb66f4a79c-kube-api-access-zm2m6" (OuterVolumeSpecName: "kube-api-access-zm2m6") pod "66951176-170f-4d49-9a92-aeeb66f4a79c" (UID: "66951176-170f-4d49-9a92-aeeb66f4a79c"). InnerVolumeSpecName "kube-api-access-zm2m6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.890329 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zm2m6\" (UniqueName: \"kubernetes.io/projected/66951176-170f-4d49-9a92-aeeb66f4a79c-kube-api-access-zm2m6\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.890379 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66951176-170f-4d49-9a92-aeeb66f4a79c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.959092 4816 generic.go:334] "Generic (PLEG): container finished" podID="99d8cc8e-8af3-41b3-bb8c-6e4e10f00193" containerID="0d27f73615e32aa404576eea9593c729502e37fe26b5c92717c4bee0b43a98e6" exitCode=0 Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.959906 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553858-brk44" event={"ID":"99d8cc8e-8af3-41b3-bb8c-6e4e10f00193","Type":"ContainerDied","Data":"0d27f73615e32aa404576eea9593c729502e37fe26b5c92717c4bee0b43a98e6"} Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.967572 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4bcf-account-create-update-gkcsc" event={"ID":"f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16","Type":"ContainerDied","Data":"e8ca11c82565ae3f35fb6628a625997e61984edf2584bf5b6f01f77d24b2ea45"} Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.967617 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8ca11c82565ae3f35fb6628a625997e61984edf2584bf5b6f01f77d24b2ea45" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.967686 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-4bcf-account-create-update-gkcsc" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.981620 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-4a8d-account-create-update-gxhxz" event={"ID":"27a1317c-41a6-4589-949b-e422d7fe8837","Type":"ContainerDied","Data":"04ba277b83a2833a8a1e2667f16776d5e0146b956d56c20da2b18a82224445d9"} Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.981668 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04ba277b83a2833a8a1e2667f16776d5e0146b956d56c20da2b18a82224445d9" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.981733 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-4a8d-account-create-update-gxhxz" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.985905 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-4kpfn" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.987052 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-4kpfn" event={"ID":"66951176-170f-4d49-9a92-aeeb66f4a79c","Type":"ContainerDied","Data":"f1a99dc130938800f8f6978214fdd8b7e688157675e3e3050193a9527e3f2bb1"} Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.987141 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1a99dc130938800f8f6978214fdd8b7e688157675e3e3050193a9527e3f2bb1" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.989214 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-cnlpc" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.989232 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-cnlpc" event={"ID":"a25abdc0-8516-4747-a589-78db9bc64ca3","Type":"ContainerDied","Data":"dbe8001a30913aacdd0f84f1edb48ad5e6ee0a7f33e24c58a948426697a06086"} Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.989455 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbe8001a30913aacdd0f84f1edb48ad5e6ee0a7f33e24c58a948426697a06086" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.992319 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-963f-account-create-update-w2lrf" event={"ID":"e82cb42a-5dbf-43d1-a71c-18b3e6d252d6","Type":"ContainerDied","Data":"b6cfc328bc6a5ec19bbd297d5021170486b6ed13ec54ef1b30a52f9abba8f3f9"} Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.992358 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6cfc328bc6a5ec19bbd297d5021170486b6ed13ec54ef1b30a52f9abba8f3f9" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.992359 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-963f-account-create-update-w2lrf" Mar 11 12:18:05 crc kubenswrapper[4816]: I0311 12:18:05.885514 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553858-brk44" Mar 11 12:18:06 crc kubenswrapper[4816]: I0311 12:18:06.033584 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553858-brk44" event={"ID":"99d8cc8e-8af3-41b3-bb8c-6e4e10f00193","Type":"ContainerDied","Data":"eca476c37d977dc52f8c2a3d4b650eb7ecd907ca671efa005aedc89c638e9ac8"} Mar 11 12:18:06 crc kubenswrapper[4816]: I0311 12:18:06.034217 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eca476c37d977dc52f8c2a3d4b650eb7ecd907ca671efa005aedc89c638e9ac8" Mar 11 12:18:06 crc kubenswrapper[4816]: I0311 12:18:06.033879 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553858-brk44" Mar 11 12:18:06 crc kubenswrapper[4816]: I0311 12:18:06.059849 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6ppf\" (UniqueName: \"kubernetes.io/projected/99d8cc8e-8af3-41b3-bb8c-6e4e10f00193-kube-api-access-q6ppf\") pod \"99d8cc8e-8af3-41b3-bb8c-6e4e10f00193\" (UID: \"99d8cc8e-8af3-41b3-bb8c-6e4e10f00193\") " Mar 11 12:18:06 crc kubenswrapper[4816]: I0311 12:18:06.065900 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99d8cc8e-8af3-41b3-bb8c-6e4e10f00193-kube-api-access-q6ppf" (OuterVolumeSpecName: "kube-api-access-q6ppf") pod "99d8cc8e-8af3-41b3-bb8c-6e4e10f00193" (UID: "99d8cc8e-8af3-41b3-bb8c-6e4e10f00193"). InnerVolumeSpecName "kube-api-access-q6ppf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:18:06 crc kubenswrapper[4816]: I0311 12:18:06.161804 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6ppf\" (UniqueName: \"kubernetes.io/projected/99d8cc8e-8af3-41b3-bb8c-6e4e10f00193-kube-api-access-q6ppf\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:06 crc kubenswrapper[4816]: I0311 12:18:06.991890 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553852-tvtrs"] Mar 11 12:18:07 crc kubenswrapper[4816]: I0311 12:18:07.023667 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553852-tvtrs"] Mar 11 12:18:07 crc kubenswrapper[4816]: I0311 12:18:07.075045 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kbmsk" event={"ID":"45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56","Type":"ContainerStarted","Data":"5f7cdb31826f59ca1238a145635210cb534eb8b83a42327083142e83ef21c961"} Mar 11 12:18:07 crc kubenswrapper[4816]: I0311 12:18:07.077354 4816 generic.go:334] "Generic (PLEG): container finished" podID="b6745bae-b403-4a86-9148-8baecc00f8b1" containerID="e5f24ad51eefb627e15014c3582b64a13468c820d9ad9ccfa53acd2f0fb30054" exitCode=0 Mar 11 12:18:07 crc kubenswrapper[4816]: I0311 12:18:07.077410 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-n98v5" event={"ID":"b6745bae-b403-4a86-9148-8baecc00f8b1","Type":"ContainerDied","Data":"e5f24ad51eefb627e15014c3582b64a13468c820d9ad9ccfa53acd2f0fb30054"} Mar 11 12:18:07 crc kubenswrapper[4816]: I0311 12:18:07.091777 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-kbmsk" podStartSLOduration=2.654693612 podStartE2EDuration="9.091752753s" podCreationTimestamp="2026-03-11 12:17:58 +0000 UTC" firstStartedPulling="2026-03-11 12:17:59.482268908 +0000 UTC m=+1166.073532875" lastFinishedPulling="2026-03-11 12:18:05.919328049 +0000 UTC 
m=+1172.510592016" observedRunningTime="2026-03-11 12:18:07.090139976 +0000 UTC m=+1173.681403943" watchObservedRunningTime="2026-03-11 12:18:07.091752753 +0000 UTC m=+1173.683016720" Mar 11 12:18:08 crc kubenswrapper[4816]: I0311 12:18:08.145767 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1d15245-e206-4f60-a05c-9888a45a1aca" path="/var/lib/kubelet/pods/f1d15245-e206-4f60-a05c-9888a45a1aca/volumes" Mar 11 12:18:08 crc kubenswrapper[4816]: I0311 12:18:08.519349 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-n98v5" Mar 11 12:18:08 crc kubenswrapper[4816]: I0311 12:18:08.712477 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6745bae-b403-4a86-9148-8baecc00f8b1-combined-ca-bundle\") pod \"b6745bae-b403-4a86-9148-8baecc00f8b1\" (UID: \"b6745bae-b403-4a86-9148-8baecc00f8b1\") " Mar 11 12:18:08 crc kubenswrapper[4816]: I0311 12:18:08.712555 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6745bae-b403-4a86-9148-8baecc00f8b1-config-data\") pod \"b6745bae-b403-4a86-9148-8baecc00f8b1\" (UID: \"b6745bae-b403-4a86-9148-8baecc00f8b1\") " Mar 11 12:18:08 crc kubenswrapper[4816]: I0311 12:18:08.712709 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pghjx\" (UniqueName: \"kubernetes.io/projected/b6745bae-b403-4a86-9148-8baecc00f8b1-kube-api-access-pghjx\") pod \"b6745bae-b403-4a86-9148-8baecc00f8b1\" (UID: \"b6745bae-b403-4a86-9148-8baecc00f8b1\") " Mar 11 12:18:08 crc kubenswrapper[4816]: I0311 12:18:08.712810 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b6745bae-b403-4a86-9148-8baecc00f8b1-db-sync-config-data\") pod \"b6745bae-b403-4a86-9148-8baecc00f8b1\" 
(UID: \"b6745bae-b403-4a86-9148-8baecc00f8b1\") " Mar 11 12:18:08 crc kubenswrapper[4816]: I0311 12:18:08.719520 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6745bae-b403-4a86-9148-8baecc00f8b1-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "b6745bae-b403-4a86-9148-8baecc00f8b1" (UID: "b6745bae-b403-4a86-9148-8baecc00f8b1"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:18:08 crc kubenswrapper[4816]: I0311 12:18:08.719607 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6745bae-b403-4a86-9148-8baecc00f8b1-kube-api-access-pghjx" (OuterVolumeSpecName: "kube-api-access-pghjx") pod "b6745bae-b403-4a86-9148-8baecc00f8b1" (UID: "b6745bae-b403-4a86-9148-8baecc00f8b1"). InnerVolumeSpecName "kube-api-access-pghjx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:18:08 crc kubenswrapper[4816]: I0311 12:18:08.741456 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6745bae-b403-4a86-9148-8baecc00f8b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b6745bae-b403-4a86-9148-8baecc00f8b1" (UID: "b6745bae-b403-4a86-9148-8baecc00f8b1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:18:08 crc kubenswrapper[4816]: I0311 12:18:08.762782 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6745bae-b403-4a86-9148-8baecc00f8b1-config-data" (OuterVolumeSpecName: "config-data") pod "b6745bae-b403-4a86-9148-8baecc00f8b1" (UID: "b6745bae-b403-4a86-9148-8baecc00f8b1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:18:08 crc kubenswrapper[4816]: I0311 12:18:08.815955 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pghjx\" (UniqueName: \"kubernetes.io/projected/b6745bae-b403-4a86-9148-8baecc00f8b1-kube-api-access-pghjx\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:08 crc kubenswrapper[4816]: I0311 12:18:08.816041 4816 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b6745bae-b403-4a86-9148-8baecc00f8b1-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:08 crc kubenswrapper[4816]: I0311 12:18:08.816077 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6745bae-b403-4a86-9148-8baecc00f8b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:08 crc kubenswrapper[4816]: I0311 12:18:08.816104 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6745bae-b403-4a86-9148-8baecc00f8b1-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.097773 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-n98v5" event={"ID":"b6745bae-b403-4a86-9148-8baecc00f8b1","Type":"ContainerDied","Data":"7ade065f6f708de586323f677e56810ada0b99da337e5a079b57da2cc0b0b5a0"} Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.098210 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ade065f6f708de586323f677e56810ada0b99da337e5a079b57da2cc0b0b5a0" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.098299 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-n98v5" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.483040 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f88567fd9-qp995"] Mar 11 12:18:09 crc kubenswrapper[4816]: E0311 12:18:09.484702 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcc1a78b-c3d2-4c15-81a0-0431da953e51" containerName="dnsmasq-dns" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.484798 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcc1a78b-c3d2-4c15-81a0-0431da953e51" containerName="dnsmasq-dns" Mar 11 12:18:09 crc kubenswrapper[4816]: E0311 12:18:09.484869 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6745bae-b403-4a86-9148-8baecc00f8b1" containerName="glance-db-sync" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.484938 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6745bae-b403-4a86-9148-8baecc00f8b1" containerName="glance-db-sync" Mar 11 12:18:09 crc kubenswrapper[4816]: E0311 12:18:09.485051 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66951176-170f-4d49-9a92-aeeb66f4a79c" containerName="mariadb-database-create" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.485117 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="66951176-170f-4d49-9a92-aeeb66f4a79c" containerName="mariadb-database-create" Mar 11 12:18:09 crc kubenswrapper[4816]: E0311 12:18:09.485185 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99d8cc8e-8af3-41b3-bb8c-6e4e10f00193" containerName="oc" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.485273 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="99d8cc8e-8af3-41b3-bb8c-6e4e10f00193" containerName="oc" Mar 11 12:18:09 crc kubenswrapper[4816]: E0311 12:18:09.485358 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a25abdc0-8516-4747-a589-78db9bc64ca3" 
containerName="mariadb-database-create" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.485464 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="a25abdc0-8516-4747-a589-78db9bc64ca3" containerName="mariadb-database-create" Mar 11 12:18:09 crc kubenswrapper[4816]: E0311 12:18:09.485574 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27a1317c-41a6-4589-949b-e422d7fe8837" containerName="mariadb-account-create-update" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.485651 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="27a1317c-41a6-4589-949b-e422d7fe8837" containerName="mariadb-account-create-update" Mar 11 12:18:09 crc kubenswrapper[4816]: E0311 12:18:09.487225 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac" containerName="mariadb-database-create" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.487327 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac" containerName="mariadb-database-create" Mar 11 12:18:09 crc kubenswrapper[4816]: E0311 12:18:09.487389 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcc1a78b-c3d2-4c15-81a0-0431da953e51" containerName="init" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.487446 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcc1a78b-c3d2-4c15-81a0-0431da953e51" containerName="init" Mar 11 12:18:09 crc kubenswrapper[4816]: E0311 12:18:09.487506 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16" containerName="mariadb-account-create-update" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.487568 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16" containerName="mariadb-account-create-update" Mar 11 12:18:09 crc kubenswrapper[4816]: E0311 12:18:09.487624 4816 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="e82cb42a-5dbf-43d1-a71c-18b3e6d252d6" containerName="mariadb-account-create-update" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.487677 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="e82cb42a-5dbf-43d1-a71c-18b3e6d252d6" containerName="mariadb-account-create-update" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.488021 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16" containerName="mariadb-account-create-update" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.488101 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="27a1317c-41a6-4589-949b-e422d7fe8837" containerName="mariadb-account-create-update" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.488168 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcc1a78b-c3d2-4c15-81a0-0431da953e51" containerName="dnsmasq-dns" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.488277 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac" containerName="mariadb-database-create" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.488345 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="99d8cc8e-8af3-41b3-bb8c-6e4e10f00193" containerName="oc" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.488421 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6745bae-b403-4a86-9148-8baecc00f8b1" containerName="glance-db-sync" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.488487 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="e82cb42a-5dbf-43d1-a71c-18b3e6d252d6" containerName="mariadb-account-create-update" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.488554 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="66951176-170f-4d49-9a92-aeeb66f4a79c" containerName="mariadb-database-create" Mar 11 12:18:09 crc 
kubenswrapper[4816]: I0311 12:18:09.488613 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="a25abdc0-8516-4747-a589-78db9bc64ca3" containerName="mariadb-database-create" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.489738 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f88567fd9-qp995" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.500714 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f88567fd9-qp995"] Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.515107 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.515161 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.630729 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-dns-swift-storage-0\") pod \"dnsmasq-dns-6f88567fd9-qp995\" (UID: \"a9624e97-8103-4296-b562-982cf05abfec\") " pod="openstack/dnsmasq-dns-6f88567fd9-qp995" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.630813 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-dns-svc\") pod 
\"dnsmasq-dns-6f88567fd9-qp995\" (UID: \"a9624e97-8103-4296-b562-982cf05abfec\") " pod="openstack/dnsmasq-dns-6f88567fd9-qp995" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.631133 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh48x\" (UniqueName: \"kubernetes.io/projected/a9624e97-8103-4296-b562-982cf05abfec-kube-api-access-zh48x\") pod \"dnsmasq-dns-6f88567fd9-qp995\" (UID: \"a9624e97-8103-4296-b562-982cf05abfec\") " pod="openstack/dnsmasq-dns-6f88567fd9-qp995" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.631259 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-ovsdbserver-sb\") pod \"dnsmasq-dns-6f88567fd9-qp995\" (UID: \"a9624e97-8103-4296-b562-982cf05abfec\") " pod="openstack/dnsmasq-dns-6f88567fd9-qp995" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.631433 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-ovsdbserver-nb\") pod \"dnsmasq-dns-6f88567fd9-qp995\" (UID: \"a9624e97-8103-4296-b562-982cf05abfec\") " pod="openstack/dnsmasq-dns-6f88567fd9-qp995" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.631574 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-config\") pod \"dnsmasq-dns-6f88567fd9-qp995\" (UID: \"a9624e97-8103-4296-b562-982cf05abfec\") " pod="openstack/dnsmasq-dns-6f88567fd9-qp995" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.733649 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-dns-swift-storage-0\") pod \"dnsmasq-dns-6f88567fd9-qp995\" (UID: \"a9624e97-8103-4296-b562-982cf05abfec\") " pod="openstack/dnsmasq-dns-6f88567fd9-qp995" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.733730 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-dns-svc\") pod \"dnsmasq-dns-6f88567fd9-qp995\" (UID: \"a9624e97-8103-4296-b562-982cf05abfec\") " pod="openstack/dnsmasq-dns-6f88567fd9-qp995" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.733781 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh48x\" (UniqueName: \"kubernetes.io/projected/a9624e97-8103-4296-b562-982cf05abfec-kube-api-access-zh48x\") pod \"dnsmasq-dns-6f88567fd9-qp995\" (UID: \"a9624e97-8103-4296-b562-982cf05abfec\") " pod="openstack/dnsmasq-dns-6f88567fd9-qp995" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.733819 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-ovsdbserver-sb\") pod \"dnsmasq-dns-6f88567fd9-qp995\" (UID: \"a9624e97-8103-4296-b562-982cf05abfec\") " pod="openstack/dnsmasq-dns-6f88567fd9-qp995" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.733848 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-ovsdbserver-nb\") pod \"dnsmasq-dns-6f88567fd9-qp995\" (UID: \"a9624e97-8103-4296-b562-982cf05abfec\") " pod="openstack/dnsmasq-dns-6f88567fd9-qp995" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.733880 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-config\") pod \"dnsmasq-dns-6f88567fd9-qp995\" (UID: \"a9624e97-8103-4296-b562-982cf05abfec\") " pod="openstack/dnsmasq-dns-6f88567fd9-qp995" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.734771 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-dns-swift-storage-0\") pod \"dnsmasq-dns-6f88567fd9-qp995\" (UID: \"a9624e97-8103-4296-b562-982cf05abfec\") " pod="openstack/dnsmasq-dns-6f88567fd9-qp995" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.734805 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-config\") pod \"dnsmasq-dns-6f88567fd9-qp995\" (UID: \"a9624e97-8103-4296-b562-982cf05abfec\") " pod="openstack/dnsmasq-dns-6f88567fd9-qp995" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.735010 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-ovsdbserver-sb\") pod \"dnsmasq-dns-6f88567fd9-qp995\" (UID: \"a9624e97-8103-4296-b562-982cf05abfec\") " pod="openstack/dnsmasq-dns-6f88567fd9-qp995" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.735135 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-ovsdbserver-nb\") pod \"dnsmasq-dns-6f88567fd9-qp995\" (UID: \"a9624e97-8103-4296-b562-982cf05abfec\") " pod="openstack/dnsmasq-dns-6f88567fd9-qp995" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.735308 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-dns-svc\") pod \"dnsmasq-dns-6f88567fd9-qp995\" 
(UID: \"a9624e97-8103-4296-b562-982cf05abfec\") " pod="openstack/dnsmasq-dns-6f88567fd9-qp995" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.766975 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh48x\" (UniqueName: \"kubernetes.io/projected/a9624e97-8103-4296-b562-982cf05abfec-kube-api-access-zh48x\") pod \"dnsmasq-dns-6f88567fd9-qp995\" (UID: \"a9624e97-8103-4296-b562-982cf05abfec\") " pod="openstack/dnsmasq-dns-6f88567fd9-qp995" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.815036 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f88567fd9-qp995" Mar 11 12:18:10 crc kubenswrapper[4816]: I0311 12:18:10.116049 4816 generic.go:334] "Generic (PLEG): container finished" podID="45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56" containerID="5f7cdb31826f59ca1238a145635210cb534eb8b83a42327083142e83ef21c961" exitCode=0 Mar 11 12:18:10 crc kubenswrapper[4816]: I0311 12:18:10.116109 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kbmsk" event={"ID":"45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56","Type":"ContainerDied","Data":"5f7cdb31826f59ca1238a145635210cb534eb8b83a42327083142e83ef21c961"} Mar 11 12:18:10 crc kubenswrapper[4816]: I0311 12:18:10.299718 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f88567fd9-qp995"] Mar 11 12:18:11 crc kubenswrapper[4816]: I0311 12:18:11.126127 4816 generic.go:334] "Generic (PLEG): container finished" podID="a9624e97-8103-4296-b562-982cf05abfec" containerID="d502adb85c9649e0c73366c0ab0b1ca37267e7cf7c2700765c2cc8a052a71f85" exitCode=0 Mar 11 12:18:11 crc kubenswrapper[4816]: I0311 12:18:11.126267 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f88567fd9-qp995" event={"ID":"a9624e97-8103-4296-b562-982cf05abfec","Type":"ContainerDied","Data":"d502adb85c9649e0c73366c0ab0b1ca37267e7cf7c2700765c2cc8a052a71f85"} Mar 11 12:18:11 crc 
kubenswrapper[4816]: I0311 12:18:11.126663 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f88567fd9-qp995" event={"ID":"a9624e97-8103-4296-b562-982cf05abfec","Type":"ContainerStarted","Data":"1722f447074997662412f081f41e66350a45168b3ef01991c199f5c589a81402"} Mar 11 12:18:11 crc kubenswrapper[4816]: I0311 12:18:11.447562 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-kbmsk" Mar 11 12:18:11 crc kubenswrapper[4816]: I0311 12:18:11.574832 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56-combined-ca-bundle\") pod \"45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56\" (UID: \"45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56\") " Mar 11 12:18:11 crc kubenswrapper[4816]: I0311 12:18:11.575092 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9c6m\" (UniqueName: \"kubernetes.io/projected/45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56-kube-api-access-x9c6m\") pod \"45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56\" (UID: \"45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56\") " Mar 11 12:18:11 crc kubenswrapper[4816]: I0311 12:18:11.575138 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56-config-data\") pod \"45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56\" (UID: \"45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56\") " Mar 11 12:18:11 crc kubenswrapper[4816]: I0311 12:18:11.580607 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56-kube-api-access-x9c6m" (OuterVolumeSpecName: "kube-api-access-x9c6m") pod "45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56" (UID: "45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56"). InnerVolumeSpecName "kube-api-access-x9c6m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:18:11 crc kubenswrapper[4816]: I0311 12:18:11.606439 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56" (UID: "45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:18:11 crc kubenswrapper[4816]: I0311 12:18:11.621264 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56-config-data" (OuterVolumeSpecName: "config-data") pod "45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56" (UID: "45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:18:11 crc kubenswrapper[4816]: I0311 12:18:11.677473 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:11 crc kubenswrapper[4816]: I0311 12:18:11.677517 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9c6m\" (UniqueName: \"kubernetes.io/projected/45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56-kube-api-access-x9c6m\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:11 crc kubenswrapper[4816]: I0311 12:18:11.677529 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.138920 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-kbmsk" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.141113 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kbmsk" event={"ID":"45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56","Type":"ContainerDied","Data":"0f22b75107e8cd42721b969d6ddcf352087537aee228dbbf09d44200f875243e"} Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.141173 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f22b75107e8cd42721b969d6ddcf352087537aee228dbbf09d44200f875243e" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.141303 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f88567fd9-qp995" event={"ID":"a9624e97-8103-4296-b562-982cf05abfec","Type":"ContainerStarted","Data":"ee8d2f2b925c615ace182c5fd5da3ff6a336e3776cbbadfd70dee5d93ca1e16f"} Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.142665 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6f88567fd9-qp995" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.176032 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6f88567fd9-qp995" podStartSLOduration=3.176002976 podStartE2EDuration="3.176002976s" podCreationTimestamp="2026-03-11 12:18:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:18:12.167197793 +0000 UTC m=+1178.758461780" watchObservedRunningTime="2026-03-11 12:18:12.176002976 +0000 UTC m=+1178.767266953" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.423986 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f88567fd9-qp995"] Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.455289 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-fvdtl"] Mar 11 12:18:12 crc 
kubenswrapper[4816]: E0311 12:18:12.455709 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56" containerName="keystone-db-sync" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.455730 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56" containerName="keystone-db-sync" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.455958 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56" containerName="keystone-db-sync" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.456653 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fvdtl" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.461035 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.461229 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.461407 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.462600 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-8k5jj" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.464130 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.489520 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-fvdtl"] Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.525315 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5cb4dcfdd7-frz8f"] Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.526930 4816 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cb4dcfdd7-frz8f" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.597855 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44z2c\" (UniqueName: \"kubernetes.io/projected/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-kube-api-access-44z2c\") pod \"keystone-bootstrap-fvdtl\" (UID: \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\") " pod="openstack/keystone-bootstrap-fvdtl" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.597915 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-fernet-keys\") pod \"keystone-bootstrap-fvdtl\" (UID: \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\") " pod="openstack/keystone-bootstrap-fvdtl" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.597950 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-combined-ca-bundle\") pod \"keystone-bootstrap-fvdtl\" (UID: \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\") " pod="openstack/keystone-bootstrap-fvdtl" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.598004 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-config-data\") pod \"keystone-bootstrap-fvdtl\" (UID: \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\") " pod="openstack/keystone-bootstrap-fvdtl" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.598042 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-credential-keys\") pod \"keystone-bootstrap-fvdtl\" (UID: 
\"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\") " pod="openstack/keystone-bootstrap-fvdtl" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.598108 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-scripts\") pod \"keystone-bootstrap-fvdtl\" (UID: \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\") " pod="openstack/keystone-bootstrap-fvdtl" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.598456 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cb4dcfdd7-frz8f"] Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.700591 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-config-data\") pod \"keystone-bootstrap-fvdtl\" (UID: \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\") " pod="openstack/keystone-bootstrap-fvdtl" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.700687 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-ovsdbserver-sb\") pod \"dnsmasq-dns-5cb4dcfdd7-frz8f\" (UID: \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\") " pod="openstack/dnsmasq-dns-5cb4dcfdd7-frz8f" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.700785 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtjdv\" (UniqueName: \"kubernetes.io/projected/0177cd91-bf8f-4e82-9f8b-5c50118dee09-kube-api-access-vtjdv\") pod \"dnsmasq-dns-5cb4dcfdd7-frz8f\" (UID: \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\") " pod="openstack/dnsmasq-dns-5cb4dcfdd7-frz8f" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.700841 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" 
(UniqueName: \"kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-credential-keys\") pod \"keystone-bootstrap-fvdtl\" (UID: \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\") " pod="openstack/keystone-bootstrap-fvdtl" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.701506 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-dns-swift-storage-0\") pod \"dnsmasq-dns-5cb4dcfdd7-frz8f\" (UID: \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\") " pod="openstack/dnsmasq-dns-5cb4dcfdd7-frz8f" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.701543 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-scripts\") pod \"keystone-bootstrap-fvdtl\" (UID: \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\") " pod="openstack/keystone-bootstrap-fvdtl" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.701568 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-dns-svc\") pod \"dnsmasq-dns-5cb4dcfdd7-frz8f\" (UID: \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\") " pod="openstack/dnsmasq-dns-5cb4dcfdd7-frz8f" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.701646 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44z2c\" (UniqueName: \"kubernetes.io/projected/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-kube-api-access-44z2c\") pod \"keystone-bootstrap-fvdtl\" (UID: \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\") " pod="openstack/keystone-bootstrap-fvdtl" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.701680 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-fernet-keys\") pod \"keystone-bootstrap-fvdtl\" (UID: \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\") " pod="openstack/keystone-bootstrap-fvdtl" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.701705 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-config\") pod \"dnsmasq-dns-5cb4dcfdd7-frz8f\" (UID: \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\") " pod="openstack/dnsmasq-dns-5cb4dcfdd7-frz8f" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.701728 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-combined-ca-bundle\") pod \"keystone-bootstrap-fvdtl\" (UID: \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\") " pod="openstack/keystone-bootstrap-fvdtl" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.701753 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-ovsdbserver-nb\") pod \"dnsmasq-dns-5cb4dcfdd7-frz8f\" (UID: \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\") " pod="openstack/dnsmasq-dns-5cb4dcfdd7-frz8f" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.709697 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-fernet-keys\") pod \"keystone-bootstrap-fvdtl\" (UID: \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\") " pod="openstack/keystone-bootstrap-fvdtl" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.710473 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-config-data\") pod 
\"keystone-bootstrap-fvdtl\" (UID: \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\") " pod="openstack/keystone-bootstrap-fvdtl" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.712790 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-credential-keys\") pod \"keystone-bootstrap-fvdtl\" (UID: \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\") " pod="openstack/keystone-bootstrap-fvdtl" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.713376 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-fjmnw"] Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.714935 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-fjmnw" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.717081 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-scripts\") pod \"keystone-bootstrap-fvdtl\" (UID: \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\") " pod="openstack/keystone-bootstrap-fvdtl" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.718287 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-combined-ca-bundle\") pod \"keystone-bootstrap-fvdtl\" (UID: \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\") " pod="openstack/keystone-bootstrap-fvdtl" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.720835 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.721179 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.721189 4816 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"cinder-cinder-dockercfg-6qw4t" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.726758 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-fjmnw"] Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.774975 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44z2c\" (UniqueName: \"kubernetes.io/projected/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-kube-api-access-44z2c\") pod \"keystone-bootstrap-fvdtl\" (UID: \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\") " pod="openstack/keystone-bootstrap-fvdtl" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.802688 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2772ef82-fe14-4f4d-8349-8ee515e39979-scripts\") pod \"cinder-db-sync-fjmnw\" (UID: \"2772ef82-fe14-4f4d-8349-8ee515e39979\") " pod="openstack/cinder-db-sync-fjmnw" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.802744 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-config\") pod \"dnsmasq-dns-5cb4dcfdd7-frz8f\" (UID: \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\") " pod="openstack/dnsmasq-dns-5cb4dcfdd7-frz8f" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.802775 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-ovsdbserver-nb\") pod \"dnsmasq-dns-5cb4dcfdd7-frz8f\" (UID: \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\") " pod="openstack/dnsmasq-dns-5cb4dcfdd7-frz8f" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.802809 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-ovsdbserver-sb\") pod 
\"dnsmasq-dns-5cb4dcfdd7-frz8f\" (UID: \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\") " pod="openstack/dnsmasq-dns-5cb4dcfdd7-frz8f" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.802833 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2772ef82-fe14-4f4d-8349-8ee515e39979-combined-ca-bundle\") pod \"cinder-db-sync-fjmnw\" (UID: \"2772ef82-fe14-4f4d-8349-8ee515e39979\") " pod="openstack/cinder-db-sync-fjmnw" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.802853 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtjdv\" (UniqueName: \"kubernetes.io/projected/0177cd91-bf8f-4e82-9f8b-5c50118dee09-kube-api-access-vtjdv\") pod \"dnsmasq-dns-5cb4dcfdd7-frz8f\" (UID: \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\") " pod="openstack/dnsmasq-dns-5cb4dcfdd7-frz8f" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.802873 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2772ef82-fe14-4f4d-8349-8ee515e39979-config-data\") pod \"cinder-db-sync-fjmnw\" (UID: \"2772ef82-fe14-4f4d-8349-8ee515e39979\") " pod="openstack/cinder-db-sync-fjmnw" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.802902 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5n8zp\" (UniqueName: \"kubernetes.io/projected/2772ef82-fe14-4f4d-8349-8ee515e39979-kube-api-access-5n8zp\") pod \"cinder-db-sync-fjmnw\" (UID: \"2772ef82-fe14-4f4d-8349-8ee515e39979\") " pod="openstack/cinder-db-sync-fjmnw" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.802922 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2772ef82-fe14-4f4d-8349-8ee515e39979-db-sync-config-data\") 
pod \"cinder-db-sync-fjmnw\" (UID: \"2772ef82-fe14-4f4d-8349-8ee515e39979\") " pod="openstack/cinder-db-sync-fjmnw" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.802965 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-dns-swift-storage-0\") pod \"dnsmasq-dns-5cb4dcfdd7-frz8f\" (UID: \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\") " pod="openstack/dnsmasq-dns-5cb4dcfdd7-frz8f" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.802991 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2772ef82-fe14-4f4d-8349-8ee515e39979-etc-machine-id\") pod \"cinder-db-sync-fjmnw\" (UID: \"2772ef82-fe14-4f4d-8349-8ee515e39979\") " pod="openstack/cinder-db-sync-fjmnw" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.803008 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-dns-svc\") pod \"dnsmasq-dns-5cb4dcfdd7-frz8f\" (UID: \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\") " pod="openstack/dnsmasq-dns-5cb4dcfdd7-frz8f" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.804145 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-dns-svc\") pod \"dnsmasq-dns-5cb4dcfdd7-frz8f\" (UID: \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\") " pod="openstack/dnsmasq-dns-5cb4dcfdd7-frz8f" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.805655 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-ovsdbserver-sb\") pod \"dnsmasq-dns-5cb4dcfdd7-frz8f\" (UID: \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\") " 
pod="openstack/dnsmasq-dns-5cb4dcfdd7-frz8f" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.806200 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-ovsdbserver-nb\") pod \"dnsmasq-dns-5cb4dcfdd7-frz8f\" (UID: \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\") " pod="openstack/dnsmasq-dns-5cb4dcfdd7-frz8f" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.807803 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-config\") pod \"dnsmasq-dns-5cb4dcfdd7-frz8f\" (UID: \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\") " pod="openstack/dnsmasq-dns-5cb4dcfdd7-frz8f" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.823839 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-dns-swift-storage-0\") pod \"dnsmasq-dns-5cb4dcfdd7-frz8f\" (UID: \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\") " pod="openstack/dnsmasq-dns-5cb4dcfdd7-frz8f" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.846156 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtjdv\" (UniqueName: \"kubernetes.io/projected/0177cd91-bf8f-4e82-9f8b-5c50118dee09-kube-api-access-vtjdv\") pod \"dnsmasq-dns-5cb4dcfdd7-frz8f\" (UID: \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\") " pod="openstack/dnsmasq-dns-5cb4dcfdd7-frz8f" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.856561 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.859054 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.883725 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.884027 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.893539 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-tdv64"] Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.894179 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cb4dcfdd7-frz8f" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.895756 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-tdv64" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.904888 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5n8zp\" (UniqueName: \"kubernetes.io/projected/2772ef82-fe14-4f4d-8349-8ee515e39979-kube-api-access-5n8zp\") pod \"cinder-db-sync-fjmnw\" (UID: \"2772ef82-fe14-4f4d-8349-8ee515e39979\") " pod="openstack/cinder-db-sync-fjmnw" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.904957 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2772ef82-fe14-4f4d-8349-8ee515e39979-db-sync-config-data\") pod \"cinder-db-sync-fjmnw\" (UID: \"2772ef82-fe14-4f4d-8349-8ee515e39979\") " pod="openstack/cinder-db-sync-fjmnw" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.905036 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2772ef82-fe14-4f4d-8349-8ee515e39979-etc-machine-id\") pod \"cinder-db-sync-fjmnw\" (UID: \"2772ef82-fe14-4f4d-8349-8ee515e39979\") 
" pod="openstack/cinder-db-sync-fjmnw" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.905106 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2772ef82-fe14-4f4d-8349-8ee515e39979-scripts\") pod \"cinder-db-sync-fjmnw\" (UID: \"2772ef82-fe14-4f4d-8349-8ee515e39979\") " pod="openstack/cinder-db-sync-fjmnw" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.905172 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2772ef82-fe14-4f4d-8349-8ee515e39979-combined-ca-bundle\") pod \"cinder-db-sync-fjmnw\" (UID: \"2772ef82-fe14-4f4d-8349-8ee515e39979\") " pod="openstack/cinder-db-sync-fjmnw" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.905211 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2772ef82-fe14-4f4d-8349-8ee515e39979-config-data\") pod \"cinder-db-sync-fjmnw\" (UID: \"2772ef82-fe14-4f4d-8349-8ee515e39979\") " pod="openstack/cinder-db-sync-fjmnw" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.924883 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-tdv64"] Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.929159 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2772ef82-fe14-4f4d-8349-8ee515e39979-config-data\") pod \"cinder-db-sync-fjmnw\" (UID: \"2772ef82-fe14-4f4d-8349-8ee515e39979\") " pod="openstack/cinder-db-sync-fjmnw" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.955935 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.956319 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 11 12:18:12 crc 
kubenswrapper[4816]: I0311 12:18:12.956583 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-48x47" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.960028 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2772ef82-fe14-4f4d-8349-8ee515e39979-scripts\") pod \"cinder-db-sync-fjmnw\" (UID: \"2772ef82-fe14-4f4d-8349-8ee515e39979\") " pod="openstack/cinder-db-sync-fjmnw" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.980046 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2772ef82-fe14-4f4d-8349-8ee515e39979-etc-machine-id\") pod \"cinder-db-sync-fjmnw\" (UID: \"2772ef82-fe14-4f4d-8349-8ee515e39979\") " pod="openstack/cinder-db-sync-fjmnw" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.985787 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2772ef82-fe14-4f4d-8349-8ee515e39979-db-sync-config-data\") pod \"cinder-db-sync-fjmnw\" (UID: \"2772ef82-fe14-4f4d-8349-8ee515e39979\") " pod="openstack/cinder-db-sync-fjmnw" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.007504 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnpcx\" (UniqueName: \"kubernetes.io/projected/1ebe3f2a-5719-412c-8803-15e1bec74523-kube-api-access-wnpcx\") pod \"ceilometer-0\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") " pod="openstack/ceilometer-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.007556 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ae20611-891b-49ee-b5b8-0dad8af80906-combined-ca-bundle\") pod \"neutron-db-sync-tdv64\" (UID: \"3ae20611-891b-49ee-b5b8-0dad8af80906\") " 
pod="openstack/neutron-db-sync-tdv64" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.007584 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ebe3f2a-5719-412c-8803-15e1bec74523-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") " pod="openstack/ceilometer-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.007633 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ebe3f2a-5719-412c-8803-15e1bec74523-run-httpd\") pod \"ceilometer-0\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") " pod="openstack/ceilometer-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.007655 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ebe3f2a-5719-412c-8803-15e1bec74523-scripts\") pod \"ceilometer-0\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") " pod="openstack/ceilometer-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.007768 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ebe3f2a-5719-412c-8803-15e1bec74523-log-httpd\") pod \"ceilometer-0\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") " pod="openstack/ceilometer-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.007807 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ebe3f2a-5719-412c-8803-15e1bec74523-config-data\") pod \"ceilometer-0\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") " pod="openstack/ceilometer-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.007849 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3ae20611-891b-49ee-b5b8-0dad8af80906-config\") pod \"neutron-db-sync-tdv64\" (UID: \"3ae20611-891b-49ee-b5b8-0dad8af80906\") " pod="openstack/neutron-db-sync-tdv64" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.008204 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8z8g\" (UniqueName: \"kubernetes.io/projected/3ae20611-891b-49ee-b5b8-0dad8af80906-kube-api-access-z8z8g\") pod \"neutron-db-sync-tdv64\" (UID: \"3ae20611-891b-49ee-b5b8-0dad8af80906\") " pod="openstack/neutron-db-sync-tdv64" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.008332 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ebe3f2a-5719-412c-8803-15e1bec74523-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") " pod="openstack/ceilometer-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.025424 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5n8zp\" (UniqueName: \"kubernetes.io/projected/2772ef82-fe14-4f4d-8349-8ee515e39979-kube-api-access-5n8zp\") pod \"cinder-db-sync-fjmnw\" (UID: \"2772ef82-fe14-4f4d-8349-8ee515e39979\") " pod="openstack/cinder-db-sync-fjmnw" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.038692 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2772ef82-fe14-4f4d-8349-8ee515e39979-combined-ca-bundle\") pod \"cinder-db-sync-fjmnw\" (UID: \"2772ef82-fe14-4f4d-8349-8ee515e39979\") " pod="openstack/cinder-db-sync-fjmnw" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.060636 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:18:13 crc 
kubenswrapper[4816]: I0311 12:18:13.078103 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fvdtl" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.087127 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-rjxsf"] Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.106076 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-rjxsf" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.119445 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.119965 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-fxmtd" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.130484 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3ae20611-891b-49ee-b5b8-0dad8af80906-config\") pod \"neutron-db-sync-tdv64\" (UID: \"3ae20611-891b-49ee-b5b8-0dad8af80906\") " pod="openstack/neutron-db-sync-tdv64" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.148035 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8z8g\" (UniqueName: \"kubernetes.io/projected/3ae20611-891b-49ee-b5b8-0dad8af80906-kube-api-access-z8z8g\") pod \"neutron-db-sync-tdv64\" (UID: \"3ae20611-891b-49ee-b5b8-0dad8af80906\") " pod="openstack/neutron-db-sync-tdv64" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.151558 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ebe3f2a-5719-412c-8803-15e1bec74523-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") " pod="openstack/ceilometer-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 
12:18:13.171887 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnpcx\" (UniqueName: \"kubernetes.io/projected/1ebe3f2a-5719-412c-8803-15e1bec74523-kube-api-access-wnpcx\") pod \"ceilometer-0\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") " pod="openstack/ceilometer-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.171964 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ae20611-891b-49ee-b5b8-0dad8af80906-combined-ca-bundle\") pod \"neutron-db-sync-tdv64\" (UID: \"3ae20611-891b-49ee-b5b8-0dad8af80906\") " pod="openstack/neutron-db-sync-tdv64" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.172054 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ebe3f2a-5719-412c-8803-15e1bec74523-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") " pod="openstack/ceilometer-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.172208 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ebe3f2a-5719-412c-8803-15e1bec74523-run-httpd\") pod \"ceilometer-0\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") " pod="openstack/ceilometer-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.172271 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ebe3f2a-5719-412c-8803-15e1bec74523-scripts\") pod \"ceilometer-0\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") " pod="openstack/ceilometer-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.172445 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ebe3f2a-5719-412c-8803-15e1bec74523-log-httpd\") 
pod \"ceilometer-0\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") " pod="openstack/ceilometer-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.172476 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ebe3f2a-5719-412c-8803-15e1bec74523-config-data\") pod \"ceilometer-0\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") " pod="openstack/ceilometer-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.180369 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-rjxsf"] Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.151553 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3ae20611-891b-49ee-b5b8-0dad8af80906-config\") pod \"neutron-db-sync-tdv64\" (UID: \"3ae20611-891b-49ee-b5b8-0dad8af80906\") " pod="openstack/neutron-db-sync-tdv64" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.182137 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ebe3f2a-5719-412c-8803-15e1bec74523-run-httpd\") pod \"ceilometer-0\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") " pod="openstack/ceilometer-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.182588 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ebe3f2a-5719-412c-8803-15e1bec74523-log-httpd\") pod \"ceilometer-0\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") " pod="openstack/ceilometer-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.194700 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ebe3f2a-5719-412c-8803-15e1bec74523-scripts\") pod \"ceilometer-0\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") " pod="openstack/ceilometer-0" Mar 11 12:18:13 crc 
kubenswrapper[4816]: I0311 12:18:13.195872 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ebe3f2a-5719-412c-8803-15e1bec74523-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") " pod="openstack/ceilometer-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.223639 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ebe3f2a-5719-412c-8803-15e1bec74523-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") " pod="openstack/ceilometer-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.224323 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnpcx\" (UniqueName: \"kubernetes.io/projected/1ebe3f2a-5719-412c-8803-15e1bec74523-kube-api-access-wnpcx\") pod \"ceilometer-0\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") " pod="openstack/ceilometer-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.237467 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ae20611-891b-49ee-b5b8-0dad8af80906-combined-ca-bundle\") pod \"neutron-db-sync-tdv64\" (UID: \"3ae20611-891b-49ee-b5b8-0dad8af80906\") " pod="openstack/neutron-db-sync-tdv64" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.241389 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ebe3f2a-5719-412c-8803-15e1bec74523-config-data\") pod \"ceilometer-0\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") " pod="openstack/ceilometer-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.242330 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8z8g\" (UniqueName: 
\"kubernetes.io/projected/3ae20611-891b-49ee-b5b8-0dad8af80906-kube-api-access-z8z8g\") pod \"neutron-db-sync-tdv64\" (UID: \"3ae20611-891b-49ee-b5b8-0dad8af80906\") " pod="openstack/neutron-db-sync-tdv64" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.269335 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cb4dcfdd7-frz8f"] Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.274602 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zq67\" (UniqueName: \"kubernetes.io/projected/c643aa04-ce8d-4c3b-befc-ecdf63e35de8-kube-api-access-5zq67\") pod \"barbican-db-sync-rjxsf\" (UID: \"c643aa04-ce8d-4c3b-befc-ecdf63e35de8\") " pod="openstack/barbican-db-sync-rjxsf" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.274676 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c643aa04-ce8d-4c3b-befc-ecdf63e35de8-combined-ca-bundle\") pod \"barbican-db-sync-rjxsf\" (UID: \"c643aa04-ce8d-4c3b-befc-ecdf63e35de8\") " pod="openstack/barbican-db-sync-rjxsf" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.274710 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c643aa04-ce8d-4c3b-befc-ecdf63e35de8-db-sync-config-data\") pod \"barbican-db-sync-rjxsf\" (UID: \"c643aa04-ce8d-4c3b-befc-ecdf63e35de8\") " pod="openstack/barbican-db-sync-rjxsf" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.309363 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-4b4ms"] Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.310667 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-4b4ms" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.327658 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.328333 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-l2nzr" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.328485 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.353069 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-fjmnw" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.369571 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-4b4ms"] Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.376898 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-scripts\") pod \"placement-db-sync-4b4ms\" (UID: \"f92c8acc-1a4a-4f28-a123-2f5b8b6905af\") " pod="openstack/placement-db-sync-4b4ms" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.376958 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zq67\" (UniqueName: \"kubernetes.io/projected/c643aa04-ce8d-4c3b-befc-ecdf63e35de8-kube-api-access-5zq67\") pod \"barbican-db-sync-rjxsf\" (UID: \"c643aa04-ce8d-4c3b-befc-ecdf63e35de8\") " pod="openstack/barbican-db-sync-rjxsf" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.376988 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c643aa04-ce8d-4c3b-befc-ecdf63e35de8-combined-ca-bundle\") pod \"barbican-db-sync-rjxsf\" (UID: 
\"c643aa04-ce8d-4c3b-befc-ecdf63e35de8\") " pod="openstack/barbican-db-sync-rjxsf" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.377015 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c643aa04-ce8d-4c3b-befc-ecdf63e35de8-db-sync-config-data\") pod \"barbican-db-sync-rjxsf\" (UID: \"c643aa04-ce8d-4c3b-befc-ecdf63e35de8\") " pod="openstack/barbican-db-sync-rjxsf" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.377065 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-combined-ca-bundle\") pod \"placement-db-sync-4b4ms\" (UID: \"f92c8acc-1a4a-4f28-a123-2f5b8b6905af\") " pod="openstack/placement-db-sync-4b4ms" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.377092 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-logs\") pod \"placement-db-sync-4b4ms\" (UID: \"f92c8acc-1a4a-4f28-a123-2f5b8b6905af\") " pod="openstack/placement-db-sync-4b4ms" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.377109 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6lqc\" (UniqueName: \"kubernetes.io/projected/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-kube-api-access-m6lqc\") pod \"placement-db-sync-4b4ms\" (UID: \"f92c8acc-1a4a-4f28-a123-2f5b8b6905af\") " pod="openstack/placement-db-sync-4b4ms" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.377155 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-config-data\") pod \"placement-db-sync-4b4ms\" (UID: 
\"f92c8acc-1a4a-4f28-a123-2f5b8b6905af\") " pod="openstack/placement-db-sync-4b4ms" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.392525 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-759cc7f497-2nfvt"] Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.401877 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.406461 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c643aa04-ce8d-4c3b-befc-ecdf63e35de8-combined-ca-bundle\") pod \"barbican-db-sync-rjxsf\" (UID: \"c643aa04-ce8d-4c3b-befc-ecdf63e35de8\") " pod="openstack/barbican-db-sync-rjxsf" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.424739 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-759cc7f497-2nfvt"] Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.424892 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.444103 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-tdv64" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.459345 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c643aa04-ce8d-4c3b-befc-ecdf63e35de8-db-sync-config-data\") pod \"barbican-db-sync-rjxsf\" (UID: \"c643aa04-ce8d-4c3b-befc-ecdf63e35de8\") " pod="openstack/barbican-db-sync-rjxsf" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.471423 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zq67\" (UniqueName: \"kubernetes.io/projected/c643aa04-ce8d-4c3b-befc-ecdf63e35de8-kube-api-access-5zq67\") pod \"barbican-db-sync-rjxsf\" (UID: \"c643aa04-ce8d-4c3b-befc-ecdf63e35de8\") " pod="openstack/barbican-db-sync-rjxsf" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.479631 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-config-data\") pod \"placement-db-sync-4b4ms\" (UID: \"f92c8acc-1a4a-4f28-a123-2f5b8b6905af\") " pod="openstack/placement-db-sync-4b4ms" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.479750 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-scripts\") pod \"placement-db-sync-4b4ms\" (UID: \"f92c8acc-1a4a-4f28-a123-2f5b8b6905af\") " pod="openstack/placement-db-sync-4b4ms" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.479823 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-combined-ca-bundle\") pod \"placement-db-sync-4b4ms\" (UID: \"f92c8acc-1a4a-4f28-a123-2f5b8b6905af\") " pod="openstack/placement-db-sync-4b4ms" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 
12:18:13.479849 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-logs\") pod \"placement-db-sync-4b4ms\" (UID: \"f92c8acc-1a4a-4f28-a123-2f5b8b6905af\") " pod="openstack/placement-db-sync-4b4ms" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.479873 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6lqc\" (UniqueName: \"kubernetes.io/projected/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-kube-api-access-m6lqc\") pod \"placement-db-sync-4b4ms\" (UID: \"f92c8acc-1a4a-4f28-a123-2f5b8b6905af\") " pod="openstack/placement-db-sync-4b4ms" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.486905 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-logs\") pod \"placement-db-sync-4b4ms\" (UID: \"f92c8acc-1a4a-4f28-a123-2f5b8b6905af\") " pod="openstack/placement-db-sync-4b4ms" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.487434 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-rjxsf" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.495706 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-combined-ca-bundle\") pod \"placement-db-sync-4b4ms\" (UID: \"f92c8acc-1a4a-4f28-a123-2f5b8b6905af\") " pod="openstack/placement-db-sync-4b4ms" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.497705 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-scripts\") pod \"placement-db-sync-4b4ms\" (UID: \"f92c8acc-1a4a-4f28-a123-2f5b8b6905af\") " pod="openstack/placement-db-sync-4b4ms" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.508809 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-config-data\") pod \"placement-db-sync-4b4ms\" (UID: \"f92c8acc-1a4a-4f28-a123-2f5b8b6905af\") " pod="openstack/placement-db-sync-4b4ms" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.520195 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6lqc\" (UniqueName: \"kubernetes.io/projected/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-kube-api-access-m6lqc\") pod \"placement-db-sync-4b4ms\" (UID: \"f92c8acc-1a4a-4f28-a123-2f5b8b6905af\") " pod="openstack/placement-db-sync-4b4ms" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.524416 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-4b4ms" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.596459 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-dns-swift-storage-0\") pod \"dnsmasq-dns-759cc7f497-2nfvt\" (UID: \"ad047cd1-309a-401e-9fc6-cb1349614136\") " pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.596536 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-ovsdbserver-sb\") pod \"dnsmasq-dns-759cc7f497-2nfvt\" (UID: \"ad047cd1-309a-401e-9fc6-cb1349614136\") " pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.596557 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp8jc\" (UniqueName: \"kubernetes.io/projected/ad047cd1-309a-401e-9fc6-cb1349614136-kube-api-access-qp8jc\") pod \"dnsmasq-dns-759cc7f497-2nfvt\" (UID: \"ad047cd1-309a-401e-9fc6-cb1349614136\") " pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.596592 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-ovsdbserver-nb\") pod \"dnsmasq-dns-759cc7f497-2nfvt\" (UID: \"ad047cd1-309a-401e-9fc6-cb1349614136\") " pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.596672 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-config\") pod 
\"dnsmasq-dns-759cc7f497-2nfvt\" (UID: \"ad047cd1-309a-401e-9fc6-cb1349614136\") " pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.596702 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-dns-svc\") pod \"dnsmasq-dns-759cc7f497-2nfvt\" (UID: \"ad047cd1-309a-401e-9fc6-cb1349614136\") " pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.693431 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.697406 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.698211 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-ovsdbserver-sb\") pod \"dnsmasq-dns-759cc7f497-2nfvt\" (UID: \"ad047cd1-309a-401e-9fc6-cb1349614136\") " pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.700114 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-ovsdbserver-sb\") pod \"dnsmasq-dns-759cc7f497-2nfvt\" (UID: \"ad047cd1-309a-401e-9fc6-cb1349614136\") " pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.704345 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp8jc\" (UniqueName: \"kubernetes.io/projected/ad047cd1-309a-401e-9fc6-cb1349614136-kube-api-access-qp8jc\") pod \"dnsmasq-dns-759cc7f497-2nfvt\" (UID: \"ad047cd1-309a-401e-9fc6-cb1349614136\") " 
pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.704437 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-ovsdbserver-nb\") pod \"dnsmasq-dns-759cc7f497-2nfvt\" (UID: \"ad047cd1-309a-401e-9fc6-cb1349614136\") " pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.704688 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-config\") pod \"dnsmasq-dns-759cc7f497-2nfvt\" (UID: \"ad047cd1-309a-401e-9fc6-cb1349614136\") " pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.704718 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-dns-svc\") pod \"dnsmasq-dns-759cc7f497-2nfvt\" (UID: \"ad047cd1-309a-401e-9fc6-cb1349614136\") " pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.704847 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-dns-swift-storage-0\") pod \"dnsmasq-dns-759cc7f497-2nfvt\" (UID: \"ad047cd1-309a-401e-9fc6-cb1349614136\") " pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.706458 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-ovsdbserver-nb\") pod \"dnsmasq-dns-759cc7f497-2nfvt\" (UID: \"ad047cd1-309a-401e-9fc6-cb1349614136\") " pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" Mar 11 12:18:13 crc 
kubenswrapper[4816]: I0311 12:18:13.706746 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-dns-svc\") pod \"dnsmasq-dns-759cc7f497-2nfvt\" (UID: \"ad047cd1-309a-401e-9fc6-cb1349614136\") " pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.707473 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-config\") pod \"dnsmasq-dns-759cc7f497-2nfvt\" (UID: \"ad047cd1-309a-401e-9fc6-cb1349614136\") " pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.708057 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-dns-swift-storage-0\") pod \"dnsmasq-dns-759cc7f497-2nfvt\" (UID: \"ad047cd1-309a-401e-9fc6-cb1349614136\") " pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.710461 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.711002 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.711428 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-22dm7" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.711556 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.755261 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 
12:18:13.759266 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.760155 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp8jc\" (UniqueName: \"kubernetes.io/projected/ad047cd1-309a-401e-9fc6-cb1349614136-kube-api-access-qp8jc\") pod \"dnsmasq-dns-759cc7f497-2nfvt\" (UID: \"ad047cd1-309a-401e-9fc6-cb1349614136\") " pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.763205 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.785677 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.808804 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7206357-ec52-4320-b659-a027694a74a9-config-data\") pod \"glance-default-external-api-0\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.809464 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.809537 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7206357-ec52-4320-b659-a027694a74a9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: 
\"d7206357-ec52-4320-b659-a027694a74a9\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.809619 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d7206357-ec52-4320-b659-a027694a74a9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.809650 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7206357-ec52-4320-b659-a027694a74a9-logs\") pod \"glance-default-external-api-0\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.809675 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfc2c\" (UniqueName: \"kubernetes.io/projected/d7206357-ec52-4320-b659-a027694a74a9-kube-api-access-cfc2c\") pod \"glance-default-external-api-0\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.809702 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7206357-ec52-4320-b659-a027694a74a9-scripts\") pod \"glance-default-external-api-0\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.862639 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.911149 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.911208 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7206357-ec52-4320-b659-a027694a74a9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.911268 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.911347 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d7206357-ec52-4320-b659-a027694a74a9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.911380 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") " 
pod="openstack/glance-default-internal-api-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.911423 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.911445 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-logs\") pod \"glance-default-internal-api-0\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.911471 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7206357-ec52-4320-b659-a027694a74a9-logs\") pod \"glance-default-external-api-0\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.911497 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfc2c\" (UniqueName: \"kubernetes.io/projected/d7206357-ec52-4320-b659-a027694a74a9-kube-api-access-cfc2c\") pod \"glance-default-external-api-0\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.911529 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7206357-ec52-4320-b659-a027694a74a9-scripts\") pod \"glance-default-external-api-0\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " pod="openstack/glance-default-external-api-0" Mar 11 
12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.911558 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp2hm\" (UniqueName: \"kubernetes.io/projected/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-kube-api-access-hp2hm\") pod \"glance-default-internal-api-0\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.911586 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7206357-ec52-4320-b659-a027694a74a9-config-data\") pod \"glance-default-external-api-0\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.911618 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.911656 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.912215 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d7206357-ec52-4320-b659-a027694a74a9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:13 crc kubenswrapper[4816]: 
I0311 12:18:13.912878 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7206357-ec52-4320-b659-a027694a74a9-logs\") pod \"glance-default-external-api-0\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.913199 4816 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.925022 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7206357-ec52-4320-b659-a027694a74a9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.925736 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7206357-ec52-4320-b659-a027694a74a9-config-data\") pod \"glance-default-external-api-0\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.944966 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfc2c\" (UniqueName: \"kubernetes.io/projected/d7206357-ec52-4320-b659-a027694a74a9-kube-api-access-cfc2c\") pod \"glance-default-external-api-0\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.945109 4816 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7206357-ec52-4320-b659-a027694a74a9-scripts\") pod \"glance-default-external-api-0\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.016312 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.018076 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.018160 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.018219 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.018934 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.019070 4816 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0" Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.034626 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.034763 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-logs\") pod \"glance-default-internal-api-0\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.034961 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hp2hm\" (UniqueName: \"kubernetes.io/projected/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-kube-api-access-hp2hm\") pod \"glance-default-internal-api-0\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.035196 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: 
\"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.036310 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-logs\") pod \"glance-default-internal-api-0\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.044660 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.049384 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.049921 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.056692 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 
12:18:14.057910 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp2hm\" (UniqueName: \"kubernetes.io/projected/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-kube-api-access-hp2hm\") pod \"glance-default-internal-api-0\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.061374 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cb4dcfdd7-frz8f"] Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.062205 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.084815 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.170843 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-fvdtl"] Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.259312 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cb4dcfdd7-frz8f" event={"ID":"0177cd91-bf8f-4e82-9f8b-5c50118dee09","Type":"ContainerStarted","Data":"93628c673521ca0506bed73eea0c7faf2d887ad9a1a290cd554669daa35c3ce3"} Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.259280 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6f88567fd9-qp995" podUID="a9624e97-8103-4296-b562-982cf05abfec" containerName="dnsmasq-dns" containerID="cri-o://ee8d2f2b925c615ace182c5fd5da3ff6a336e3776cbbadfd70dee5d93ca1e16f" gracePeriod=10 Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.365759 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-fjmnw"] Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.412511 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/placement-db-sync-4b4ms"] Mar 11 12:18:14 crc kubenswrapper[4816]: W0311 12:18:14.448707 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2772ef82_fe14_4f4d_8349_8ee515e39979.slice/crio-f11050b66cf18643ca807dd8a6fddbe1c30160c5ebaa861b516a6a0d311fa422 WatchSource:0}: Error finding container f11050b66cf18643ca807dd8a6fddbe1c30160c5ebaa861b516a6a0d311fa422: Status 404 returned error can't find the container with id f11050b66cf18643ca807dd8a6fddbe1c30160c5ebaa861b516a6a0d311fa422 Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.816332 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.826106 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-tdv64"] Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.842992 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-rjxsf"] Mar 11 12:18:14 crc kubenswrapper[4816]: W0311 12:18:14.886529 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc643aa04_ce8d_4c3b_befc_ecdf63e35de8.slice/crio-1e6a18d4f0b251cb2f7727ad5be471c642eca99dd05bbcec781288abe852fcc2 WatchSource:0}: Error finding container 1e6a18d4f0b251cb2f7727ad5be471c642eca99dd05bbcec781288abe852fcc2: Status 404 returned error can't find the container with id 1e6a18d4f0b251cb2f7727ad5be471c642eca99dd05bbcec781288abe852fcc2 Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.023073 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f88567fd9-qp995" Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.094316 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-dns-svc\") pod \"a9624e97-8103-4296-b562-982cf05abfec\" (UID: \"a9624e97-8103-4296-b562-982cf05abfec\") " Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.094432 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zh48x\" (UniqueName: \"kubernetes.io/projected/a9624e97-8103-4296-b562-982cf05abfec-kube-api-access-zh48x\") pod \"a9624e97-8103-4296-b562-982cf05abfec\" (UID: \"a9624e97-8103-4296-b562-982cf05abfec\") " Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.094491 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-ovsdbserver-sb\") pod \"a9624e97-8103-4296-b562-982cf05abfec\" (UID: \"a9624e97-8103-4296-b562-982cf05abfec\") " Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.094609 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-config\") pod \"a9624e97-8103-4296-b562-982cf05abfec\" (UID: \"a9624e97-8103-4296-b562-982cf05abfec\") " Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.094636 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-ovsdbserver-nb\") pod \"a9624e97-8103-4296-b562-982cf05abfec\" (UID: \"a9624e97-8103-4296-b562-982cf05abfec\") " Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.094661 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-dns-swift-storage-0\") pod \"a9624e97-8103-4296-b562-982cf05abfec\" (UID: \"a9624e97-8103-4296-b562-982cf05abfec\") " Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.132808 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9624e97-8103-4296-b562-982cf05abfec-kube-api-access-zh48x" (OuterVolumeSpecName: "kube-api-access-zh48x") pod "a9624e97-8103-4296-b562-982cf05abfec" (UID: "a9624e97-8103-4296-b562-982cf05abfec"). InnerVolumeSpecName "kube-api-access-zh48x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.198474 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zh48x\" (UniqueName: \"kubernetes.io/projected/a9624e97-8103-4296-b562-982cf05abfec-kube-api-access-zh48x\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.225070 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-759cc7f497-2nfvt"] Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.292168 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fvdtl" event={"ID":"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85","Type":"ContainerStarted","Data":"0800101b998bc39fbc15280d1397d1d43d3447b5f82870b95f5c0e69e60ff601"} Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.292275 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fvdtl" event={"ID":"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85","Type":"ContainerStarted","Data":"6cce3dedf2f34ff6f1ab7cb71e5194415d6238bd65c1baa92383feb742726164"} Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.295200 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"1ebe3f2a-5719-412c-8803-15e1bec74523","Type":"ContainerStarted","Data":"8ef644ece8e49d46c6b3d18fe5c5f96913f607e9b6a202c08e5f7ee442c27c93"} Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.304448 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-config" (OuterVolumeSpecName: "config") pod "a9624e97-8103-4296-b562-982cf05abfec" (UID: "a9624e97-8103-4296-b562-982cf05abfec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.306735 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-4b4ms" event={"ID":"f92c8acc-1a4a-4f28-a123-2f5b8b6905af","Type":"ContainerStarted","Data":"31e496272578b057f389702c22da6db4b04713d9b39444d9f2071398a63be537"} Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.325849 4816 generic.go:334] "Generic (PLEG): container finished" podID="a9624e97-8103-4296-b562-982cf05abfec" containerID="ee8d2f2b925c615ace182c5fd5da3ff6a336e3776cbbadfd70dee5d93ca1e16f" exitCode=0 Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.325894 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f88567fd9-qp995" Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.325903 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f88567fd9-qp995" event={"ID":"a9624e97-8103-4296-b562-982cf05abfec","Type":"ContainerDied","Data":"ee8d2f2b925c615ace182c5fd5da3ff6a336e3776cbbadfd70dee5d93ca1e16f"} Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.325969 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f88567fd9-qp995" event={"ID":"a9624e97-8103-4296-b562-982cf05abfec","Type":"ContainerDied","Data":"1722f447074997662412f081f41e66350a45168b3ef01991c199f5c589a81402"} Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.325993 4816 scope.go:117] "RemoveContainer" containerID="ee8d2f2b925c615ace182c5fd5da3ff6a336e3776cbbadfd70dee5d93ca1e16f" Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.327404 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-rjxsf" event={"ID":"c643aa04-ce8d-4c3b-befc-ecdf63e35de8","Type":"ContainerStarted","Data":"1e6a18d4f0b251cb2f7727ad5be471c642eca99dd05bbcec781288abe852fcc2"} Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.336542 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-fvdtl" podStartSLOduration=3.336508962 podStartE2EDuration="3.336508962s" podCreationTimestamp="2026-03-11 12:18:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:18:15.318321568 +0000 UTC m=+1181.909585535" watchObservedRunningTime="2026-03-11 12:18:15.336508962 +0000 UTC m=+1181.927772929" Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.336836 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" 
event={"ID":"ad047cd1-309a-401e-9fc6-cb1349614136","Type":"ContainerStarted","Data":"7732a86e8dc12bafbe8cdaac586dd615d3b76e080ef096246aeda54dd0e49383"} Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.339226 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-tdv64" event={"ID":"3ae20611-891b-49ee-b5b8-0dad8af80906","Type":"ContainerStarted","Data":"407febd9600a7f2ac248ee17af289c238ad46396cc3e57692031c09ed1d62622"} Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.340910 4816 generic.go:334] "Generic (PLEG): container finished" podID="0177cd91-bf8f-4e82-9f8b-5c50118dee09" containerID="2e5a3cf87af6703aca115bdf04591ed2e43c5e703b0abce69ed3bd4c9e44028a" exitCode=0 Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.341554 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cb4dcfdd7-frz8f" event={"ID":"0177cd91-bf8f-4e82-9f8b-5c50118dee09","Type":"ContainerDied","Data":"2e5a3cf87af6703aca115bdf04591ed2e43c5e703b0abce69ed3bd4c9e44028a"} Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.348926 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.355217 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fjmnw" event={"ID":"2772ef82-fe14-4f4d-8349-8ee515e39979","Type":"ContainerStarted","Data":"f11050b66cf18643ca807dd8a6fddbe1c30160c5ebaa861b516a6a0d311fa422"} Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.401802 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.438717 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-ovsdbserver-nb" (OuterVolumeSpecName: 
"ovsdbserver-nb") pod "a9624e97-8103-4296-b562-982cf05abfec" (UID: "a9624e97-8103-4296-b562-982cf05abfec"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.439807 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a9624e97-8103-4296-b562-982cf05abfec" (UID: "a9624e97-8103-4296-b562-982cf05abfec"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.470861 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a9624e97-8103-4296-b562-982cf05abfec" (UID: "a9624e97-8103-4296-b562-982cf05abfec"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.480977 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a9624e97-8103-4296-b562-982cf05abfec" (UID: "a9624e97-8103-4296-b562-982cf05abfec"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.503701 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.503756 4816 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.503768 4816 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.503777 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.515269 4816 scope.go:117] "RemoveContainer" containerID="d502adb85c9649e0c73366c0ab0b1ca37267e7cf7c2700765c2cc8a052a71f85" Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.583760 4816 scope.go:117] "RemoveContainer" containerID="ee8d2f2b925c615ace182c5fd5da3ff6a336e3776cbbadfd70dee5d93ca1e16f" Mar 11 12:18:15 crc kubenswrapper[4816]: E0311 12:18:15.586511 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee8d2f2b925c615ace182c5fd5da3ff6a336e3776cbbadfd70dee5d93ca1e16f\": container with ID starting with ee8d2f2b925c615ace182c5fd5da3ff6a336e3776cbbadfd70dee5d93ca1e16f not found: ID does not exist" containerID="ee8d2f2b925c615ace182c5fd5da3ff6a336e3776cbbadfd70dee5d93ca1e16f" Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 
12:18:15.586573 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee8d2f2b925c615ace182c5fd5da3ff6a336e3776cbbadfd70dee5d93ca1e16f"} err="failed to get container status \"ee8d2f2b925c615ace182c5fd5da3ff6a336e3776cbbadfd70dee5d93ca1e16f\": rpc error: code = NotFound desc = could not find container \"ee8d2f2b925c615ace182c5fd5da3ff6a336e3776cbbadfd70dee5d93ca1e16f\": container with ID starting with ee8d2f2b925c615ace182c5fd5da3ff6a336e3776cbbadfd70dee5d93ca1e16f not found: ID does not exist" Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.586601 4816 scope.go:117] "RemoveContainer" containerID="d502adb85c9649e0c73366c0ab0b1ca37267e7cf7c2700765c2cc8a052a71f85" Mar 11 12:18:15 crc kubenswrapper[4816]: E0311 12:18:15.599750 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d502adb85c9649e0c73366c0ab0b1ca37267e7cf7c2700765c2cc8a052a71f85\": container with ID starting with d502adb85c9649e0c73366c0ab0b1ca37267e7cf7c2700765c2cc8a052a71f85 not found: ID does not exist" containerID="d502adb85c9649e0c73366c0ab0b1ca37267e7cf7c2700765c2cc8a052a71f85" Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.599803 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d502adb85c9649e0c73366c0ab0b1ca37267e7cf7c2700765c2cc8a052a71f85"} err="failed to get container status \"d502adb85c9649e0c73366c0ab0b1ca37267e7cf7c2700765c2cc8a052a71f85\": rpc error: code = NotFound desc = could not find container \"d502adb85c9649e0c73366c0ab0b1ca37267e7cf7c2700765c2cc8a052a71f85\": container with ID starting with d502adb85c9649e0c73366c0ab0b1ca37267e7cf7c2700765c2cc8a052a71f85 not found: ID does not exist" Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.687268 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f88567fd9-qp995"] Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 
12:18:15.696611 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f88567fd9-qp995"] Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.778193 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.781380 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cb4dcfdd7-frz8f" Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.920343 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-ovsdbserver-nb\") pod \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\" (UID: \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\") " Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.920420 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-dns-svc\") pod \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\" (UID: \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\") " Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.920547 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-config\") pod \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\" (UID: \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\") " Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.920653 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtjdv\" (UniqueName: \"kubernetes.io/projected/0177cd91-bf8f-4e82-9f8b-5c50118dee09-kube-api-access-vtjdv\") pod \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\" (UID: \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\") " Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.920726 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-ovsdbserver-sb\") pod \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\" (UID: \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\") " Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.920840 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-dns-swift-storage-0\") pod \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\" (UID: \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\") " Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:15.993961 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0177cd91-bf8f-4e82-9f8b-5c50118dee09-kube-api-access-vtjdv" (OuterVolumeSpecName: "kube-api-access-vtjdv") pod "0177cd91-bf8f-4e82-9f8b-5c50118dee09" (UID: "0177cd91-bf8f-4e82-9f8b-5c50118dee09"). InnerVolumeSpecName "kube-api-access-vtjdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:16.039335 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0177cd91-bf8f-4e82-9f8b-5c50118dee09" (UID: "0177cd91-bf8f-4e82-9f8b-5c50118dee09"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:16.039971 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtjdv\" (UniqueName: \"kubernetes.io/projected/0177cd91-bf8f-4e82-9f8b-5c50118dee09-kube-api-access-vtjdv\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:16.040008 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:16.063291 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:16.070832 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0177cd91-bf8f-4e82-9f8b-5c50118dee09" (UID: "0177cd91-bf8f-4e82-9f8b-5c50118dee09"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:16.120588 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0177cd91-bf8f-4e82-9f8b-5c50118dee09" (UID: "0177cd91-bf8f-4e82-9f8b-5c50118dee09"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:16.126308 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-config" (OuterVolumeSpecName: "config") pod "0177cd91-bf8f-4e82-9f8b-5c50118dee09" (UID: "0177cd91-bf8f-4e82-9f8b-5c50118dee09"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:16.143489 4816 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:16.143805 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:16.144737 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:16.158058 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0177cd91-bf8f-4e82-9f8b-5c50118dee09" (UID: "0177cd91-bf8f-4e82-9f8b-5c50118dee09"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:16.248267 4816 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:16.271316 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9624e97-8103-4296-b562-982cf05abfec" path="/var/lib/kubelet/pods/a9624e97-8103-4296-b562-982cf05abfec/volumes" Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:16.272288 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:16.272328 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:16.415640 4816 generic.go:334] "Generic (PLEG): container finished" podID="ad047cd1-309a-401e-9fc6-cb1349614136" containerID="f842cab6fbb753d4036f93abfc735f41fd91ab93fdcba8e10b330c30d7aa8346" exitCode=0 Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:16.415748 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" event={"ID":"ad047cd1-309a-401e-9fc6-cb1349614136","Type":"ContainerDied","Data":"f842cab6fbb753d4036f93abfc735f41fd91ab93fdcba8e10b330c30d7aa8346"} Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:16.419199 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d7206357-ec52-4320-b659-a027694a74a9","Type":"ContainerStarted","Data":"46c804665ae23bbf6170282e95342e08c9ca8c59a8646dc8a3bb729ae4357ed1"} Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:16.429018 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cb4dcfdd7-frz8f" 
event={"ID":"0177cd91-bf8f-4e82-9f8b-5c50118dee09","Type":"ContainerDied","Data":"93628c673521ca0506bed73eea0c7faf2d887ad9a1a290cd554669daa35c3ce3"} Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:16.429074 4816 scope.go:117] "RemoveContainer" containerID="2e5a3cf87af6703aca115bdf04591ed2e43c5e703b0abce69ed3bd4c9e44028a" Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:16.429235 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cb4dcfdd7-frz8f" Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:16.451824 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-tdv64" event={"ID":"3ae20611-891b-49ee-b5b8-0dad8af80906","Type":"ContainerStarted","Data":"315146a94731475a01dc83fd91cffc1dc07e3b8364e3b5f9f4c74f1dffcbe0c4"} Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:16.456414 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1","Type":"ContainerStarted","Data":"e01256e648c1540249f5558a07356e2451e9fbb9af609837778e77bf7b2923ca"} Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:16.555709 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cb4dcfdd7-frz8f"] Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:16.606932 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-tdv64" podStartSLOduration=4.606914278 podStartE2EDuration="4.606914278s" podCreationTimestamp="2026-03-11 12:18:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:18:16.54969433 +0000 UTC m=+1183.140958297" watchObservedRunningTime="2026-03-11 12:18:16.606914278 +0000 UTC m=+1183.198178245" Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:16.607315 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-5cb4dcfdd7-frz8f"] Mar 11 12:18:17 crc kubenswrapper[4816]: I0311 12:18:17.513522 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" event={"ID":"ad047cd1-309a-401e-9fc6-cb1349614136","Type":"ContainerStarted","Data":"ccafb95fbf3f12326123ae581a70f3b9eefd2d320c697240864a31290ea2a66c"} Mar 11 12:18:17 crc kubenswrapper[4816]: I0311 12:18:17.514031 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" Mar 11 12:18:17 crc kubenswrapper[4816]: I0311 12:18:17.522230 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d7206357-ec52-4320-b659-a027694a74a9","Type":"ContainerStarted","Data":"c0c07679dd61d87d3bdfa74e26aa8259f34fe50f83fa1b2abc4b014796093496"} Mar 11 12:18:17 crc kubenswrapper[4816]: I0311 12:18:17.551831 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" podStartSLOduration=4.551794738 podStartE2EDuration="4.551794738s" podCreationTimestamp="2026-03-11 12:18:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:18:17.543000085 +0000 UTC m=+1184.134264052" watchObservedRunningTime="2026-03-11 12:18:17.551794738 +0000 UTC m=+1184.143058715" Mar 11 12:18:18 crc kubenswrapper[4816]: I0311 12:18:18.147800 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0177cd91-bf8f-4e82-9f8b-5c50118dee09" path="/var/lib/kubelet/pods/0177cd91-bf8f-4e82-9f8b-5c50118dee09/volumes" Mar 11 12:18:18 crc kubenswrapper[4816]: I0311 12:18:18.594526 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d7206357-ec52-4320-b659-a027694a74a9","Type":"ContainerStarted","Data":"e5a70320bc6c70cd367bbbc4bdaabb37281153f62ac1cd7c0287d723080fe33d"} Mar 11 
12:18:18 crc kubenswrapper[4816]: I0311 12:18:18.594741 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d7206357-ec52-4320-b659-a027694a74a9" containerName="glance-log" containerID="cri-o://c0c07679dd61d87d3bdfa74e26aa8259f34fe50f83fa1b2abc4b014796093496" gracePeriod=30 Mar 11 12:18:18 crc kubenswrapper[4816]: I0311 12:18:18.595183 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d7206357-ec52-4320-b659-a027694a74a9" containerName="glance-httpd" containerID="cri-o://e5a70320bc6c70cd367bbbc4bdaabb37281153f62ac1cd7c0287d723080fe33d" gracePeriod=30 Mar 11 12:18:18 crc kubenswrapper[4816]: I0311 12:18:18.620582 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1","Type":"ContainerStarted","Data":"5bd79fbed29e67c023a52beb4ae91ed0ef93cd3f9f71c55834cba55ce617ed8d"} Mar 11 12:18:18 crc kubenswrapper[4816]: I0311 12:18:18.620807 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1","Type":"ContainerStarted","Data":"391e53767cf17b2f58d4934d1239e1639cc3728d74480149f45e20668d71d7ee"} Mar 11 12:18:18 crc kubenswrapper[4816]: I0311 12:18:18.621073 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1" containerName="glance-log" containerID="cri-o://391e53767cf17b2f58d4934d1239e1639cc3728d74480149f45e20668d71d7ee" gracePeriod=30 Mar 11 12:18:18 crc kubenswrapper[4816]: I0311 12:18:18.621345 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1" containerName="glance-httpd" 
containerID="cri-o://5bd79fbed29e67c023a52beb4ae91ed0ef93cd3f9f71c55834cba55ce617ed8d" gracePeriod=30 Mar 11 12:18:18 crc kubenswrapper[4816]: I0311 12:18:18.638636 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.6385993150000004 podStartE2EDuration="6.638599315s" podCreationTimestamp="2026-03-11 12:18:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:18:18.631967044 +0000 UTC m=+1185.223231011" watchObservedRunningTime="2026-03-11 12:18:18.638599315 +0000 UTC m=+1185.229863282" Mar 11 12:18:18 crc kubenswrapper[4816]: I0311 12:18:18.681958 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.681923523 podStartE2EDuration="6.681923523s" podCreationTimestamp="2026-03-11 12:18:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:18:18.660724792 +0000 UTC m=+1185.251988759" watchObservedRunningTime="2026-03-11 12:18:18.681923523 +0000 UTC m=+1185.273187490" Mar 11 12:18:19 crc kubenswrapper[4816]: I0311 12:18:19.657276 4816 generic.go:334] "Generic (PLEG): container finished" podID="68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1" containerID="5bd79fbed29e67c023a52beb4ae91ed0ef93cd3f9f71c55834cba55ce617ed8d" exitCode=143 Mar 11 12:18:19 crc kubenswrapper[4816]: I0311 12:18:19.657629 4816 generic.go:334] "Generic (PLEG): container finished" podID="68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1" containerID="391e53767cf17b2f58d4934d1239e1639cc3728d74480149f45e20668d71d7ee" exitCode=143 Mar 11 12:18:19 crc kubenswrapper[4816]: I0311 12:18:19.657312 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1","Type":"ContainerDied","Data":"5bd79fbed29e67c023a52beb4ae91ed0ef93cd3f9f71c55834cba55ce617ed8d"} Mar 11 12:18:19 crc kubenswrapper[4816]: I0311 12:18:19.657764 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1","Type":"ContainerDied","Data":"391e53767cf17b2f58d4934d1239e1639cc3728d74480149f45e20668d71d7ee"} Mar 11 12:18:19 crc kubenswrapper[4816]: I0311 12:18:19.662627 4816 generic.go:334] "Generic (PLEG): container finished" podID="d7206357-ec52-4320-b659-a027694a74a9" containerID="e5a70320bc6c70cd367bbbc4bdaabb37281153f62ac1cd7c0287d723080fe33d" exitCode=0 Mar 11 12:18:19 crc kubenswrapper[4816]: I0311 12:18:19.662671 4816 generic.go:334] "Generic (PLEG): container finished" podID="d7206357-ec52-4320-b659-a027694a74a9" containerID="c0c07679dd61d87d3bdfa74e26aa8259f34fe50f83fa1b2abc4b014796093496" exitCode=143 Mar 11 12:18:19 crc kubenswrapper[4816]: I0311 12:18:19.662696 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d7206357-ec52-4320-b659-a027694a74a9","Type":"ContainerDied","Data":"e5a70320bc6c70cd367bbbc4bdaabb37281153f62ac1cd7c0287d723080fe33d"} Mar 11 12:18:19 crc kubenswrapper[4816]: I0311 12:18:19.662722 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d7206357-ec52-4320-b659-a027694a74a9","Type":"ContainerDied","Data":"c0c07679dd61d87d3bdfa74e26aa8259f34fe50f83fa1b2abc4b014796093496"} Mar 11 12:18:20 crc kubenswrapper[4816]: I0311 12:18:20.677108 4816 generic.go:334] "Generic (PLEG): container finished" podID="e391eaa0-5fb5-4ab1-a9a5-b480703f8b85" containerID="0800101b998bc39fbc15280d1397d1d43d3447b5f82870b95f5c0e69e60ff601" exitCode=0 Mar 11 12:18:20 crc kubenswrapper[4816]: I0311 12:18:20.677188 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-bootstrap-fvdtl" event={"ID":"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85","Type":"ContainerDied","Data":"0800101b998bc39fbc15280d1397d1d43d3447b5f82870b95f5c0e69e60ff601"} Mar 11 12:18:23 crc kubenswrapper[4816]: I0311 12:18:23.866528 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" Mar 11 12:18:23 crc kubenswrapper[4816]: I0311 12:18:23.955428 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67754df655-hhz24"] Mar 11 12:18:23 crc kubenswrapper[4816]: I0311 12:18:23.955743 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67754df655-hhz24" podUID="79c46d79-aa47-428c-abec-a6f94c66e9ab" containerName="dnsmasq-dns" containerID="cri-o://e929e8e02a375fef6457bbafa642c02c68d821bc19103c5cffa50d761cc569e2" gracePeriod=10 Mar 11 12:18:25 crc kubenswrapper[4816]: I0311 12:18:25.101682 4816 generic.go:334] "Generic (PLEG): container finished" podID="79c46d79-aa47-428c-abec-a6f94c66e9ab" containerID="e929e8e02a375fef6457bbafa642c02c68d821bc19103c5cffa50d761cc569e2" exitCode=0 Mar 11 12:18:25 crc kubenswrapper[4816]: I0311 12:18:25.101765 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67754df655-hhz24" event={"ID":"79c46d79-aa47-428c-abec-a6f94c66e9ab","Type":"ContainerDied","Data":"e929e8e02a375fef6457bbafa642c02c68d821bc19103c5cffa50d761cc569e2"} Mar 11 12:18:25 crc kubenswrapper[4816]: I0311 12:18:25.438958 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-67754df655-hhz24" podUID="79c46d79-aa47-428c-abec-a6f94c66e9ab" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: connect: connection refused" Mar 11 12:18:35 crc kubenswrapper[4816]: I0311 12:18:35.665020 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-67754df655-hhz24" podUID="79c46d79-aa47-428c-abec-a6f94c66e9ab" 
containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: i/o timeout" Mar 11 12:18:39 crc kubenswrapper[4816]: E0311 12:18:39.452734 4816 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:574a17f0877c175128a764f2b37fc02456649c8514689125718ce6ca974bfb6b" Mar 11 12:18:39 crc kubenswrapper[4816]: E0311 12:18:39.453567 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:574a17f0877c175128a764f2b37fc02456649c8514689125718ce6ca974bfb6b,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,R
ecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5n8zp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-fjmnw_openstack(2772ef82-fe14-4f4d-8349-8ee515e39979): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 11 12:18:39 crc kubenswrapper[4816]: E0311 12:18:39.455282 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-fjmnw" podUID="2772ef82-fe14-4f4d-8349-8ee515e39979" Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.515156 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.515235 4816 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.521398 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.658912 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") " Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.659871 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-scripts\") pod \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") " Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.660171 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-combined-ca-bundle\") pod \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") " Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.660382 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hp2hm\" (UniqueName: \"kubernetes.io/projected/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-kube-api-access-hp2hm\") pod \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") " Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.660559 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-logs\") pod \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") " Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.660774 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-httpd-run\") pod \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") " Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.660960 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-config-data\") pod \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") " Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.661054 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-logs" (OuterVolumeSpecName: "logs") pod "68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1" (UID: "68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.661323 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1" (UID: "68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.662591 4816 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-logs\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.662625 4816 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.671701 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1" (UID: "68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.672231 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-scripts" (OuterVolumeSpecName: "scripts") pod "68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1" (UID: "68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.680757 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-kube-api-access-hp2hm" (OuterVolumeSpecName: "kube-api-access-hp2hm") pod "68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1" (UID: "68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1"). InnerVolumeSpecName "kube-api-access-hp2hm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.701804 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1" (UID: "68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.715281 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-config-data" (OuterVolumeSpecName: "config-data") pod "68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1" (UID: "68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.723205 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1","Type":"ContainerDied","Data":"e01256e648c1540249f5558a07356e2451e9fbb9af609837778e77bf7b2923ca"} Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.723332 4816 scope.go:117] "RemoveContainer" containerID="5bd79fbed29e67c023a52beb4ae91ed0ef93cd3f9f71c55834cba55ce617ed8d" Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.723230 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 11 12:18:39 crc kubenswrapper[4816]: E0311 12:18:39.726864 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:574a17f0877c175128a764f2b37fc02456649c8514689125718ce6ca974bfb6b\\\"\"" pod="openstack/cinder-db-sync-fjmnw" podUID="2772ef82-fe14-4f4d-8349-8ee515e39979" Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.764980 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.765022 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.765034 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hp2hm\" (UniqueName: \"kubernetes.io/projected/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-kube-api-access-hp2hm\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.765043 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.765071 4816 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.789924 4816 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: 
"kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.804459 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.819585 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.851939 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 12:18:39 crc kubenswrapper[4816]: E0311 12:18:39.852562 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1" containerName="glance-log" Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.852586 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1" containerName="glance-log" Mar 11 12:18:39 crc kubenswrapper[4816]: E0311 12:18:39.852624 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0177cd91-bf8f-4e82-9f8b-5c50118dee09" containerName="init" Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.852632 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="0177cd91-bf8f-4e82-9f8b-5c50118dee09" containerName="init" Mar 11 12:18:39 crc kubenswrapper[4816]: E0311 12:18:39.852651 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9624e97-8103-4296-b562-982cf05abfec" containerName="init" Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.852660 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9624e97-8103-4296-b562-982cf05abfec" containerName="init" Mar 11 12:18:39 crc kubenswrapper[4816]: E0311 12:18:39.852689 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9624e97-8103-4296-b562-982cf05abfec" containerName="dnsmasq-dns" Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.852700 4816 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="a9624e97-8103-4296-b562-982cf05abfec" containerName="dnsmasq-dns" Mar 11 12:18:39 crc kubenswrapper[4816]: E0311 12:18:39.852717 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1" containerName="glance-httpd" Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.852729 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1" containerName="glance-httpd" Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.852978 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1" containerName="glance-httpd" Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.853001 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="0177cd91-bf8f-4e82-9f8b-5c50118dee09" containerName="init" Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.853025 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1" containerName="glance-log" Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.853039 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9624e97-8103-4296-b562-982cf05abfec" containerName="dnsmasq-dns" Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.854471 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.866240 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.867639 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.869046 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.870845 4816 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.973410 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tthz\" (UniqueName: \"kubernetes.io/projected/a9d3606c-b28d-4028-93fc-535afa127cd6-kube-api-access-2tthz\") pod \"glance-default-internal-api-0\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.973772 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.973819 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9d3606c-b28d-4028-93fc-535afa127cd6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") 
" pod="openstack/glance-default-internal-api-0" Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.973865 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9d3606c-b28d-4028-93fc-535afa127cd6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.973926 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a9d3606c-b28d-4028-93fc-535afa127cd6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.974003 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9d3606c-b28d-4028-93fc-535afa127cd6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.974036 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9d3606c-b28d-4028-93fc-535afa127cd6-logs\") pod \"glance-default-internal-api-0\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.974072 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9d3606c-b28d-4028-93fc-535afa127cd6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: 
\"a9d3606c-b28d-4028-93fc-535afa127cd6\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:18:40 crc kubenswrapper[4816]: I0311 12:18:40.076560 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9d3606c-b28d-4028-93fc-535afa127cd6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:18:40 crc kubenswrapper[4816]: I0311 12:18:40.076620 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tthz\" (UniqueName: \"kubernetes.io/projected/a9d3606c-b28d-4028-93fc-535afa127cd6-kube-api-access-2tthz\") pod \"glance-default-internal-api-0\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:18:40 crc kubenswrapper[4816]: I0311 12:18:40.076654 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:18:40 crc kubenswrapper[4816]: I0311 12:18:40.076704 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9d3606c-b28d-4028-93fc-535afa127cd6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:18:40 crc kubenswrapper[4816]: I0311 12:18:40.076861 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9d3606c-b28d-4028-93fc-535afa127cd6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " 
pod="openstack/glance-default-internal-api-0" Mar 11 12:18:40 crc kubenswrapper[4816]: I0311 12:18:40.076927 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a9d3606c-b28d-4028-93fc-535afa127cd6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:18:40 crc kubenswrapper[4816]: I0311 12:18:40.076951 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9d3606c-b28d-4028-93fc-535afa127cd6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:18:40 crc kubenswrapper[4816]: I0311 12:18:40.076978 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9d3606c-b28d-4028-93fc-535afa127cd6-logs\") pod \"glance-default-internal-api-0\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:18:40 crc kubenswrapper[4816]: I0311 12:18:40.077594 4816 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0" Mar 11 12:18:40 crc kubenswrapper[4816]: I0311 12:18:40.077805 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9d3606c-b28d-4028-93fc-535afa127cd6-logs\") pod \"glance-default-internal-api-0\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:18:40 crc 
kubenswrapper[4816]: I0311 12:18:40.078056 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a9d3606c-b28d-4028-93fc-535afa127cd6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:18:40 crc kubenswrapper[4816]: I0311 12:18:40.083508 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9d3606c-b28d-4028-93fc-535afa127cd6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:18:40 crc kubenswrapper[4816]: I0311 12:18:40.096627 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9d3606c-b28d-4028-93fc-535afa127cd6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:18:40 crc kubenswrapper[4816]: I0311 12:18:40.096982 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9d3606c-b28d-4028-93fc-535afa127cd6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:18:40 crc kubenswrapper[4816]: I0311 12:18:40.099373 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9d3606c-b28d-4028-93fc-535afa127cd6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:18:40 crc kubenswrapper[4816]: I0311 12:18:40.105177 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-2tthz\" (UniqueName: \"kubernetes.io/projected/a9d3606c-b28d-4028-93fc-535afa127cd6-kube-api-access-2tthz\") pod \"glance-default-internal-api-0\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:18:40 crc kubenswrapper[4816]: I0311 12:18:40.116288 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:18:40 crc kubenswrapper[4816]: I0311 12:18:40.162024 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1" path="/var/lib/kubelet/pods/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1/volumes" Mar 11 12:18:40 crc kubenswrapper[4816]: I0311 12:18:40.194269 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 11 12:18:40 crc kubenswrapper[4816]: I0311 12:18:40.666523 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-67754df655-hhz24" podUID="79c46d79-aa47-428c-abec-a6f94c66e9ab" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: i/o timeout" Mar 11 12:18:40 crc kubenswrapper[4816]: I0311 12:18:40.666886 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67754df655-hhz24" Mar 11 12:18:41 crc kubenswrapper[4816]: E0311 12:18:41.447947 4816 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api@sha256:b8a5d052890fb9cefa333baf10b607add227ed5d79aa108b576a97b21e89327a" Mar 11 12:18:41 crc kubenswrapper[4816]: E0311 12:18:41.448202 4816 kuberuntime_manager.go:1274] "Unhandled Error" 
err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api@sha256:b8a5d052890fb9cefa333baf10b607add227ed5d79aa108b576a97b21e89327a,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m6lqc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFr
om:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-4b4ms_openstack(f92c8acc-1a4a-4f28-a123-2f5b8b6905af): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 11 12:18:41 crc kubenswrapper[4816]: E0311 12:18:41.450096 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-4b4ms" podUID="f92c8acc-1a4a-4f28-a123-2f5b8b6905af" Mar 11 12:18:41 crc kubenswrapper[4816]: I0311 12:18:41.522623 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fvdtl" Mar 11 12:18:41 crc kubenswrapper[4816]: I0311 12:18:41.607770 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-config-data\") pod \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\" (UID: \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\") " Mar 11 12:18:41 crc kubenswrapper[4816]: I0311 12:18:41.607899 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-fernet-keys\") pod \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\" (UID: \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\") " Mar 11 12:18:41 crc kubenswrapper[4816]: I0311 12:18:41.607963 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-combined-ca-bundle\") pod \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\" (UID: \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\") " Mar 11 12:18:41 crc kubenswrapper[4816]: I0311 
12:18:41.608055 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-scripts\") pod \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\" (UID: \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\") " Mar 11 12:18:41 crc kubenswrapper[4816]: I0311 12:18:41.608153 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44z2c\" (UniqueName: \"kubernetes.io/projected/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-kube-api-access-44z2c\") pod \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\" (UID: \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\") " Mar 11 12:18:41 crc kubenswrapper[4816]: I0311 12:18:41.608176 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-credential-keys\") pod \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\" (UID: \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\") " Mar 11 12:18:41 crc kubenswrapper[4816]: I0311 12:18:41.615350 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-scripts" (OuterVolumeSpecName: "scripts") pod "e391eaa0-5fb5-4ab1-a9a5-b480703f8b85" (UID: "e391eaa0-5fb5-4ab1-a9a5-b480703f8b85"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:18:41 crc kubenswrapper[4816]: I0311 12:18:41.616264 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e391eaa0-5fb5-4ab1-a9a5-b480703f8b85" (UID: "e391eaa0-5fb5-4ab1-a9a5-b480703f8b85"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:18:41 crc kubenswrapper[4816]: I0311 12:18:41.616664 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-kube-api-access-44z2c" (OuterVolumeSpecName: "kube-api-access-44z2c") pod "e391eaa0-5fb5-4ab1-a9a5-b480703f8b85" (UID: "e391eaa0-5fb5-4ab1-a9a5-b480703f8b85"). InnerVolumeSpecName "kube-api-access-44z2c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:18:41 crc kubenswrapper[4816]: I0311 12:18:41.635813 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e391eaa0-5fb5-4ab1-a9a5-b480703f8b85" (UID: "e391eaa0-5fb5-4ab1-a9a5-b480703f8b85"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:18:41 crc kubenswrapper[4816]: I0311 12:18:41.641073 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e391eaa0-5fb5-4ab1-a9a5-b480703f8b85" (UID: "e391eaa0-5fb5-4ab1-a9a5-b480703f8b85"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:18:41 crc kubenswrapper[4816]: I0311 12:18:41.645576 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-config-data" (OuterVolumeSpecName: "config-data") pod "e391eaa0-5fb5-4ab1-a9a5-b480703f8b85" (UID: "e391eaa0-5fb5-4ab1-a9a5-b480703f8b85"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:18:41 crc kubenswrapper[4816]: I0311 12:18:41.710518 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:41 crc kubenswrapper[4816]: I0311 12:18:41.710555 4816 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:41 crc kubenswrapper[4816]: I0311 12:18:41.710565 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:41 crc kubenswrapper[4816]: I0311 12:18:41.710607 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:41 crc kubenswrapper[4816]: I0311 12:18:41.710618 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44z2c\" (UniqueName: \"kubernetes.io/projected/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-kube-api-access-44z2c\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:41 crc kubenswrapper[4816]: I0311 12:18:41.710628 4816 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:41 crc kubenswrapper[4816]: I0311 12:18:41.747226 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-fvdtl" Mar 11 12:18:41 crc kubenswrapper[4816]: I0311 12:18:41.747226 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fvdtl" event={"ID":"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85","Type":"ContainerDied","Data":"6cce3dedf2f34ff6f1ab7cb71e5194415d6238bd65c1baa92383feb742726164"} Mar 11 12:18:41 crc kubenswrapper[4816]: I0311 12:18:41.747300 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6cce3dedf2f34ff6f1ab7cb71e5194415d6238bd65c1baa92383feb742726164" Mar 11 12:18:41 crc kubenswrapper[4816]: E0311 12:18:41.750487 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api@sha256:b8a5d052890fb9cefa333baf10b607add227ed5d79aa108b576a97b21e89327a\\\"\"" pod="openstack/placement-db-sync-4b4ms" podUID="f92c8acc-1a4a-4f28-a123-2f5b8b6905af" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.476756 4816 scope.go:117] "RemoveContainer" containerID="391e53767cf17b2f58d4934d1239e1639cc3728d74480149f45e20668d71d7ee" Mar 11 12:18:42 crc kubenswrapper[4816]: E0311 12:18:42.499443 4816 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:1240a45aec9c3e1599be762c5565556560849b49fd39c7283b8e5519dcaa501a" Mar 11 12:18:42 crc kubenswrapper[4816]: E0311 12:18:42.499600 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:1240a45aec9c3e1599be762c5565556560849b49fd39c7283b8e5519dcaa501a,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5zq67,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-rjxsf_openstack(c643aa04-ce8d-4c3b-befc-ecdf63e35de8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 11 12:18:42 crc kubenswrapper[4816]: E0311 12:18:42.501376 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-rjxsf" 
podUID="c643aa04-ce8d-4c3b-befc-ecdf63e35de8" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.662887 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.682710 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67754df655-hhz24" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.732914 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-fvdtl"] Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.736765 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfc2c\" (UniqueName: \"kubernetes.io/projected/d7206357-ec52-4320-b659-a027694a74a9-kube-api-access-cfc2c\") pod \"d7206357-ec52-4320-b659-a027694a74a9\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.736918 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"d7206357-ec52-4320-b659-a027694a74a9\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.737119 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d7206357-ec52-4320-b659-a027694a74a9-httpd-run\") pod \"d7206357-ec52-4320-b659-a027694a74a9\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.737188 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7206357-ec52-4320-b659-a027694a74a9-combined-ca-bundle\") pod \"d7206357-ec52-4320-b659-a027694a74a9\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " Mar 11 12:18:42 crc 
kubenswrapper[4816]: I0311 12:18:42.737358 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7206357-ec52-4320-b659-a027694a74a9-logs\") pod \"d7206357-ec52-4320-b659-a027694a74a9\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.737408 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7206357-ec52-4320-b659-a027694a74a9-scripts\") pod \"d7206357-ec52-4320-b659-a027694a74a9\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.737475 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7206357-ec52-4320-b659-a027694a74a9-config-data\") pod \"d7206357-ec52-4320-b659-a027694a74a9\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.739930 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7206357-ec52-4320-b659-a027694a74a9-logs" (OuterVolumeSpecName: "logs") pod "d7206357-ec52-4320-b659-a027694a74a9" (UID: "d7206357-ec52-4320-b659-a027694a74a9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.743839 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-fvdtl"] Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.745562 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7206357-ec52-4320-b659-a027694a74a9-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d7206357-ec52-4320-b659-a027694a74a9" (UID: "d7206357-ec52-4320-b659-a027694a74a9"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.755253 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7206357-ec52-4320-b659-a027694a74a9-scripts" (OuterVolumeSpecName: "scripts") pod "d7206357-ec52-4320-b659-a027694a74a9" (UID: "d7206357-ec52-4320-b659-a027694a74a9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.755524 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "d7206357-ec52-4320-b659-a027694a74a9" (UID: "d7206357-ec52-4320-b659-a027694a74a9"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.779936 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7206357-ec52-4320-b659-a027694a74a9-kube-api-access-cfc2c" (OuterVolumeSpecName: "kube-api-access-cfc2c") pod "d7206357-ec52-4320-b659-a027694a74a9" (UID: "d7206357-ec52-4320-b659-a027694a74a9"). InnerVolumeSpecName "kube-api-access-cfc2c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.780620 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7206357-ec52-4320-b659-a027694a74a9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d7206357-ec52-4320-b659-a027694a74a9" (UID: "d7206357-ec52-4320-b659-a027694a74a9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.828955 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d7206357-ec52-4320-b659-a027694a74a9","Type":"ContainerDied","Data":"46c804665ae23bbf6170282e95342e08c9ca8c59a8646dc8a3bb729ae4357ed1"} Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.829013 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.829062 4816 scope.go:117] "RemoveContainer" containerID="e5a70320bc6c70cd367bbbc4bdaabb37281153f62ac1cd7c0287d723080fe33d" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.843320 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-dns-svc\") pod \"79c46d79-aa47-428c-abec-a6f94c66e9ab\" (UID: \"79c46d79-aa47-428c-abec-a6f94c66e9ab\") " Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.843488 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-ovsdbserver-nb\") pod \"79c46d79-aa47-428c-abec-a6f94c66e9ab\" (UID: \"79c46d79-aa47-428c-abec-a6f94c66e9ab\") " Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.843536 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-dns-swift-storage-0\") pod \"79c46d79-aa47-428c-abec-a6f94c66e9ab\" (UID: \"79c46d79-aa47-428c-abec-a6f94c66e9ab\") " Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.843606 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-ovsdbserver-sb\") pod \"79c46d79-aa47-428c-abec-a6f94c66e9ab\" (UID: \"79c46d79-aa47-428c-abec-a6f94c66e9ab\") " Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.843674 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9csz\" (UniqueName: \"kubernetes.io/projected/79c46d79-aa47-428c-abec-a6f94c66e9ab-kube-api-access-g9csz\") pod \"79c46d79-aa47-428c-abec-a6f94c66e9ab\" (UID: \"79c46d79-aa47-428c-abec-a6f94c66e9ab\") " Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.843735 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-config\") pod \"79c46d79-aa47-428c-abec-a6f94c66e9ab\" (UID: \"79c46d79-aa47-428c-abec-a6f94c66e9ab\") " Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.844144 4816 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7206357-ec52-4320-b659-a027694a74a9-logs\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.844162 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7206357-ec52-4320-b659-a027694a74a9-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.844171 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfc2c\" (UniqueName: \"kubernetes.io/projected/d7206357-ec52-4320-b659-a027694a74a9-kube-api-access-cfc2c\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.844197 4816 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.844207 4816 
reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d7206357-ec52-4320-b659-a027694a74a9-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.844216 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7206357-ec52-4320-b659-a027694a74a9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.845933 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-w8rqc"] Mar 11 12:18:42 crc kubenswrapper[4816]: E0311 12:18:42.846674 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79c46d79-aa47-428c-abec-a6f94c66e9ab" containerName="init" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.846698 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="79c46d79-aa47-428c-abec-a6f94c66e9ab" containerName="init" Mar 11 12:18:42 crc kubenswrapper[4816]: E0311 12:18:42.846713 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79c46d79-aa47-428c-abec-a6f94c66e9ab" containerName="dnsmasq-dns" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.846721 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="79c46d79-aa47-428c-abec-a6f94c66e9ab" containerName="dnsmasq-dns" Mar 11 12:18:42 crc kubenswrapper[4816]: E0311 12:18:42.846738 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7206357-ec52-4320-b659-a027694a74a9" containerName="glance-httpd" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.846745 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7206357-ec52-4320-b659-a027694a74a9" containerName="glance-httpd" Mar 11 12:18:42 crc kubenswrapper[4816]: E0311 12:18:42.846754 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e391eaa0-5fb5-4ab1-a9a5-b480703f8b85" containerName="keystone-bootstrap" Mar 11 12:18:42 
crc kubenswrapper[4816]: I0311 12:18:42.846762 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="e391eaa0-5fb5-4ab1-a9a5-b480703f8b85" containerName="keystone-bootstrap" Mar 11 12:18:42 crc kubenswrapper[4816]: E0311 12:18:42.846777 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7206357-ec52-4320-b659-a027694a74a9" containerName="glance-log" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.846784 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7206357-ec52-4320-b659-a027694a74a9" containerName="glance-log" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.847451 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7206357-ec52-4320-b659-a027694a74a9" containerName="glance-log" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.847477 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7206357-ec52-4320-b659-a027694a74a9" containerName="glance-httpd" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.847534 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="79c46d79-aa47-428c-abec-a6f94c66e9ab" containerName="dnsmasq-dns" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.847554 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="e391eaa0-5fb5-4ab1-a9a5-b480703f8b85" containerName="keystone-bootstrap" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.848425 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-w8rqc" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.848695 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67754df655-hhz24" event={"ID":"79c46d79-aa47-428c-abec-a6f94c66e9ab","Type":"ContainerDied","Data":"81fe766129ca8c4e2e7c56bfe354248ee38599d3499418773992e5736bf81df2"} Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.848717 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67754df655-hhz24" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.849201 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79c46d79-aa47-428c-abec-a6f94c66e9ab-kube-api-access-g9csz" (OuterVolumeSpecName: "kube-api-access-g9csz") pod "79c46d79-aa47-428c-abec-a6f94c66e9ab" (UID: "79c46d79-aa47-428c-abec-a6f94c66e9ab"). InnerVolumeSpecName "kube-api-access-g9csz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.852282 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.853921 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.854035 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-8k5jj" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.854061 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.854390 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.856797 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-w8rqc"] Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.868035 4816 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.871382 4816 scope.go:117] "RemoveContainer" containerID="c0c07679dd61d87d3bdfa74e26aa8259f34fe50f83fa1b2abc4b014796093496" Mar 11 12:18:42 crc kubenswrapper[4816]: E0311 12:18:42.871522 4816 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:1240a45aec9c3e1599be762c5565556560849b49fd39c7283b8e5519dcaa501a\\\"\"" pod="openstack/barbican-db-sync-rjxsf" podUID="c643aa04-ce8d-4c3b-befc-ecdf63e35de8" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.884146 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7206357-ec52-4320-b659-a027694a74a9-config-data" (OuterVolumeSpecName: "config-data") pod "d7206357-ec52-4320-b659-a027694a74a9" (UID: "d7206357-ec52-4320-b659-a027694a74a9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.911931 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "79c46d79-aa47-428c-abec-a6f94c66e9ab" (UID: "79c46d79-aa47-428c-abec-a6f94c66e9ab"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.915045 4816 scope.go:117] "RemoveContainer" containerID="e929e8e02a375fef6457bbafa642c02c68d821bc19103c5cffa50d761cc569e2" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.925979 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "79c46d79-aa47-428c-abec-a6f94c66e9ab" (UID: "79c46d79-aa47-428c-abec-a6f94c66e9ab"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.926788 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "79c46d79-aa47-428c-abec-a6f94c66e9ab" (UID: "79c46d79-aa47-428c-abec-a6f94c66e9ab"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.932792 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-config" (OuterVolumeSpecName: "config") pod "79c46d79-aa47-428c-abec-a6f94c66e9ab" (UID: "79c46d79-aa47-428c-abec-a6f94c66e9ab"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.941985 4816 scope.go:117] "RemoveContainer" containerID="7cfd948e58ca0b33af11396daaf98403ef86af1a5fd0724d0ce0200e144ab4fe" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.946039 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "79c46d79-aa47-428c-abec-a6f94c66e9ab" (UID: "79c46d79-aa47-428c-abec-a6f94c66e9ab"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.946662 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-credential-keys\") pod \"keystone-bootstrap-w8rqc\" (UID: \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\") " pod="openstack/keystone-bootstrap-w8rqc" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.946737 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-scripts\") pod \"keystone-bootstrap-w8rqc\" (UID: \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\") " pod="openstack/keystone-bootstrap-w8rqc" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.946930 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-config-data\") pod \"keystone-bootstrap-w8rqc\" (UID: \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\") " pod="openstack/keystone-bootstrap-w8rqc" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.947136 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-combined-ca-bundle\") pod \"keystone-bootstrap-w8rqc\" (UID: \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\") " pod="openstack/keystone-bootstrap-w8rqc" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.947568 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdq6q\" (UniqueName: \"kubernetes.io/projected/09ce1ef6-fcd0-4182-afca-22c5892b48e2-kube-api-access-bdq6q\") pod \"keystone-bootstrap-w8rqc\" (UID: 
\"09ce1ef6-fcd0-4182-afca-22c5892b48e2\") " pod="openstack/keystone-bootstrap-w8rqc" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.947722 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-fernet-keys\") pod \"keystone-bootstrap-w8rqc\" (UID: \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\") " pod="openstack/keystone-bootstrap-w8rqc" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.947975 4816 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.948025 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.948040 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9csz\" (UniqueName: \"kubernetes.io/projected/79c46d79-aa47-428c-abec-a6f94c66e9ab-kube-api-access-g9csz\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.948053 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.948065 4816 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.948077 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d7206357-ec52-4320-b659-a027694a74a9-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.948107 4816 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.948118 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.050000 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdq6q\" (UniqueName: \"kubernetes.io/projected/09ce1ef6-fcd0-4182-afca-22c5892b48e2-kube-api-access-bdq6q\") pod \"keystone-bootstrap-w8rqc\" (UID: \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\") " pod="openstack/keystone-bootstrap-w8rqc" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.050089 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-fernet-keys\") pod \"keystone-bootstrap-w8rqc\" (UID: \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\") " pod="openstack/keystone-bootstrap-w8rqc" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.050170 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-credential-keys\") pod \"keystone-bootstrap-w8rqc\" (UID: \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\") " pod="openstack/keystone-bootstrap-w8rqc" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.050238 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-scripts\") pod \"keystone-bootstrap-w8rqc\" (UID: \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\") " pod="openstack/keystone-bootstrap-w8rqc" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.050302 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-config-data\") pod \"keystone-bootstrap-w8rqc\" (UID: \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\") " pod="openstack/keystone-bootstrap-w8rqc" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.050347 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-combined-ca-bundle\") pod \"keystone-bootstrap-w8rqc\" (UID: \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\") " pod="openstack/keystone-bootstrap-w8rqc" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.054754 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-credential-keys\") pod \"keystone-bootstrap-w8rqc\" (UID: \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\") " pod="openstack/keystone-bootstrap-w8rqc" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.056411 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-scripts\") pod \"keystone-bootstrap-w8rqc\" (UID: \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\") " pod="openstack/keystone-bootstrap-w8rqc" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.057050 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-fernet-keys\") pod \"keystone-bootstrap-w8rqc\" (UID: 
\"09ce1ef6-fcd0-4182-afca-22c5892b48e2\") " pod="openstack/keystone-bootstrap-w8rqc" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.057064 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-combined-ca-bundle\") pod \"keystone-bootstrap-w8rqc\" (UID: \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\") " pod="openstack/keystone-bootstrap-w8rqc" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.057668 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-config-data\") pod \"keystone-bootstrap-w8rqc\" (UID: \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\") " pod="openstack/keystone-bootstrap-w8rqc" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.073827 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdq6q\" (UniqueName: \"kubernetes.io/projected/09ce1ef6-fcd0-4182-afca-22c5892b48e2-kube-api-access-bdq6q\") pod \"keystone-bootstrap-w8rqc\" (UID: \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\") " pod="openstack/keystone-bootstrap-w8rqc" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.090549 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 12:18:43 crc kubenswrapper[4816]: W0311 12:18:43.095230 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9d3606c_b28d_4028_93fc_535afa127cd6.slice/crio-ccc820b0417c4ece231f5070aebe453d4f5f6552e1d188623620714789da98ed WatchSource:0}: Error finding container ccc820b0417c4ece231f5070aebe453d4f5f6552e1d188623620714789da98ed: Status 404 returned error can't find the container with id ccc820b0417c4ece231f5070aebe453d4f5f6552e1d188623620714789da98ed Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.182068 4816 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.206901 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-w8rqc" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.268605 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.288037 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.290839 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.293167 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.296149 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.311097 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.334392 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67754df655-hhz24"] Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.342727 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67754df655-hhz24"] Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.399663 4816 scope.go:117] "RemoveContainer" containerID="258e8c83fc2dd9e9c165c147a3085d310c4de5d771038f237098b4b3a09178a8" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.475612 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/439b686e-927d-425a-a218-807220ae1e95-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.475694 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlzfp\" (UniqueName: \"kubernetes.io/projected/439b686e-927d-425a-a218-807220ae1e95-kube-api-access-mlzfp\") pod \"glance-default-external-api-0\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.475737 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/439b686e-927d-425a-a218-807220ae1e95-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.475771 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/439b686e-927d-425a-a218-807220ae1e95-config-data\") pod \"glance-default-external-api-0\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.476982 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/439b686e-927d-425a-a218-807220ae1e95-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.477156 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/439b686e-927d-425a-a218-807220ae1e95-logs\") pod \"glance-default-external-api-0\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.477576 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.477641 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/439b686e-927d-425a-a218-807220ae1e95-scripts\") pod \"glance-default-external-api-0\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.579712 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/439b686e-927d-425a-a218-807220ae1e95-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.579789 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/439b686e-927d-425a-a218-807220ae1e95-logs\") pod \"glance-default-external-api-0\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.579862 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.579879 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/439b686e-927d-425a-a218-807220ae1e95-scripts\") pod \"glance-default-external-api-0\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.579964 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/439b686e-927d-425a-a218-807220ae1e95-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.580015 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlzfp\" (UniqueName: \"kubernetes.io/projected/439b686e-927d-425a-a218-807220ae1e95-kube-api-access-mlzfp\") pod \"glance-default-external-api-0\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.580050 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/439b686e-927d-425a-a218-807220ae1e95-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.580079 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/439b686e-927d-425a-a218-807220ae1e95-config-data\") pod \"glance-default-external-api-0\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.580388 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/439b686e-927d-425a-a218-807220ae1e95-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.580434 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/439b686e-927d-425a-a218-807220ae1e95-logs\") pod \"glance-default-external-api-0\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.581010 4816 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.587180 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/439b686e-927d-425a-a218-807220ae1e95-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.587581 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/439b686e-927d-425a-a218-807220ae1e95-public-tls-certs\") pod 
\"glance-default-external-api-0\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.588592 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/439b686e-927d-425a-a218-807220ae1e95-config-data\") pod \"glance-default-external-api-0\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.596141 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/439b686e-927d-425a-a218-807220ae1e95-scripts\") pod \"glance-default-external-api-0\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.601092 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlzfp\" (UniqueName: \"kubernetes.io/projected/439b686e-927d-425a-a218-807220ae1e95-kube-api-access-mlzfp\") pod \"glance-default-external-api-0\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.607957 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.613835 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: W0311 12:18:43.737259 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09ce1ef6_fcd0_4182_afca_22c5892b48e2.slice/crio-fb98e4336d682d875b7e94b2c10df2c35624f0d26db0436c3d5f3cac73ca09c5 WatchSource:0}: Error finding container fb98e4336d682d875b7e94b2c10df2c35624f0d26db0436c3d5f3cac73ca09c5: Status 404 returned error can't find the container with id fb98e4336d682d875b7e94b2c10df2c35624f0d26db0436c3d5f3cac73ca09c5 Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.751669 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-w8rqc"] Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.864736 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-w8rqc" event={"ID":"09ce1ef6-fcd0-4182-afca-22c5892b48e2","Type":"ContainerStarted","Data":"fb98e4336d682d875b7e94b2c10df2c35624f0d26db0436c3d5f3cac73ca09c5"} Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.869442 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a9d3606c-b28d-4028-93fc-535afa127cd6","Type":"ContainerStarted","Data":"ccc820b0417c4ece231f5070aebe453d4f5f6552e1d188623620714789da98ed"} Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.876468 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ebe3f2a-5719-412c-8803-15e1bec74523","Type":"ContainerStarted","Data":"2e2e079065db719aeee528343ea5a717b2f18beff4bcac65acd806fa3d456edf"} Mar 11 12:18:44 crc kubenswrapper[4816]: I0311 12:18:44.146846 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79c46d79-aa47-428c-abec-a6f94c66e9ab" path="/var/lib/kubelet/pods/79c46d79-aa47-428c-abec-a6f94c66e9ab/volumes" Mar 11 12:18:44 crc kubenswrapper[4816]: I0311 
12:18:44.147768 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7206357-ec52-4320-b659-a027694a74a9" path="/var/lib/kubelet/pods/d7206357-ec52-4320-b659-a027694a74a9/volumes" Mar 11 12:18:44 crc kubenswrapper[4816]: I0311 12:18:44.148742 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e391eaa0-5fb5-4ab1-a9a5-b480703f8b85" path="/var/lib/kubelet/pods/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85/volumes" Mar 11 12:18:44 crc kubenswrapper[4816]: I0311 12:18:44.222936 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 12:18:44 crc kubenswrapper[4816]: W0311 12:18:44.606507 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod439b686e_927d_425a_a218_807220ae1e95.slice/crio-d6a32b27dcd7e08e03a755df62f2d58811a9c80acc32eb96770a7186a1ec069d WatchSource:0}: Error finding container d6a32b27dcd7e08e03a755df62f2d58811a9c80acc32eb96770a7186a1ec069d: Status 404 returned error can't find the container with id d6a32b27dcd7e08e03a755df62f2d58811a9c80acc32eb96770a7186a1ec069d Mar 11 12:18:44 crc kubenswrapper[4816]: I0311 12:18:44.908277 4816 generic.go:334] "Generic (PLEG): container finished" podID="3ae20611-891b-49ee-b5b8-0dad8af80906" containerID="315146a94731475a01dc83fd91cffc1dc07e3b8364e3b5f9f4c74f1dffcbe0c4" exitCode=0 Mar 11 12:18:44 crc kubenswrapper[4816]: I0311 12:18:44.908697 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-tdv64" event={"ID":"3ae20611-891b-49ee-b5b8-0dad8af80906","Type":"ContainerDied","Data":"315146a94731475a01dc83fd91cffc1dc07e3b8364e3b5f9f4c74f1dffcbe0c4"} Mar 11 12:18:44 crc kubenswrapper[4816]: I0311 12:18:44.927284 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"439b686e-927d-425a-a218-807220ae1e95","Type":"ContainerStarted","Data":"d6a32b27dcd7e08e03a755df62f2d58811a9c80acc32eb96770a7186a1ec069d"} Mar 11 12:18:44 crc kubenswrapper[4816]: I0311 12:18:44.963973 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-w8rqc" event={"ID":"09ce1ef6-fcd0-4182-afca-22c5892b48e2","Type":"ContainerStarted","Data":"619b2be9e8a3f61b134d163bc3ebb4105259f3d6eadad7ea8f76de2333bbeac4"} Mar 11 12:18:44 crc kubenswrapper[4816]: I0311 12:18:44.968797 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a9d3606c-b28d-4028-93fc-535afa127cd6","Type":"ContainerStarted","Data":"44fdfe2aaa7bb00189c2e1708c4de4cb552c7330addf05e8997e655317268e15"} Mar 11 12:18:44 crc kubenswrapper[4816]: I0311 12:18:44.968843 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a9d3606c-b28d-4028-93fc-535afa127cd6","Type":"ContainerStarted","Data":"d9c605fc0632b2e2c60468a879a920939f818109ba64880a57f9f9b475f1614b"} Mar 11 12:18:44 crc kubenswrapper[4816]: I0311 12:18:44.988398 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-w8rqc" podStartSLOduration=2.9883773319999998 podStartE2EDuration="2.988377332s" podCreationTimestamp="2026-03-11 12:18:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:18:44.987247229 +0000 UTC m=+1211.578511186" watchObservedRunningTime="2026-03-11 12:18:44.988377332 +0000 UTC m=+1211.579641299" Mar 11 12:18:45 crc kubenswrapper[4816]: I0311 12:18:45.667680 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-67754df655-hhz24" podUID="79c46d79-aa47-428c-abec-a6f94c66e9ab" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: i/o timeout" Mar 11 12:18:46 crc 
kubenswrapper[4816]: I0311 12:18:46.024146 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"439b686e-927d-425a-a218-807220ae1e95","Type":"ContainerStarted","Data":"4b5cec87927ba388b30feb742e4d193b529502bf6a8355ed2d02b5d41c560b67"} Mar 11 12:18:46 crc kubenswrapper[4816]: I0311 12:18:46.030577 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ebe3f2a-5719-412c-8803-15e1bec74523","Type":"ContainerStarted","Data":"785459bf5361b538fca731d5c9459763253d45826f0befba834e333e7e6a0dde"} Mar 11 12:18:46 crc kubenswrapper[4816]: I0311 12:18:46.389822 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-tdv64" Mar 11 12:18:46 crc kubenswrapper[4816]: I0311 12:18:46.409789 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.409770955 podStartE2EDuration="7.409770955s" podCreationTimestamp="2026-03-11 12:18:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:18:45.022572336 +0000 UTC m=+1211.613836303" watchObservedRunningTime="2026-03-11 12:18:46.409770955 +0000 UTC m=+1213.001034922" Mar 11 12:18:46 crc kubenswrapper[4816]: I0311 12:18:46.576958 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3ae20611-891b-49ee-b5b8-0dad8af80906-config\") pod \"3ae20611-891b-49ee-b5b8-0dad8af80906\" (UID: \"3ae20611-891b-49ee-b5b8-0dad8af80906\") " Mar 11 12:18:46 crc kubenswrapper[4816]: I0311 12:18:46.577063 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ae20611-891b-49ee-b5b8-0dad8af80906-combined-ca-bundle\") pod \"3ae20611-891b-49ee-b5b8-0dad8af80906\" (UID: 
\"3ae20611-891b-49ee-b5b8-0dad8af80906\") " Mar 11 12:18:46 crc kubenswrapper[4816]: I0311 12:18:46.577132 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8z8g\" (UniqueName: \"kubernetes.io/projected/3ae20611-891b-49ee-b5b8-0dad8af80906-kube-api-access-z8z8g\") pod \"3ae20611-891b-49ee-b5b8-0dad8af80906\" (UID: \"3ae20611-891b-49ee-b5b8-0dad8af80906\") " Mar 11 12:18:46 crc kubenswrapper[4816]: I0311 12:18:46.586304 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ae20611-891b-49ee-b5b8-0dad8af80906-kube-api-access-z8z8g" (OuterVolumeSpecName: "kube-api-access-z8z8g") pod "3ae20611-891b-49ee-b5b8-0dad8af80906" (UID: "3ae20611-891b-49ee-b5b8-0dad8af80906"). InnerVolumeSpecName "kube-api-access-z8z8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:18:46 crc kubenswrapper[4816]: I0311 12:18:46.608789 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ae20611-891b-49ee-b5b8-0dad8af80906-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ae20611-891b-49ee-b5b8-0dad8af80906" (UID: "3ae20611-891b-49ee-b5b8-0dad8af80906"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:18:46 crc kubenswrapper[4816]: I0311 12:18:46.622982 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ae20611-891b-49ee-b5b8-0dad8af80906-config" (OuterVolumeSpecName: "config") pod "3ae20611-891b-49ee-b5b8-0dad8af80906" (UID: "3ae20611-891b-49ee-b5b8-0dad8af80906"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:18:46 crc kubenswrapper[4816]: I0311 12:18:46.680140 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/3ae20611-891b-49ee-b5b8-0dad8af80906-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:46 crc kubenswrapper[4816]: I0311 12:18:46.680192 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ae20611-891b-49ee-b5b8-0dad8af80906-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:46 crc kubenswrapper[4816]: I0311 12:18:46.680209 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8z8g\" (UniqueName: \"kubernetes.io/projected/3ae20611-891b-49ee-b5b8-0dad8af80906-kube-api-access-z8z8g\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.042548 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"439b686e-927d-425a-a218-807220ae1e95","Type":"ContainerStarted","Data":"ecaa5276e0e1e71d262bf64a26711871fc4a429158857be7af8e6465f4bd05ea"} Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.047033 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-tdv64" event={"ID":"3ae20611-891b-49ee-b5b8-0dad8af80906","Type":"ContainerDied","Data":"407febd9600a7f2ac248ee17af289c238ad46396cc3e57692031c09ed1d62622"} Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.047082 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="407febd9600a7f2ac248ee17af289c238ad46396cc3e57692031c09ed1d62622" Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.047085 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-tdv64" Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.077920 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.077899755 podStartE2EDuration="4.077899755s" podCreationTimestamp="2026-03-11 12:18:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:18:47.076735401 +0000 UTC m=+1213.667999388" watchObservedRunningTime="2026-03-11 12:18:47.077899755 +0000 UTC m=+1213.669163722" Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.189190 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d67d65cb9-2w84f"] Mar 11 12:18:47 crc kubenswrapper[4816]: E0311 12:18:47.189624 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ae20611-891b-49ee-b5b8-0dad8af80906" containerName="neutron-db-sync" Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.189642 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ae20611-891b-49ee-b5b8-0dad8af80906" containerName="neutron-db-sync" Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.189810 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ae20611-891b-49ee-b5b8-0dad8af80906" containerName="neutron-db-sync" Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.190757 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f" Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.208852 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d67d65cb9-2w84f"] Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.295551 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-9df8757bb-rzb52"] Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.300820 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9df8757bb-rzb52" Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.301728 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-ovsdbserver-sb\") pod \"dnsmasq-dns-6d67d65cb9-2w84f\" (UID: \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\") " pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f" Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.301994 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-dns-svc\") pod \"dnsmasq-dns-6d67d65cb9-2w84f\" (UID: \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\") " pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f" Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.302237 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-config\") pod \"dnsmasq-dns-6d67d65cb9-2w84f\" (UID: \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\") " pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f" Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.302348 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqnqd\" (UniqueName: 
\"kubernetes.io/projected/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-kube-api-access-mqnqd\") pod \"dnsmasq-dns-6d67d65cb9-2w84f\" (UID: \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\") " pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f" Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.302380 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-dns-swift-storage-0\") pod \"dnsmasq-dns-6d67d65cb9-2w84f\" (UID: \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\") " pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f" Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.302456 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-ovsdbserver-nb\") pod \"dnsmasq-dns-6d67d65cb9-2w84f\" (UID: \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\") " pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f" Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.303648 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.303915 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-48x47" Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.304049 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.320873 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.371956 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-9df8757bb-rzb52"] Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.404696 4816 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-dns-svc\") pod \"dnsmasq-dns-6d67d65cb9-2w84f\" (UID: \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\") " pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f" Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.404769 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/68498f16-b5c3-4960-8565-7ae628fc3122-config\") pod \"neutron-9df8757bb-rzb52\" (UID: \"68498f16-b5c3-4960-8565-7ae628fc3122\") " pod="openstack/neutron-9df8757bb-rzb52" Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.404849 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9clks\" (UniqueName: \"kubernetes.io/projected/68498f16-b5c3-4960-8565-7ae628fc3122-kube-api-access-9clks\") pod \"neutron-9df8757bb-rzb52\" (UID: \"68498f16-b5c3-4960-8565-7ae628fc3122\") " pod="openstack/neutron-9df8757bb-rzb52" Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.404872 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68498f16-b5c3-4960-8565-7ae628fc3122-combined-ca-bundle\") pod \"neutron-9df8757bb-rzb52\" (UID: \"68498f16-b5c3-4960-8565-7ae628fc3122\") " pod="openstack/neutron-9df8757bb-rzb52" Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.404964 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/68498f16-b5c3-4960-8565-7ae628fc3122-httpd-config\") pod \"neutron-9df8757bb-rzb52\" (UID: \"68498f16-b5c3-4960-8565-7ae628fc3122\") " pod="openstack/neutron-9df8757bb-rzb52" Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.405025 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-config\") pod \"dnsmasq-dns-6d67d65cb9-2w84f\" (UID: \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\") " pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f" Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.405057 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqnqd\" (UniqueName: \"kubernetes.io/projected/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-kube-api-access-mqnqd\") pod \"dnsmasq-dns-6d67d65cb9-2w84f\" (UID: \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\") " pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f" Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.406103 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-config\") pod \"dnsmasq-dns-6d67d65cb9-2w84f\" (UID: \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\") " pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f" Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.406169 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-dns-swift-storage-0\") pod \"dnsmasq-dns-6d67d65cb9-2w84f\" (UID: \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\") " pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f" Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.406215 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-ovsdbserver-nb\") pod \"dnsmasq-dns-6d67d65cb9-2w84f\" (UID: \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\") " pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f" Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.406234 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-dns-svc\") pod \"dnsmasq-dns-6d67d65cb9-2w84f\" (UID: \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\") " pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f" Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.406264 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/68498f16-b5c3-4960-8565-7ae628fc3122-ovndb-tls-certs\") pod \"neutron-9df8757bb-rzb52\" (UID: \"68498f16-b5c3-4960-8565-7ae628fc3122\") " pod="openstack/neutron-9df8757bb-rzb52" Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.406973 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-ovsdbserver-sb\") pod \"dnsmasq-dns-6d67d65cb9-2w84f\" (UID: \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\") " pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f" Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.406991 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-ovsdbserver-nb\") pod \"dnsmasq-dns-6d67d65cb9-2w84f\" (UID: \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\") " pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f" Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.407064 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-dns-swift-storage-0\") pod \"dnsmasq-dns-6d67d65cb9-2w84f\" (UID: \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\") " pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f" Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.408223 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-ovsdbserver-sb\") pod \"dnsmasq-dns-6d67d65cb9-2w84f\" (UID: \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\") " pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f" Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.444574 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqnqd\" (UniqueName: \"kubernetes.io/projected/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-kube-api-access-mqnqd\") pod \"dnsmasq-dns-6d67d65cb9-2w84f\" (UID: \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\") " pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f" Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.508736 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9clks\" (UniqueName: \"kubernetes.io/projected/68498f16-b5c3-4960-8565-7ae628fc3122-kube-api-access-9clks\") pod \"neutron-9df8757bb-rzb52\" (UID: \"68498f16-b5c3-4960-8565-7ae628fc3122\") " pod="openstack/neutron-9df8757bb-rzb52" Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.508789 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68498f16-b5c3-4960-8565-7ae628fc3122-combined-ca-bundle\") pod \"neutron-9df8757bb-rzb52\" (UID: \"68498f16-b5c3-4960-8565-7ae628fc3122\") " pod="openstack/neutron-9df8757bb-rzb52" Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.508823 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/68498f16-b5c3-4960-8565-7ae628fc3122-httpd-config\") pod \"neutron-9df8757bb-rzb52\" (UID: \"68498f16-b5c3-4960-8565-7ae628fc3122\") " pod="openstack/neutron-9df8757bb-rzb52" Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.508883 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/68498f16-b5c3-4960-8565-7ae628fc3122-ovndb-tls-certs\") pod \"neutron-9df8757bb-rzb52\" (UID: \"68498f16-b5c3-4960-8565-7ae628fc3122\") " pod="openstack/neutron-9df8757bb-rzb52" Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.508954 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/68498f16-b5c3-4960-8565-7ae628fc3122-config\") pod \"neutron-9df8757bb-rzb52\" (UID: \"68498f16-b5c3-4960-8565-7ae628fc3122\") " pod="openstack/neutron-9df8757bb-rzb52" Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.514043 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/68498f16-b5c3-4960-8565-7ae628fc3122-config\") pod \"neutron-9df8757bb-rzb52\" (UID: \"68498f16-b5c3-4960-8565-7ae628fc3122\") " pod="openstack/neutron-9df8757bb-rzb52" Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.515982 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/68498f16-b5c3-4960-8565-7ae628fc3122-httpd-config\") pod \"neutron-9df8757bb-rzb52\" (UID: \"68498f16-b5c3-4960-8565-7ae628fc3122\") " pod="openstack/neutron-9df8757bb-rzb52" Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.516023 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68498f16-b5c3-4960-8565-7ae628fc3122-combined-ca-bundle\") pod \"neutron-9df8757bb-rzb52\" (UID: \"68498f16-b5c3-4960-8565-7ae628fc3122\") " pod="openstack/neutron-9df8757bb-rzb52" Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.520583 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f" Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.526809 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9clks\" (UniqueName: \"kubernetes.io/projected/68498f16-b5c3-4960-8565-7ae628fc3122-kube-api-access-9clks\") pod \"neutron-9df8757bb-rzb52\" (UID: \"68498f16-b5c3-4960-8565-7ae628fc3122\") " pod="openstack/neutron-9df8757bb-rzb52" Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.533146 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/68498f16-b5c3-4960-8565-7ae628fc3122-ovndb-tls-certs\") pod \"neutron-9df8757bb-rzb52\" (UID: \"68498f16-b5c3-4960-8565-7ae628fc3122\") " pod="openstack/neutron-9df8757bb-rzb52" Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.645098 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9df8757bb-rzb52" Mar 11 12:18:48 crc kubenswrapper[4816]: I0311 12:18:48.069407 4816 generic.go:334] "Generic (PLEG): container finished" podID="09ce1ef6-fcd0-4182-afca-22c5892b48e2" containerID="619b2be9e8a3f61b134d163bc3ebb4105259f3d6eadad7ea8f76de2333bbeac4" exitCode=0 Mar 11 12:18:48 crc kubenswrapper[4816]: I0311 12:18:48.070632 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-w8rqc" event={"ID":"09ce1ef6-fcd0-4182-afca-22c5892b48e2","Type":"ContainerDied","Data":"619b2be9e8a3f61b134d163bc3ebb4105259f3d6eadad7ea8f76de2333bbeac4"} Mar 11 12:18:48 crc kubenswrapper[4816]: I0311 12:18:48.081547 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d67d65cb9-2w84f"] Mar 11 12:18:48 crc kubenswrapper[4816]: W0311 12:18:48.422005 4816 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68498f16_b5c3_4960_8565_7ae628fc3122.slice/crio-ca48397e5444728848156fabb2c1b9060ca19d57a1c1905996ce53cd9a54fc09 WatchSource:0}: Error finding container ca48397e5444728848156fabb2c1b9060ca19d57a1c1905996ce53cd9a54fc09: Status 404 returned error can't find the container with id ca48397e5444728848156fabb2c1b9060ca19d57a1c1905996ce53cd9a54fc09 Mar 11 12:18:48 crc kubenswrapper[4816]: I0311 12:18:48.426970 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-9df8757bb-rzb52"] Mar 11 12:18:49 crc kubenswrapper[4816]: I0311 12:18:49.113123 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9df8757bb-rzb52" event={"ID":"68498f16-b5c3-4960-8565-7ae628fc3122","Type":"ContainerStarted","Data":"385c6a6a7483bf3ffb2a31553a973012c1161303ce29917595a5f314788786f7"} Mar 11 12:18:49 crc kubenswrapper[4816]: I0311 12:18:49.113635 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9df8757bb-rzb52" event={"ID":"68498f16-b5c3-4960-8565-7ae628fc3122","Type":"ContainerStarted","Data":"ca48397e5444728848156fabb2c1b9060ca19d57a1c1905996ce53cd9a54fc09"} Mar 11 12:18:49 crc kubenswrapper[4816]: I0311 12:18:49.119464 4816 generic.go:334] "Generic (PLEG): container finished" podID="b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c" containerID="f365943c3bcd25f6e7decbae194b1841bae20d3a5ca1816848e3f40bb39b1c41" exitCode=0 Mar 11 12:18:49 crc kubenswrapper[4816]: I0311 12:18:49.121924 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f" event={"ID":"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c","Type":"ContainerDied","Data":"f365943c3bcd25f6e7decbae194b1841bae20d3a5ca1816848e3f40bb39b1c41"} Mar 11 12:18:49 crc kubenswrapper[4816]: I0311 12:18:49.121976 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f" 
event={"ID":"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c","Type":"ContainerStarted","Data":"88fbfecb363955a5809ac97d8f060b762763c43823be00b756985a39bffcbe7e"} Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.161086 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-64584d7649-mb6k8"] Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.163241 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-64584d7649-mb6k8" Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.170932 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.175329 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.178046 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-64584d7649-mb6k8"] Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.195948 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.196189 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.235122 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.255342 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.282561 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-internal-tls-certs\") pod \"neutron-64584d7649-mb6k8\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " pod="openstack/neutron-64584d7649-mb6k8" Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.282631 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-httpd-config\") pod \"neutron-64584d7649-mb6k8\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " pod="openstack/neutron-64584d7649-mb6k8" Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.282686 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-combined-ca-bundle\") pod \"neutron-64584d7649-mb6k8\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " pod="openstack/neutron-64584d7649-mb6k8" Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.282738 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-config\") pod \"neutron-64584d7649-mb6k8\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " pod="openstack/neutron-64584d7649-mb6k8" Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.282764 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-ovndb-tls-certs\") pod \"neutron-64584d7649-mb6k8\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " pod="openstack/neutron-64584d7649-mb6k8" Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.282804 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6v7x\" (UniqueName: 
\"kubernetes.io/projected/bd930e1b-a508-4a64-8825-9800b8010d59-kube-api-access-w6v7x\") pod \"neutron-64584d7649-mb6k8\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " pod="openstack/neutron-64584d7649-mb6k8" Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.282838 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-public-tls-certs\") pod \"neutron-64584d7649-mb6k8\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " pod="openstack/neutron-64584d7649-mb6k8" Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.384866 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-public-tls-certs\") pod \"neutron-64584d7649-mb6k8\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " pod="openstack/neutron-64584d7649-mb6k8" Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.384987 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-internal-tls-certs\") pod \"neutron-64584d7649-mb6k8\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " pod="openstack/neutron-64584d7649-mb6k8" Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.385030 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-httpd-config\") pod \"neutron-64584d7649-mb6k8\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " pod="openstack/neutron-64584d7649-mb6k8" Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.385111 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-combined-ca-bundle\") pod \"neutron-64584d7649-mb6k8\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " pod="openstack/neutron-64584d7649-mb6k8" Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.385206 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-config\") pod \"neutron-64584d7649-mb6k8\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " pod="openstack/neutron-64584d7649-mb6k8" Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.385238 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-ovndb-tls-certs\") pod \"neutron-64584d7649-mb6k8\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " pod="openstack/neutron-64584d7649-mb6k8" Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.385311 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6v7x\" (UniqueName: \"kubernetes.io/projected/bd930e1b-a508-4a64-8825-9800b8010d59-kube-api-access-w6v7x\") pod \"neutron-64584d7649-mb6k8\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " pod="openstack/neutron-64584d7649-mb6k8" Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.393436 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-public-tls-certs\") pod \"neutron-64584d7649-mb6k8\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " pod="openstack/neutron-64584d7649-mb6k8" Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.394212 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-combined-ca-bundle\") pod 
\"neutron-64584d7649-mb6k8\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " pod="openstack/neutron-64584d7649-mb6k8" Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.394450 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-httpd-config\") pod \"neutron-64584d7649-mb6k8\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " pod="openstack/neutron-64584d7649-mb6k8" Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.395064 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-ovndb-tls-certs\") pod \"neutron-64584d7649-mb6k8\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " pod="openstack/neutron-64584d7649-mb6k8" Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.396054 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-config\") pod \"neutron-64584d7649-mb6k8\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " pod="openstack/neutron-64584d7649-mb6k8" Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.405967 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-internal-tls-certs\") pod \"neutron-64584d7649-mb6k8\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " pod="openstack/neutron-64584d7649-mb6k8" Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.409177 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6v7x\" (UniqueName: \"kubernetes.io/projected/bd930e1b-a508-4a64-8825-9800b8010d59-kube-api-access-w6v7x\") pod \"neutron-64584d7649-mb6k8\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " pod="openstack/neutron-64584d7649-mb6k8" Mar 11 12:18:50 crc 
kubenswrapper[4816]: I0311 12:18:50.488898 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-64584d7649-mb6k8" Mar 11 12:18:51 crc kubenswrapper[4816]: I0311 12:18:51.156425 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 11 12:18:51 crc kubenswrapper[4816]: I0311 12:18:51.157094 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.137651 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-w8rqc" Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.259228 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-combined-ca-bundle\") pod \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\" (UID: \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\") " Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.259718 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-fernet-keys\") pod \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\" (UID: \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\") " Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.259777 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-credential-keys\") pod \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\" (UID: \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\") " Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.259802 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-scripts\") pod \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\" (UID: \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\") " Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.259962 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdq6q\" (UniqueName: \"kubernetes.io/projected/09ce1ef6-fcd0-4182-afca-22c5892b48e2-kube-api-access-bdq6q\") pod \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\" (UID: \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\") " Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.260029 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-config-data\") pod \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\" (UID: \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\") " Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.265003 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-w8rqc" event={"ID":"09ce1ef6-fcd0-4182-afca-22c5892b48e2","Type":"ContainerDied","Data":"fb98e4336d682d875b7e94b2c10df2c35624f0d26db0436c3d5f3cac73ca09c5"} Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.265069 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb98e4336d682d875b7e94b2c10df2c35624f0d26db0436c3d5f3cac73ca09c5" Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.265176 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-w8rqc" Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.278228 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-scripts" (OuterVolumeSpecName: "scripts") pod "09ce1ef6-fcd0-4182-afca-22c5892b48e2" (UID: "09ce1ef6-fcd0-4182-afca-22c5892b48e2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.281533 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "09ce1ef6-fcd0-4182-afca-22c5892b48e2" (UID: "09ce1ef6-fcd0-4182-afca-22c5892b48e2"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.282568 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ce1ef6-fcd0-4182-afca-22c5892b48e2-kube-api-access-bdq6q" (OuterVolumeSpecName: "kube-api-access-bdq6q") pod "09ce1ef6-fcd0-4182-afca-22c5892b48e2" (UID: "09ce1ef6-fcd0-4182-afca-22c5892b48e2"). InnerVolumeSpecName "kube-api-access-bdq6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.294111 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "09ce1ef6-fcd0-4182-afca-22c5892b48e2" (UID: "09ce1ef6-fcd0-4182-afca-22c5892b48e2"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.358421 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-config-data" (OuterVolumeSpecName: "config-data") pod "09ce1ef6-fcd0-4182-afca-22c5892b48e2" (UID: "09ce1ef6-fcd0-4182-afca-22c5892b48e2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.363112 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdq6q\" (UniqueName: \"kubernetes.io/projected/09ce1ef6-fcd0-4182-afca-22c5892b48e2-kube-api-access-bdq6q\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.363149 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.363166 4816 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.363178 4816 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.363191 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.370335 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "09ce1ef6-fcd0-4182-afca-22c5892b48e2" (UID: "09ce1ef6-fcd0-4182-afca-22c5892b48e2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.465552 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.614704 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.614799 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.649385 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.671203 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.702192 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.702353 4816 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.716449 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 11 12:18:53 crc kubenswrapper[4816]: W0311 12:18:53.759935 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd930e1b_a508_4a64_8825_9800b8010d59.slice/crio-0829aed87a841d8c87b4f741cc407293d8e591d9e8b4c02e21e8a61c30445d1f WatchSource:0}: Error finding container 0829aed87a841d8c87b4f741cc407293d8e591d9e8b4c02e21e8a61c30445d1f: Status 404 
returned error can't find the container with id 0829aed87a841d8c87b4f741cc407293d8e591d9e8b4c02e21e8a61c30445d1f Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.782473 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-64584d7649-mb6k8"] Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.283733 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-64584d7649-mb6k8" event={"ID":"bd930e1b-a508-4a64-8825-9800b8010d59","Type":"ContainerStarted","Data":"06ebd4a2da9305c5f9303396efc2a80f0ef4ae2462b9e8b47545883c85f3c658"} Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.284172 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-64584d7649-mb6k8" event={"ID":"bd930e1b-a508-4a64-8825-9800b8010d59","Type":"ContainerStarted","Data":"0829aed87a841d8c87b4f741cc407293d8e591d9e8b4c02e21e8a61c30445d1f"} Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.310764 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9df8757bb-rzb52" event={"ID":"68498f16-b5c3-4960-8565-7ae628fc3122","Type":"ContainerStarted","Data":"3a4b8f5199cb2db96176f7d26ac1288036fcf9dd3deb012c7c6cb2bd6febc6c2"} Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.311162 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-9df8757bb-rzb52" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.313668 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5d6ddcd789-qjf9c"] Mar 11 12:18:54 crc kubenswrapper[4816]: E0311 12:18:54.316036 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09ce1ef6-fcd0-4182-afca-22c5892b48e2" containerName="keystone-bootstrap" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.316069 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="09ce1ef6-fcd0-4182-afca-22c5892b48e2" containerName="keystone-bootstrap" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 
12:18:54.316318 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="09ce1ef6-fcd0-4182-afca-22c5892b48e2" containerName="keystone-bootstrap" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.317269 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.323922 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f" event={"ID":"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c","Type":"ContainerStarted","Data":"ec1bad5250db6f7f6e52f756f22dc65681cf048e7705d228c0b2aebf5f68f5e9"} Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.324155 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.324375 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.324496 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.324512 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.324381 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.324772 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.325034 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-8k5jj" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.348056 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5d6ddcd789-qjf9c"] Mar 11 
12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.408563 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-4b4ms" event={"ID":"f92c8acc-1a4a-4f28-a123-2f5b8b6905af","Type":"ContainerStarted","Data":"1f2178fe24813df8bfcc542c32d18ec7c0d7ab550dc406623e692f0465cd6535"} Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.412058 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-scripts\") pod \"keystone-5d6ddcd789-qjf9c\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.412161 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-internal-tls-certs\") pod \"keystone-5d6ddcd789-qjf9c\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.412223 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-public-tls-certs\") pod \"keystone-5d6ddcd789-qjf9c\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.412357 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-combined-ca-bundle\") pod \"keystone-5d6ddcd789-qjf9c\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.412625 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zc8j\" (UniqueName: \"kubernetes.io/projected/9c180505-72c6-498d-bfa5-05f689692bd2-kube-api-access-6zc8j\") pod \"keystone-5d6ddcd789-qjf9c\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.412785 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-config-data\") pod \"keystone-5d6ddcd789-qjf9c\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.413175 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-fernet-keys\") pod \"keystone-5d6ddcd789-qjf9c\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.413211 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-credential-keys\") pod \"keystone-5d6ddcd789-qjf9c\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.434745 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ebe3f2a-5719-412c-8803-15e1bec74523","Type":"ContainerStarted","Data":"fd9f9269933cdf626acdae81b166112cef4742071274667ac737f1fb43d6eaa0"} Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.437064 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 11 
12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.437106 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.478609 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-9df8757bb-rzb52" podStartSLOduration=7.478574828 podStartE2EDuration="7.478574828s" podCreationTimestamp="2026-03-11 12:18:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:18:54.338752152 +0000 UTC m=+1220.930016119" watchObservedRunningTime="2026-03-11 12:18:54.478574828 +0000 UTC m=+1221.069838795" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.518617 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-credential-keys\") pod \"keystone-5d6ddcd789-qjf9c\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.518828 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-scripts\") pod \"keystone-5d6ddcd789-qjf9c\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.518880 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-internal-tls-certs\") pod \"keystone-5d6ddcd789-qjf9c\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.518908 4816 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-public-tls-certs\") pod \"keystone-5d6ddcd789-qjf9c\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.518930 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-combined-ca-bundle\") pod \"keystone-5d6ddcd789-qjf9c\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.518961 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zc8j\" (UniqueName: \"kubernetes.io/projected/9c180505-72c6-498d-bfa5-05f689692bd2-kube-api-access-6zc8j\") pod \"keystone-5d6ddcd789-qjf9c\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.519023 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-config-data\") pod \"keystone-5d6ddcd789-qjf9c\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.519139 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-fernet-keys\") pod \"keystone-5d6ddcd789-qjf9c\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.535210 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-credential-keys\") pod \"keystone-5d6ddcd789-qjf9c\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.542902 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-combined-ca-bundle\") pod \"keystone-5d6ddcd789-qjf9c\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.543222 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-internal-tls-certs\") pod \"keystone-5d6ddcd789-qjf9c\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.544917 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-config-data\") pod \"keystone-5d6ddcd789-qjf9c\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.552817 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-public-tls-certs\") pod \"keystone-5d6ddcd789-qjf9c\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.566290 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-scripts\") pod \"keystone-5d6ddcd789-qjf9c\" (UID: 
\"9c180505-72c6-498d-bfa5-05f689692bd2\") " pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.577395 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f" podStartSLOduration=7.5773635729999995 podStartE2EDuration="7.577363573s" podCreationTimestamp="2026-03-11 12:18:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:18:54.477871548 +0000 UTC m=+1221.069135505" watchObservedRunningTime="2026-03-11 12:18:54.577363573 +0000 UTC m=+1221.168627540" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.592974 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zc8j\" (UniqueName: \"kubernetes.io/projected/9c180505-72c6-498d-bfa5-05f689692bd2-kube-api-access-6zc8j\") pod \"keystone-5d6ddcd789-qjf9c\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.594388 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-fernet-keys\") pod \"keystone-5d6ddcd789-qjf9c\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.643916 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-4b4ms" podStartSLOduration=2.4001057230000002 podStartE2EDuration="41.643891649s" podCreationTimestamp="2026-03-11 12:18:13 +0000 UTC" firstStartedPulling="2026-03-11 12:18:14.468391173 +0000 UTC m=+1181.059655140" lastFinishedPulling="2026-03-11 12:18:53.712177089 +0000 UTC m=+1220.303441066" observedRunningTime="2026-03-11 12:18:54.57169064 +0000 UTC m=+1221.162954607" watchObservedRunningTime="2026-03-11 
12:18:54.643891649 +0000 UTC m=+1221.235155616" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.751695 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:55 crc kubenswrapper[4816]: I0311 12:18:55.402566 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5d6ddcd789-qjf9c"] Mar 11 12:18:55 crc kubenswrapper[4816]: I0311 12:18:55.461371 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-64584d7649-mb6k8" event={"ID":"bd930e1b-a508-4a64-8825-9800b8010d59","Type":"ContainerStarted","Data":"ed06a5d04ea24da7b7022266f3b93adfbbc7a80293e5752545ee9f6add12458d"} Mar 11 12:18:55 crc kubenswrapper[4816]: I0311 12:18:55.461770 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-64584d7649-mb6k8" Mar 11 12:18:55 crc kubenswrapper[4816]: I0311 12:18:55.465092 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5d6ddcd789-qjf9c" event={"ID":"9c180505-72c6-498d-bfa5-05f689692bd2","Type":"ContainerStarted","Data":"210d5da4467eeb407cc3db147ba87bbb3dfcf68d3ca56b768383a1d9ec2cdc8a"} Mar 11 12:18:55 crc kubenswrapper[4816]: I0311 12:18:55.475534 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fjmnw" event={"ID":"2772ef82-fe14-4f4d-8349-8ee515e39979","Type":"ContainerStarted","Data":"c3956854978860cbc650270e665106bd8e95400d5b8cce00a86ed500eb262922"} Mar 11 12:18:55 crc kubenswrapper[4816]: I0311 12:18:55.501712 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-64584d7649-mb6k8" podStartSLOduration=5.501690482 podStartE2EDuration="5.501690482s" podCreationTimestamp="2026-03-11 12:18:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:18:55.492553319 +0000 UTC m=+1222.083817286" 
watchObservedRunningTime="2026-03-11 12:18:55.501690482 +0000 UTC m=+1222.092954449" Mar 11 12:18:55 crc kubenswrapper[4816]: I0311 12:18:55.552470 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-fjmnw" podStartSLOduration=4.303717687 podStartE2EDuration="43.552431964s" podCreationTimestamp="2026-03-11 12:18:12 +0000 UTC" firstStartedPulling="2026-03-11 12:18:14.461362031 +0000 UTC m=+1181.052625998" lastFinishedPulling="2026-03-11 12:18:53.710076308 +0000 UTC m=+1220.301340275" observedRunningTime="2026-03-11 12:18:55.513380349 +0000 UTC m=+1222.104644326" watchObservedRunningTime="2026-03-11 12:18:55.552431964 +0000 UTC m=+1222.143695931" Mar 11 12:18:56 crc kubenswrapper[4816]: I0311 12:18:56.488811 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5d6ddcd789-qjf9c" event={"ID":"9c180505-72c6-498d-bfa5-05f689692bd2","Type":"ContainerStarted","Data":"40d439392c989a322c37ef2903e2b84825cbedf2d8b6499b35bfc3bb665a65b8"} Mar 11 12:18:56 crc kubenswrapper[4816]: I0311 12:18:56.489316 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:56 crc kubenswrapper[4816]: I0311 12:18:56.488846 4816 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 11 12:18:56 crc kubenswrapper[4816]: I0311 12:18:56.489356 4816 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 11 12:18:57 crc kubenswrapper[4816]: I0311 12:18:57.397981 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 11 12:18:57 crc kubenswrapper[4816]: I0311 12:18:57.402272 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 11 12:18:57 crc kubenswrapper[4816]: I0311 12:18:57.421195 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/keystone-5d6ddcd789-qjf9c" podStartSLOduration=3.42117205 podStartE2EDuration="3.42117205s" podCreationTimestamp="2026-03-11 12:18:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:18:56.515419336 +0000 UTC m=+1223.106683303" watchObservedRunningTime="2026-03-11 12:18:57.42117205 +0000 UTC m=+1224.012436017" Mar 11 12:18:57 crc kubenswrapper[4816]: I0311 12:18:57.508644 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-rjxsf" event={"ID":"c643aa04-ce8d-4c3b-befc-ecdf63e35de8","Type":"ContainerStarted","Data":"c704df83b8c052d797cc33017726ade79e749840ea39268bdd3404b42194d40d"} Mar 11 12:18:57 crc kubenswrapper[4816]: I0311 12:18:57.523394 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-rjxsf" podStartSLOduration=3.688778427 podStartE2EDuration="45.523370793s" podCreationTimestamp="2026-03-11 12:18:12 +0000 UTC" firstStartedPulling="2026-03-11 12:18:14.892076393 +0000 UTC m=+1181.483340360" lastFinishedPulling="2026-03-11 12:18:56.726668759 +0000 UTC m=+1223.317932726" observedRunningTime="2026-03-11 12:18:57.521733526 +0000 UTC m=+1224.112997493" watchObservedRunningTime="2026-03-11 12:18:57.523370793 +0000 UTC m=+1224.114634760" Mar 11 12:18:58 crc kubenswrapper[4816]: I0311 12:18:58.524349 4816 generic.go:334] "Generic (PLEG): container finished" podID="f92c8acc-1a4a-4f28-a123-2f5b8b6905af" containerID="1f2178fe24813df8bfcc542c32d18ec7c0d7ab550dc406623e692f0465cd6535" exitCode=0 Mar 11 12:18:58 crc kubenswrapper[4816]: I0311 12:18:58.524458 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-4b4ms" event={"ID":"f92c8acc-1a4a-4f28-a123-2f5b8b6905af","Type":"ContainerDied","Data":"1f2178fe24813df8bfcc542c32d18ec7c0d7ab550dc406623e692f0465cd6535"} Mar 11 12:19:01 crc kubenswrapper[4816]: I0311 12:19:01.554968 4816 
generic.go:334] "Generic (PLEG): container finished" podID="c643aa04-ce8d-4c3b-befc-ecdf63e35de8" containerID="c704df83b8c052d797cc33017726ade79e749840ea39268bdd3404b42194d40d" exitCode=0 Mar 11 12:19:01 crc kubenswrapper[4816]: I0311 12:19:01.555094 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-rjxsf" event={"ID":"c643aa04-ce8d-4c3b-befc-ecdf63e35de8","Type":"ContainerDied","Data":"c704df83b8c052d797cc33017726ade79e749840ea39268bdd3404b42194d40d"} Mar 11 12:19:02 crc kubenswrapper[4816]: I0311 12:19:02.523428 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f" Mar 11 12:19:02 crc kubenswrapper[4816]: I0311 12:19:02.601654 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-759cc7f497-2nfvt"] Mar 11 12:19:02 crc kubenswrapper[4816]: I0311 12:19:02.602376 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" podUID="ad047cd1-309a-401e-9fc6-cb1349614136" containerName="dnsmasq-dns" containerID="cri-o://ccafb95fbf3f12326123ae581a70f3b9eefd2d320c697240864a31290ea2a66c" gracePeriod=10 Mar 11 12:19:02 crc kubenswrapper[4816]: I0311 12:19:02.605558 4816 generic.go:334] "Generic (PLEG): container finished" podID="2772ef82-fe14-4f4d-8349-8ee515e39979" containerID="c3956854978860cbc650270e665106bd8e95400d5b8cce00a86ed500eb262922" exitCode=0 Mar 11 12:19:02 crc kubenswrapper[4816]: I0311 12:19:02.605743 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fjmnw" event={"ID":"2772ef82-fe14-4f4d-8349-8ee515e39979","Type":"ContainerDied","Data":"c3956854978860cbc650270e665106bd8e95400d5b8cce00a86ed500eb262922"} Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.062344 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-rjxsf" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.069326 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-4b4ms" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.248011 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c643aa04-ce8d-4c3b-befc-ecdf63e35de8-db-sync-config-data\") pod \"c643aa04-ce8d-4c3b-befc-ecdf63e35de8\" (UID: \"c643aa04-ce8d-4c3b-befc-ecdf63e35de8\") " Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.248116 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-logs\") pod \"f92c8acc-1a4a-4f28-a123-2f5b8b6905af\" (UID: \"f92c8acc-1a4a-4f28-a123-2f5b8b6905af\") " Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.248230 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c643aa04-ce8d-4c3b-befc-ecdf63e35de8-combined-ca-bundle\") pod \"c643aa04-ce8d-4c3b-befc-ecdf63e35de8\" (UID: \"c643aa04-ce8d-4c3b-befc-ecdf63e35de8\") " Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.248276 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-combined-ca-bundle\") pod \"f92c8acc-1a4a-4f28-a123-2f5b8b6905af\" (UID: \"f92c8acc-1a4a-4f28-a123-2f5b8b6905af\") " Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.248362 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zq67\" (UniqueName: \"kubernetes.io/projected/c643aa04-ce8d-4c3b-befc-ecdf63e35de8-kube-api-access-5zq67\") pod \"c643aa04-ce8d-4c3b-befc-ecdf63e35de8\" (UID: 
\"c643aa04-ce8d-4c3b-befc-ecdf63e35de8\") " Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.248410 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-config-data\") pod \"f92c8acc-1a4a-4f28-a123-2f5b8b6905af\" (UID: \"f92c8acc-1a4a-4f28-a123-2f5b8b6905af\") " Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.248431 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-scripts\") pod \"f92c8acc-1a4a-4f28-a123-2f5b8b6905af\" (UID: \"f92c8acc-1a4a-4f28-a123-2f5b8b6905af\") " Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.248474 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6lqc\" (UniqueName: \"kubernetes.io/projected/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-kube-api-access-m6lqc\") pod \"f92c8acc-1a4a-4f28-a123-2f5b8b6905af\" (UID: \"f92c8acc-1a4a-4f28-a123-2f5b8b6905af\") " Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.262950 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c643aa04-ce8d-4c3b-befc-ecdf63e35de8-kube-api-access-5zq67" (OuterVolumeSpecName: "kube-api-access-5zq67") pod "c643aa04-ce8d-4c3b-befc-ecdf63e35de8" (UID: "c643aa04-ce8d-4c3b-befc-ecdf63e35de8"). InnerVolumeSpecName "kube-api-access-5zq67". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.263073 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c643aa04-ce8d-4c3b-befc-ecdf63e35de8-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "c643aa04-ce8d-4c3b-befc-ecdf63e35de8" (UID: "c643aa04-ce8d-4c3b-befc-ecdf63e35de8"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.264445 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-kube-api-access-m6lqc" (OuterVolumeSpecName: "kube-api-access-m6lqc") pod "f92c8acc-1a4a-4f28-a123-2f5b8b6905af" (UID: "f92c8acc-1a4a-4f28-a123-2f5b8b6905af"). InnerVolumeSpecName "kube-api-access-m6lqc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.264700 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-logs" (OuterVolumeSpecName: "logs") pod "f92c8acc-1a4a-4f28-a123-2f5b8b6905af" (UID: "f92c8acc-1a4a-4f28-a123-2f5b8b6905af"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.288121 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-scripts" (OuterVolumeSpecName: "scripts") pod "f92c8acc-1a4a-4f28-a123-2f5b8b6905af" (UID: "f92c8acc-1a4a-4f28-a123-2f5b8b6905af"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.350556 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zq67\" (UniqueName: \"kubernetes.io/projected/c643aa04-ce8d-4c3b-befc-ecdf63e35de8-kube-api-access-5zq67\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.350616 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.350630 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6lqc\" (UniqueName: \"kubernetes.io/projected/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-kube-api-access-m6lqc\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.350642 4816 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c643aa04-ce8d-4c3b-befc-ecdf63e35de8-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.350654 4816 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-logs\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.374422 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-config-data" (OuterVolumeSpecName: "config-data") pod "f92c8acc-1a4a-4f28-a123-2f5b8b6905af" (UID: "f92c8acc-1a4a-4f28-a123-2f5b8b6905af"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.389531 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c643aa04-ce8d-4c3b-befc-ecdf63e35de8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c643aa04-ce8d-4c3b-befc-ecdf63e35de8" (UID: "c643aa04-ce8d-4c3b-befc-ecdf63e35de8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.404513 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f92c8acc-1a4a-4f28-a123-2f5b8b6905af" (UID: "f92c8acc-1a4a-4f28-a123-2f5b8b6905af"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.454458 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c643aa04-ce8d-4c3b-befc-ecdf63e35de8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.454498 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.454509 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.621703 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-rjxsf" 
event={"ID":"c643aa04-ce8d-4c3b-befc-ecdf63e35de8","Type":"ContainerDied","Data":"1e6a18d4f0b251cb2f7727ad5be471c642eca99dd05bbcec781288abe852fcc2"} Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.621749 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e6a18d4f0b251cb2f7727ad5be471c642eca99dd05bbcec781288abe852fcc2" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.621823 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-rjxsf" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.626632 4816 generic.go:334] "Generic (PLEG): container finished" podID="ad047cd1-309a-401e-9fc6-cb1349614136" containerID="ccafb95fbf3f12326123ae581a70f3b9eefd2d320c697240864a31290ea2a66c" exitCode=0 Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.626732 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" event={"ID":"ad047cd1-309a-401e-9fc6-cb1349614136","Type":"ContainerDied","Data":"ccafb95fbf3f12326123ae581a70f3b9eefd2d320c697240864a31290ea2a66c"} Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.628753 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-4b4ms" event={"ID":"f92c8acc-1a4a-4f28-a123-2f5b8b6905af","Type":"ContainerDied","Data":"31e496272578b057f389702c22da6db4b04713d9b39444d9f2071398a63be537"} Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.628795 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-4b4ms" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.628798 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31e496272578b057f389702c22da6db4b04713d9b39444d9f2071398a63be537" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.801163 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-855897fd55-t7sfb"] Mar 11 12:19:03 crc kubenswrapper[4816]: E0311 12:19:03.802072 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f92c8acc-1a4a-4f28-a123-2f5b8b6905af" containerName="placement-db-sync" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.802089 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f92c8acc-1a4a-4f28-a123-2f5b8b6905af" containerName="placement-db-sync" Mar 11 12:19:03 crc kubenswrapper[4816]: E0311 12:19:03.802121 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c643aa04-ce8d-4c3b-befc-ecdf63e35de8" containerName="barbican-db-sync" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.802127 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="c643aa04-ce8d-4c3b-befc-ecdf63e35de8" containerName="barbican-db-sync" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.802333 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="c643aa04-ce8d-4c3b-befc-ecdf63e35de8" containerName="barbican-db-sync" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.802351 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f92c8acc-1a4a-4f28-a123-2f5b8b6905af" containerName="placement-db-sync" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.803382 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-855897fd55-t7sfb" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.806857 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.807216 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.808038 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-fxmtd" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.811899 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-59b4f4d478-5b797"] Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.813677 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-59b4f4d478-5b797" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.821605 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.833761 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-855897fd55-t7sfb"] Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.865524 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-logs\") pod \"barbican-keystone-listener-59b4f4d478-5b797\" (UID: \"ddd535a1-7585-4cb7-94ec-f4b98b10be4a\") " pod="openstack/barbican-keystone-listener-59b4f4d478-5b797" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.865609 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl4t9\" (UniqueName: 
\"kubernetes.io/projected/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-kube-api-access-tl4t9\") pod \"barbican-keystone-listener-59b4f4d478-5b797\" (UID: \"ddd535a1-7585-4cb7-94ec-f4b98b10be4a\") " pod="openstack/barbican-keystone-listener-59b4f4d478-5b797" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.865649 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-combined-ca-bundle\") pod \"barbican-keystone-listener-59b4f4d478-5b797\" (UID: \"ddd535a1-7585-4cb7-94ec-f4b98b10be4a\") " pod="openstack/barbican-keystone-listener-59b4f4d478-5b797" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.865689 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-config-data\") pod \"barbican-keystone-listener-59b4f4d478-5b797\" (UID: \"ddd535a1-7585-4cb7-94ec-f4b98b10be4a\") " pod="openstack/barbican-keystone-listener-59b4f4d478-5b797" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.865812 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-config-data-custom\") pod \"barbican-keystone-listener-59b4f4d478-5b797\" (UID: \"ddd535a1-7585-4cb7-94ec-f4b98b10be4a\") " pod="openstack/barbican-keystone-listener-59b4f4d478-5b797" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.873944 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-59b4f4d478-5b797"] Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.963053 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fc46d7df7-2frp2"] Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.965060 4816 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fc46d7df7-2frp2" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.972765 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-logs\") pod \"barbican-keystone-listener-59b4f4d478-5b797\" (UID: \"ddd535a1-7585-4cb7-94ec-f4b98b10be4a\") " pod="openstack/barbican-keystone-listener-59b4f4d478-5b797" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.972828 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b79e89c6-5f56-4439-ad63-a86259d4ed29-logs\") pod \"barbican-worker-855897fd55-t7sfb\" (UID: \"b79e89c6-5f56-4439-ad63-a86259d4ed29\") " pod="openstack/barbican-worker-855897fd55-t7sfb" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.972865 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl4t9\" (UniqueName: \"kubernetes.io/projected/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-kube-api-access-tl4t9\") pod \"barbican-keystone-listener-59b4f4d478-5b797\" (UID: \"ddd535a1-7585-4cb7-94ec-f4b98b10be4a\") " pod="openstack/barbican-keystone-listener-59b4f4d478-5b797" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.972900 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-combined-ca-bundle\") pod \"barbican-keystone-listener-59b4f4d478-5b797\" (UID: \"ddd535a1-7585-4cb7-94ec-f4b98b10be4a\") " pod="openstack/barbican-keystone-listener-59b4f4d478-5b797" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.972922 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-config-data\") pod 
\"barbican-keystone-listener-59b4f4d478-5b797\" (UID: \"ddd535a1-7585-4cb7-94ec-f4b98b10be4a\") " pod="openstack/barbican-keystone-listener-59b4f4d478-5b797" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.972941 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-ovsdbserver-nb\") pod \"dnsmasq-dns-7fc46d7df7-2frp2\" (UID: \"3bd40d51-3ead-4137-9b14-2a93f44f4166\") " pod="openstack/dnsmasq-dns-7fc46d7df7-2frp2" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.972968 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-dns-swift-storage-0\") pod \"dnsmasq-dns-7fc46d7df7-2frp2\" (UID: \"3bd40d51-3ead-4137-9b14-2a93f44f4166\") " pod="openstack/dnsmasq-dns-7fc46d7df7-2frp2" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.973004 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-dns-svc\") pod \"dnsmasq-dns-7fc46d7df7-2frp2\" (UID: \"3bd40d51-3ead-4137-9b14-2a93f44f4166\") " pod="openstack/dnsmasq-dns-7fc46d7df7-2frp2" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.973041 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-ovsdbserver-sb\") pod \"dnsmasq-dns-7fc46d7df7-2frp2\" (UID: \"3bd40d51-3ead-4137-9b14-2a93f44f4166\") " pod="openstack/dnsmasq-dns-7fc46d7df7-2frp2" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.973057 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-config\") pod \"dnsmasq-dns-7fc46d7df7-2frp2\" (UID: \"3bd40d51-3ead-4137-9b14-2a93f44f4166\") " pod="openstack/dnsmasq-dns-7fc46d7df7-2frp2" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.973089 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b79e89c6-5f56-4439-ad63-a86259d4ed29-config-data-custom\") pod \"barbican-worker-855897fd55-t7sfb\" (UID: \"b79e89c6-5f56-4439-ad63-a86259d4ed29\") " pod="openstack/barbican-worker-855897fd55-t7sfb" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.973112 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltmfx\" (UniqueName: \"kubernetes.io/projected/3bd40d51-3ead-4137-9b14-2a93f44f4166-kube-api-access-ltmfx\") pod \"dnsmasq-dns-7fc46d7df7-2frp2\" (UID: \"3bd40d51-3ead-4137-9b14-2a93f44f4166\") " pod="openstack/dnsmasq-dns-7fc46d7df7-2frp2" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.973134 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-config-data-custom\") pod \"barbican-keystone-listener-59b4f4d478-5b797\" (UID: \"ddd535a1-7585-4cb7-94ec-f4b98b10be4a\") " pod="openstack/barbican-keystone-listener-59b4f4d478-5b797" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.973154 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b79e89c6-5f56-4439-ad63-a86259d4ed29-config-data\") pod \"barbican-worker-855897fd55-t7sfb\" (UID: \"b79e89c6-5f56-4439-ad63-a86259d4ed29\") " pod="openstack/barbican-worker-855897fd55-t7sfb" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.973182 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b79e89c6-5f56-4439-ad63-a86259d4ed29-combined-ca-bundle\") pod \"barbican-worker-855897fd55-t7sfb\" (UID: \"b79e89c6-5f56-4439-ad63-a86259d4ed29\") " pod="openstack/barbican-worker-855897fd55-t7sfb" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.973204 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4nrz\" (UniqueName: \"kubernetes.io/projected/b79e89c6-5f56-4439-ad63-a86259d4ed29-kube-api-access-v4nrz\") pod \"barbican-worker-855897fd55-t7sfb\" (UID: \"b79e89c6-5f56-4439-ad63-a86259d4ed29\") " pod="openstack/barbican-worker-855897fd55-t7sfb" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.973532 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-logs\") pod \"barbican-keystone-listener-59b4f4d478-5b797\" (UID: \"ddd535a1-7585-4cb7-94ec-f4b98b10be4a\") " pod="openstack/barbican-keystone-listener-59b4f4d478-5b797" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.988590 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-config-data\") pod \"barbican-keystone-listener-59b4f4d478-5b797\" (UID: \"ddd535a1-7585-4cb7-94ec-f4b98b10be4a\") " pod="openstack/barbican-keystone-listener-59b4f4d478-5b797" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.989147 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-combined-ca-bundle\") pod \"barbican-keystone-listener-59b4f4d478-5b797\" (UID: \"ddd535a1-7585-4cb7-94ec-f4b98b10be4a\") " pod="openstack/barbican-keystone-listener-59b4f4d478-5b797" Mar 11 12:19:03 crc 
kubenswrapper[4816]: I0311 12:19:03.993576 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl4t9\" (UniqueName: \"kubernetes.io/projected/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-kube-api-access-tl4t9\") pod \"barbican-keystone-listener-59b4f4d478-5b797\" (UID: \"ddd535a1-7585-4cb7-94ec-f4b98b10be4a\") " pod="openstack/barbican-keystone-listener-59b4f4d478-5b797" Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.997572 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fc46d7df7-2frp2"] Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:03.998543 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-config-data-custom\") pod \"barbican-keystone-listener-59b4f4d478-5b797\" (UID: \"ddd535a1-7585-4cb7-94ec-f4b98b10be4a\") " pod="openstack/barbican-keystone-listener-59b4f4d478-5b797" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.077650 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b79e89c6-5f56-4439-ad63-a86259d4ed29-logs\") pod \"barbican-worker-855897fd55-t7sfb\" (UID: \"b79e89c6-5f56-4439-ad63-a86259d4ed29\") " pod="openstack/barbican-worker-855897fd55-t7sfb" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.077747 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-ovsdbserver-nb\") pod \"dnsmasq-dns-7fc46d7df7-2frp2\" (UID: \"3bd40d51-3ead-4137-9b14-2a93f44f4166\") " pod="openstack/dnsmasq-dns-7fc46d7df7-2frp2" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.077786 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-dns-swift-storage-0\") pod \"dnsmasq-dns-7fc46d7df7-2frp2\" (UID: \"3bd40d51-3ead-4137-9b14-2a93f44f4166\") " pod="openstack/dnsmasq-dns-7fc46d7df7-2frp2" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.077832 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-dns-svc\") pod \"dnsmasq-dns-7fc46d7df7-2frp2\" (UID: \"3bd40d51-3ead-4137-9b14-2a93f44f4166\") " pod="openstack/dnsmasq-dns-7fc46d7df7-2frp2" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.077883 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-ovsdbserver-sb\") pod \"dnsmasq-dns-7fc46d7df7-2frp2\" (UID: \"3bd40d51-3ead-4137-9b14-2a93f44f4166\") " pod="openstack/dnsmasq-dns-7fc46d7df7-2frp2" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.077905 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-config\") pod \"dnsmasq-dns-7fc46d7df7-2frp2\" (UID: \"3bd40d51-3ead-4137-9b14-2a93f44f4166\") " pod="openstack/dnsmasq-dns-7fc46d7df7-2frp2" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.077943 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b79e89c6-5f56-4439-ad63-a86259d4ed29-config-data-custom\") pod \"barbican-worker-855897fd55-t7sfb\" (UID: \"b79e89c6-5f56-4439-ad63-a86259d4ed29\") " pod="openstack/barbican-worker-855897fd55-t7sfb" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.077972 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltmfx\" (UniqueName: 
\"kubernetes.io/projected/3bd40d51-3ead-4137-9b14-2a93f44f4166-kube-api-access-ltmfx\") pod \"dnsmasq-dns-7fc46d7df7-2frp2\" (UID: \"3bd40d51-3ead-4137-9b14-2a93f44f4166\") " pod="openstack/dnsmasq-dns-7fc46d7df7-2frp2" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.077998 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b79e89c6-5f56-4439-ad63-a86259d4ed29-config-data\") pod \"barbican-worker-855897fd55-t7sfb\" (UID: \"b79e89c6-5f56-4439-ad63-a86259d4ed29\") " pod="openstack/barbican-worker-855897fd55-t7sfb" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.078036 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b79e89c6-5f56-4439-ad63-a86259d4ed29-combined-ca-bundle\") pod \"barbican-worker-855897fd55-t7sfb\" (UID: \"b79e89c6-5f56-4439-ad63-a86259d4ed29\") " pod="openstack/barbican-worker-855897fd55-t7sfb" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.078064 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4nrz\" (UniqueName: \"kubernetes.io/projected/b79e89c6-5f56-4439-ad63-a86259d4ed29-kube-api-access-v4nrz\") pod \"barbican-worker-855897fd55-t7sfb\" (UID: \"b79e89c6-5f56-4439-ad63-a86259d4ed29\") " pod="openstack/barbican-worker-855897fd55-t7sfb" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.079150 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b79e89c6-5f56-4439-ad63-a86259d4ed29-logs\") pod \"barbican-worker-855897fd55-t7sfb\" (UID: \"b79e89c6-5f56-4439-ad63-a86259d4ed29\") " pod="openstack/barbican-worker-855897fd55-t7sfb" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.079793 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-config\") pod \"dnsmasq-dns-7fc46d7df7-2frp2\" (UID: \"3bd40d51-3ead-4137-9b14-2a93f44f4166\") " pod="openstack/dnsmasq-dns-7fc46d7df7-2frp2" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.082691 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-dns-svc\") pod \"dnsmasq-dns-7fc46d7df7-2frp2\" (UID: \"3bd40d51-3ead-4137-9b14-2a93f44f4166\") " pod="openstack/dnsmasq-dns-7fc46d7df7-2frp2" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.085162 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-dns-swift-storage-0\") pod \"dnsmasq-dns-7fc46d7df7-2frp2\" (UID: \"3bd40d51-3ead-4137-9b14-2a93f44f4166\") " pod="openstack/dnsmasq-dns-7fc46d7df7-2frp2" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.092742 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-ovsdbserver-sb\") pod \"dnsmasq-dns-7fc46d7df7-2frp2\" (UID: \"3bd40d51-3ead-4137-9b14-2a93f44f4166\") " pod="openstack/dnsmasq-dns-7fc46d7df7-2frp2" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.095401 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b79e89c6-5f56-4439-ad63-a86259d4ed29-config-data-custom\") pod \"barbican-worker-855897fd55-t7sfb\" (UID: \"b79e89c6-5f56-4439-ad63-a86259d4ed29\") " pod="openstack/barbican-worker-855897fd55-t7sfb" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.099036 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-ovsdbserver-nb\") pod 
\"dnsmasq-dns-7fc46d7df7-2frp2\" (UID: \"3bd40d51-3ead-4137-9b14-2a93f44f4166\") " pod="openstack/dnsmasq-dns-7fc46d7df7-2frp2" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.099229 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b79e89c6-5f56-4439-ad63-a86259d4ed29-combined-ca-bundle\") pod \"barbican-worker-855897fd55-t7sfb\" (UID: \"b79e89c6-5f56-4439-ad63-a86259d4ed29\") " pod="openstack/barbican-worker-855897fd55-t7sfb" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.103600 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4nrz\" (UniqueName: \"kubernetes.io/projected/b79e89c6-5f56-4439-ad63-a86259d4ed29-kube-api-access-v4nrz\") pod \"barbican-worker-855897fd55-t7sfb\" (UID: \"b79e89c6-5f56-4439-ad63-a86259d4ed29\") " pod="openstack/barbican-worker-855897fd55-t7sfb" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.113659 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltmfx\" (UniqueName: \"kubernetes.io/projected/3bd40d51-3ead-4137-9b14-2a93f44f4166-kube-api-access-ltmfx\") pod \"dnsmasq-dns-7fc46d7df7-2frp2\" (UID: \"3bd40d51-3ead-4137-9b14-2a93f44f4166\") " pod="openstack/dnsmasq-dns-7fc46d7df7-2frp2" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.117855 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b79e89c6-5f56-4439-ad63-a86259d4ed29-config-data\") pod \"barbican-worker-855897fd55-t7sfb\" (UID: \"b79e89c6-5f56-4439-ad63-a86259d4ed29\") " pod="openstack/barbican-worker-855897fd55-t7sfb" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.141101 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-855897fd55-t7sfb" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.150457 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-59b4f4d478-5b797" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.172322 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fc46d7df7-2frp2" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.211442 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5d4754df76-xnl78"] Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.219672 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5d4754df76-xnl78" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.222523 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.237003 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5d4754df76-xnl78"] Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.309883 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5ffd6fb588-7hftz"] Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.312280 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.316194 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-l2nzr" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.316521 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.316671 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.316814 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.316928 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.322998 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5ffd6fb588-7hftz"] Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.364066 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-fjmnw" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.368097 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.393570 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-combined-ca-bundle\") pod \"barbican-api-5d4754df76-xnl78\" (UID: \"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd\") " pod="openstack/barbican-api-5d4754df76-xnl78" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.393672 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzbbt\" (UniqueName: \"kubernetes.io/projected/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-kube-api-access-fzbbt\") pod \"barbican-api-5d4754df76-xnl78\" (UID: \"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd\") " pod="openstack/barbican-api-5d4754df76-xnl78" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.393748 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-config-data\") pod \"barbican-api-5d4754df76-xnl78\" (UID: \"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd\") " pod="openstack/barbican-api-5d4754df76-xnl78" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.393948 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-logs\") pod \"barbican-api-5d4754df76-xnl78\" (UID: \"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd\") " pod="openstack/barbican-api-5d4754df76-xnl78" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.394025 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-config-data-custom\") pod 
\"barbican-api-5d4754df76-xnl78\" (UID: \"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd\") " pod="openstack/barbican-api-5d4754df76-xnl78" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.496027 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2772ef82-fe14-4f4d-8349-8ee515e39979-scripts\") pod \"2772ef82-fe14-4f4d-8349-8ee515e39979\" (UID: \"2772ef82-fe14-4f4d-8349-8ee515e39979\") " Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.496098 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-dns-svc\") pod \"ad047cd1-309a-401e-9fc6-cb1349614136\" (UID: \"ad047cd1-309a-401e-9fc6-cb1349614136\") " Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.496221 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2772ef82-fe14-4f4d-8349-8ee515e39979-etc-machine-id\") pod \"2772ef82-fe14-4f4d-8349-8ee515e39979\" (UID: \"2772ef82-fe14-4f4d-8349-8ee515e39979\") " Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.496304 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-ovsdbserver-sb\") pod \"ad047cd1-309a-401e-9fc6-cb1349614136\" (UID: \"ad047cd1-309a-401e-9fc6-cb1349614136\") " Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.496363 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-dns-swift-storage-0\") pod \"ad047cd1-309a-401e-9fc6-cb1349614136\" (UID: \"ad047cd1-309a-401e-9fc6-cb1349614136\") " Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.496442 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-qp8jc\" (UniqueName: \"kubernetes.io/projected/ad047cd1-309a-401e-9fc6-cb1349614136-kube-api-access-qp8jc\") pod \"ad047cd1-309a-401e-9fc6-cb1349614136\" (UID: \"ad047cd1-309a-401e-9fc6-cb1349614136\") " Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.496492 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-ovsdbserver-nb\") pod \"ad047cd1-309a-401e-9fc6-cb1349614136\" (UID: \"ad047cd1-309a-401e-9fc6-cb1349614136\") " Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.496546 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2772ef82-fe14-4f4d-8349-8ee515e39979-config-data\") pod \"2772ef82-fe14-4f4d-8349-8ee515e39979\" (UID: \"2772ef82-fe14-4f4d-8349-8ee515e39979\") " Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.496645 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2772ef82-fe14-4f4d-8349-8ee515e39979-db-sync-config-data\") pod \"2772ef82-fe14-4f4d-8349-8ee515e39979\" (UID: \"2772ef82-fe14-4f4d-8349-8ee515e39979\") " Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.496729 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-config\") pod \"ad047cd1-309a-401e-9fc6-cb1349614136\" (UID: \"ad047cd1-309a-401e-9fc6-cb1349614136\") " Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.496797 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2772ef82-fe14-4f4d-8349-8ee515e39979-combined-ca-bundle\") pod \"2772ef82-fe14-4f4d-8349-8ee515e39979\" (UID: 
\"2772ef82-fe14-4f4d-8349-8ee515e39979\") " Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.496844 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5n8zp\" (UniqueName: \"kubernetes.io/projected/2772ef82-fe14-4f4d-8349-8ee515e39979-kube-api-access-5n8zp\") pod \"2772ef82-fe14-4f4d-8349-8ee515e39979\" (UID: \"2772ef82-fe14-4f4d-8349-8ee515e39979\") " Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.498313 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-scripts\") pod \"placement-5ffd6fb588-7hftz\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") " pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.498394 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-config-data\") pod \"barbican-api-5d4754df76-xnl78\" (UID: \"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd\") " pod="openstack/barbican-api-5d4754df76-xnl78" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.498425 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7bd939d8-3b22-4496-acea-ac527f3e5149-logs\") pod \"placement-5ffd6fb588-7hftz\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") " pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.498524 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-public-tls-certs\") pod \"placement-5ffd6fb588-7hftz\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") " pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:19:04 crc 
kubenswrapper[4816]: I0311 12:19:04.498610 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-logs\") pod \"barbican-api-5d4754df76-xnl78\" (UID: \"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd\") " pod="openstack/barbican-api-5d4754df76-xnl78" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.498651 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbrs4\" (UniqueName: \"kubernetes.io/projected/7bd939d8-3b22-4496-acea-ac527f3e5149-kube-api-access-xbrs4\") pod \"placement-5ffd6fb588-7hftz\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") " pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.498713 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-config-data-custom\") pod \"barbican-api-5d4754df76-xnl78\" (UID: \"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd\") " pod="openstack/barbican-api-5d4754df76-xnl78" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.498832 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-config-data\") pod \"placement-5ffd6fb588-7hftz\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") " pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.498885 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-internal-tls-certs\") pod \"placement-5ffd6fb588-7hftz\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") " pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:19:04 crc 
kubenswrapper[4816]: I0311 12:19:04.498946 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-combined-ca-bundle\") pod \"placement-5ffd6fb588-7hftz\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") " pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.499003 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-combined-ca-bundle\") pod \"barbican-api-5d4754df76-xnl78\" (UID: \"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd\") " pod="openstack/barbican-api-5d4754df76-xnl78" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.499057 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzbbt\" (UniqueName: \"kubernetes.io/projected/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-kube-api-access-fzbbt\") pod \"barbican-api-5d4754df76-xnl78\" (UID: \"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd\") " pod="openstack/barbican-api-5d4754df76-xnl78" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.501356 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2772ef82-fe14-4f4d-8349-8ee515e39979-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2772ef82-fe14-4f4d-8349-8ee515e39979" (UID: "2772ef82-fe14-4f4d-8349-8ee515e39979"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.502814 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-logs\") pod \"barbican-api-5d4754df76-xnl78\" (UID: \"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd\") " pod="openstack/barbican-api-5d4754df76-xnl78" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.506756 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2772ef82-fe14-4f4d-8349-8ee515e39979-scripts" (OuterVolumeSpecName: "scripts") pod "2772ef82-fe14-4f4d-8349-8ee515e39979" (UID: "2772ef82-fe14-4f4d-8349-8ee515e39979"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.507873 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2772ef82-fe14-4f4d-8349-8ee515e39979-kube-api-access-5n8zp" (OuterVolumeSpecName: "kube-api-access-5n8zp") pod "2772ef82-fe14-4f4d-8349-8ee515e39979" (UID: "2772ef82-fe14-4f4d-8349-8ee515e39979"). InnerVolumeSpecName "kube-api-access-5n8zp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.508352 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2772ef82-fe14-4f4d-8349-8ee515e39979-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "2772ef82-fe14-4f4d-8349-8ee515e39979" (UID: "2772ef82-fe14-4f4d-8349-8ee515e39979"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.512848 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad047cd1-309a-401e-9fc6-cb1349614136-kube-api-access-qp8jc" (OuterVolumeSpecName: "kube-api-access-qp8jc") pod "ad047cd1-309a-401e-9fc6-cb1349614136" (UID: "ad047cd1-309a-401e-9fc6-cb1349614136"). InnerVolumeSpecName "kube-api-access-qp8jc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.515727 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-config-data-custom\") pod \"barbican-api-5d4754df76-xnl78\" (UID: \"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd\") " pod="openstack/barbican-api-5d4754df76-xnl78" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.518798 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-combined-ca-bundle\") pod \"barbican-api-5d4754df76-xnl78\" (UID: \"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd\") " pod="openstack/barbican-api-5d4754df76-xnl78" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.528690 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzbbt\" (UniqueName: \"kubernetes.io/projected/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-kube-api-access-fzbbt\") pod \"barbican-api-5d4754df76-xnl78\" (UID: \"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd\") " pod="openstack/barbican-api-5d4754df76-xnl78" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.529957 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-config-data\") pod \"barbican-api-5d4754df76-xnl78\" (UID: 
\"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd\") " pod="openstack/barbican-api-5d4754df76-xnl78" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.549886 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5d4754df76-xnl78" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.558402 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2772ef82-fe14-4f4d-8349-8ee515e39979-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2772ef82-fe14-4f4d-8349-8ee515e39979" (UID: "2772ef82-fe14-4f4d-8349-8ee515e39979"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.597035 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ad047cd1-309a-401e-9fc6-cb1349614136" (UID: "ad047cd1-309a-401e-9fc6-cb1349614136"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.597240 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ad047cd1-309a-401e-9fc6-cb1349614136" (UID: "ad047cd1-309a-401e-9fc6-cb1349614136"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.600571 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-config-data\") pod \"placement-5ffd6fb588-7hftz\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") " pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.600614 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-internal-tls-certs\") pod \"placement-5ffd6fb588-7hftz\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") " pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.600651 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-combined-ca-bundle\") pod \"placement-5ffd6fb588-7hftz\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") " pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.600705 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-scripts\") pod \"placement-5ffd6fb588-7hftz\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") " pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.600734 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7bd939d8-3b22-4496-acea-ac527f3e5149-logs\") pod \"placement-5ffd6fb588-7hftz\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") " pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 
12:19:04.600773 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-public-tls-certs\") pod \"placement-5ffd6fb588-7hftz\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") " pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.600806 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbrs4\" (UniqueName: \"kubernetes.io/projected/7bd939d8-3b22-4496-acea-ac527f3e5149-kube-api-access-xbrs4\") pod \"placement-5ffd6fb588-7hftz\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") " pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.600859 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2772ef82-fe14-4f4d-8349-8ee515e39979-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.600869 4816 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.600879 4816 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2772ef82-fe14-4f4d-8349-8ee515e39979-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.600891 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qp8jc\" (UniqueName: \"kubernetes.io/projected/ad047cd1-309a-401e-9fc6-cb1349614136-kube-api-access-qp8jc\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.600900 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.600910 4816 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2772ef82-fe14-4f4d-8349-8ee515e39979-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.600919 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2772ef82-fe14-4f4d-8349-8ee515e39979-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.600927 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5n8zp\" (UniqueName: \"kubernetes.io/projected/2772ef82-fe14-4f4d-8349-8ee515e39979-kube-api-access-5n8zp\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.607319 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7bd939d8-3b22-4496-acea-ac527f3e5149-logs\") pod \"placement-5ffd6fb588-7hftz\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") " pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.625884 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-config-data\") pod \"placement-5ffd6fb588-7hftz\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") " pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.631359 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-internal-tls-certs\") pod \"placement-5ffd6fb588-7hftz\" (UID: 
\"7bd939d8-3b22-4496-acea-ac527f3e5149\") " pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.632811 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-scripts\") pod \"placement-5ffd6fb588-7hftz\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") " pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.634797 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-combined-ca-bundle\") pod \"placement-5ffd6fb588-7hftz\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") " pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.635710 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-public-tls-certs\") pod \"placement-5ffd6fb588-7hftz\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") " pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.644825 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbrs4\" (UniqueName: \"kubernetes.io/projected/7bd939d8-3b22-4496-acea-ac527f3e5149-kube-api-access-xbrs4\") pod \"placement-5ffd6fb588-7hftz\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") " pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.680177 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" event={"ID":"ad047cd1-309a-401e-9fc6-cb1349614136","Type":"ContainerDied","Data":"7732a86e8dc12bafbe8cdaac586dd615d3b76e080ef096246aeda54dd0e49383"} Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.681052 4816 scope.go:117] 
"RemoveContainer" containerID="ccafb95fbf3f12326123ae581a70f3b9eefd2d320c697240864a31290ea2a66c" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.681488 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.689961 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fjmnw" event={"ID":"2772ef82-fe14-4f4d-8349-8ee515e39979","Type":"ContainerDied","Data":"f11050b66cf18643ca807dd8a6fddbe1c30160c5ebaa861b516a6a0d311fa422"} Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.690002 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f11050b66cf18643ca807dd8a6fddbe1c30160c5ebaa861b516a6a0d311fa422" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.690054 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-fjmnw" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.736826 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-config" (OuterVolumeSpecName: "config") pod "ad047cd1-309a-401e-9fc6-cb1349614136" (UID: "ad047cd1-309a-401e-9fc6-cb1349614136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.736956 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ad047cd1-309a-401e-9fc6-cb1349614136" (UID: "ad047cd1-309a-401e-9fc6-cb1349614136"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.737155 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ad047cd1-309a-401e-9fc6-cb1349614136" (UID: "ad047cd1-309a-401e-9fc6-cb1349614136"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.743374 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2772ef82-fe14-4f4d-8349-8ee515e39979-config-data" (OuterVolumeSpecName: "config-data") pod "2772ef82-fe14-4f4d-8349-8ee515e39979" (UID: "2772ef82-fe14-4f4d-8349-8ee515e39979"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.805220 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.805636 4816 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.805647 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2772ef82-fe14-4f4d-8349-8ee515e39979-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.805657 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-config\") on node \"crc\" DevicePath \"\"" 
Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.932835 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.025861 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 11 12:19:05 crc kubenswrapper[4816]: E0311 12:19:05.026477 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad047cd1-309a-401e-9fc6-cb1349614136" containerName="dnsmasq-dns" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.026502 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad047cd1-309a-401e-9fc6-cb1349614136" containerName="dnsmasq-dns" Mar 11 12:19:05 crc kubenswrapper[4816]: E0311 12:19:05.026552 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2772ef82-fe14-4f4d-8349-8ee515e39979" containerName="cinder-db-sync" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.026563 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2772ef82-fe14-4f4d-8349-8ee515e39979" containerName="cinder-db-sync" Mar 11 12:19:05 crc kubenswrapper[4816]: E0311 12:19:05.026578 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad047cd1-309a-401e-9fc6-cb1349614136" containerName="init" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.026587 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad047cd1-309a-401e-9fc6-cb1349614136" containerName="init" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.026836 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="2772ef82-fe14-4f4d-8349-8ee515e39979" containerName="cinder-db-sync" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.026886 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad047cd1-309a-401e-9fc6-cb1349614136" containerName="dnsmasq-dns" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.028369 4816 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.037123 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.037532 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.037565 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-6qw4t" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.037871 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.067693 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.104601 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fc46d7df7-2frp2"] Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.119187 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.119240 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z98gf\" (UniqueName: \"kubernetes.io/projected/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-kube-api-access-z98gf\") pod \"cinder-scheduler-0\" (UID: \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.119321 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-config-data\") pod \"cinder-scheduler-0\" (UID: \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.119346 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.119369 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-scripts\") pod \"cinder-scheduler-0\" (UID: \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.119407 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.119504 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-759cc7f497-2nfvt"] Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.135470 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-759cc7f497-2nfvt"] Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.147231 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58b85ccffc-7gcck"] Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.148935 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.166338 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58b85ccffc-7gcck"] Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.192641 4816 scope.go:117] "RemoveContainer" containerID="f842cab6fbb753d4036f93abfc735f41fd91ab93fdcba8e10b330c30d7aa8346" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.201682 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-59b4f4d478-5b797"] Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.221491 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.221552 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z98gf\" (UniqueName: \"kubernetes.io/projected/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-kube-api-access-z98gf\") pod \"cinder-scheduler-0\" (UID: \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.221634 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-config-data\") pod \"cinder-scheduler-0\" (UID: \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.221657 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: 
\"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.221680 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-scripts\") pod \"cinder-scheduler-0\" (UID: \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.221719 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.222106 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.232394 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.233196 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.237753 4816 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-scripts\") pod \"cinder-scheduler-0\" (UID: \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.238296 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-config-data\") pod \"cinder-scheduler-0\" (UID: \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.239467 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fc46d7df7-2frp2"] Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.249053 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z98gf\" (UniqueName: \"kubernetes.io/projected/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-kube-api-access-z98gf\") pod \"cinder-scheduler-0\" (UID: \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.249186 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-855897fd55-t7sfb"] Mar 11 12:19:05 crc kubenswrapper[4816]: W0311 12:19:05.254449 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podddd535a1_7585_4cb7_94ec_f4b98b10be4a.slice/crio-84c045541bc73afd53de86393645863c006080b89347feb36d269d40b0b6ac28 WatchSource:0}: Error finding container 84c045541bc73afd53de86393645863c006080b89347feb36d269d40b0b6ac28: Status 404 returned error can't find the container with id 84c045541bc73afd53de86393645863c006080b89347feb36d269d40b0b6ac28 Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.326503 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-dns-svc\") pod \"dnsmasq-dns-58b85ccffc-7gcck\" (UID: \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\") " pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.337025 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-dns-swift-storage-0\") pod \"dnsmasq-dns-58b85ccffc-7gcck\" (UID: \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\") " pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.337385 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-config\") pod \"dnsmasq-dns-58b85ccffc-7gcck\" (UID: \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\") " pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.337431 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-ovsdbserver-sb\") pod \"dnsmasq-dns-58b85ccffc-7gcck\" (UID: \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\") " pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.337467 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-ovsdbserver-nb\") pod \"dnsmasq-dns-58b85ccffc-7gcck\" (UID: \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\") " pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.337499 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwh8q\" (UniqueName: \"kubernetes.io/projected/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-kube-api-access-kwh8q\") pod \"dnsmasq-dns-58b85ccffc-7gcck\" (UID: \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\") " pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.379833 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.380337 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.393176 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.399933 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.400516 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.443437 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/43eac2c3-bace-4682-b48e-f063d6653733-config-data-custom\") pod \"cinder-api-0\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " pod="openstack/cinder-api-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.443511 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwzct\" (UniqueName: \"kubernetes.io/projected/43eac2c3-bace-4682-b48e-f063d6653733-kube-api-access-vwzct\") pod \"cinder-api-0\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " pod="openstack/cinder-api-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.443564 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-dns-svc\") pod \"dnsmasq-dns-58b85ccffc-7gcck\" (UID: \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\") " pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.443596 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-dns-swift-storage-0\") pod \"dnsmasq-dns-58b85ccffc-7gcck\" (UID: \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\") " pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.443624 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43eac2c3-bace-4682-b48e-f063d6653733-logs\") pod \"cinder-api-0\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " pod="openstack/cinder-api-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.443675 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43eac2c3-bace-4682-b48e-f063d6653733-config-data\") pod \"cinder-api-0\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " pod="openstack/cinder-api-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.443705 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-config\") pod \"dnsmasq-dns-58b85ccffc-7gcck\" (UID: \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\") " pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.443720 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-ovsdbserver-sb\") pod \"dnsmasq-dns-58b85ccffc-7gcck\" (UID: \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\") " pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.443736 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-ovsdbserver-nb\") pod \"dnsmasq-dns-58b85ccffc-7gcck\" (UID: \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\") " pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.443755 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwh8q\" (UniqueName: \"kubernetes.io/projected/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-kube-api-access-kwh8q\") pod \"dnsmasq-dns-58b85ccffc-7gcck\" (UID: \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\") " pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.443774 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/43eac2c3-bace-4682-b48e-f063d6653733-etc-machine-id\") pod \"cinder-api-0\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " pod="openstack/cinder-api-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.443792 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43eac2c3-bace-4682-b48e-f063d6653733-scripts\") pod \"cinder-api-0\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " pod="openstack/cinder-api-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.443811 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/43eac2c3-bace-4682-b48e-f063d6653733-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " pod="openstack/cinder-api-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.444866 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-dns-svc\") pod \"dnsmasq-dns-58b85ccffc-7gcck\" (UID: \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\") " pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.445545 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-dns-swift-storage-0\") pod \"dnsmasq-dns-58b85ccffc-7gcck\" (UID: \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\") " pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.446581 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-config\") pod \"dnsmasq-dns-58b85ccffc-7gcck\" (UID: \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\") " pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.447351 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-ovsdbserver-sb\") pod \"dnsmasq-dns-58b85ccffc-7gcck\" (UID: \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\") " pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.450629 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-ovsdbserver-nb\") pod \"dnsmasq-dns-58b85ccffc-7gcck\" (UID: 
\"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\") " pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.467701 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwh8q\" (UniqueName: \"kubernetes.io/projected/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-kube-api-access-kwh8q\") pod \"dnsmasq-dns-58b85ccffc-7gcck\" (UID: \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\") " pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.546290 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/43eac2c3-bace-4682-b48e-f063d6653733-etc-machine-id\") pod \"cinder-api-0\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " pod="openstack/cinder-api-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.546856 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43eac2c3-bace-4682-b48e-f063d6653733-scripts\") pod \"cinder-api-0\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " pod="openstack/cinder-api-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.546901 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43eac2c3-bace-4682-b48e-f063d6653733-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " pod="openstack/cinder-api-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.546960 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/43eac2c3-bace-4682-b48e-f063d6653733-config-data-custom\") pod \"cinder-api-0\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " pod="openstack/cinder-api-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.547002 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vwzct\" (UniqueName: \"kubernetes.io/projected/43eac2c3-bace-4682-b48e-f063d6653733-kube-api-access-vwzct\") pod \"cinder-api-0\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " pod="openstack/cinder-api-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.547092 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43eac2c3-bace-4682-b48e-f063d6653733-logs\") pod \"cinder-api-0\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " pod="openstack/cinder-api-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.547156 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43eac2c3-bace-4682-b48e-f063d6653733-config-data\") pod \"cinder-api-0\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " pod="openstack/cinder-api-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.546475 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/43eac2c3-bace-4682-b48e-f063d6653733-etc-machine-id\") pod \"cinder-api-0\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " pod="openstack/cinder-api-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.548515 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43eac2c3-bace-4682-b48e-f063d6653733-logs\") pod \"cinder-api-0\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " pod="openstack/cinder-api-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.555336 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43eac2c3-bace-4682-b48e-f063d6653733-scripts\") pod \"cinder-api-0\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " pod="openstack/cinder-api-0" Mar 11 12:19:05 crc 
kubenswrapper[4816]: I0311 12:19:05.559784 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43eac2c3-bace-4682-b48e-f063d6653733-config-data\") pod \"cinder-api-0\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " pod="openstack/cinder-api-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.562854 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43eac2c3-bace-4682-b48e-f063d6653733-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " pod="openstack/cinder-api-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.563436 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/43eac2c3-bace-4682-b48e-f063d6653733-config-data-custom\") pod \"cinder-api-0\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " pod="openstack/cinder-api-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.573841 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwzct\" (UniqueName: \"kubernetes.io/projected/43eac2c3-bace-4682-b48e-f063d6653733-kube-api-access-vwzct\") pod \"cinder-api-0\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " pod="openstack/cinder-api-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.614514 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.732115 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fc46d7df7-2frp2" event={"ID":"3bd40d51-3ead-4137-9b14-2a93f44f4166","Type":"ContainerStarted","Data":"e340351756a9ee01fd1961ba595c2cad8bbf26c5f081172dd9ef510e6ebc5cd5"} Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.737916 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.758807 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-855897fd55-t7sfb" event={"ID":"b79e89c6-5f56-4439-ad63-a86259d4ed29","Type":"ContainerStarted","Data":"65e8dd7e6335c0228a44e94f23c28e5cede1dd965bd20e6b4cf61bc69bb5386a"} Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.760080 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-59b4f4d478-5b797" event={"ID":"ddd535a1-7585-4cb7-94ec-f4b98b10be4a","Type":"ContainerStarted","Data":"84c045541bc73afd53de86393645863c006080b89347feb36d269d40b0b6ac28"} Mar 11 12:19:05 crc kubenswrapper[4816]: W0311 12:19:05.984747 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba61db44_272d_4f1c_b3c6_d3fe1edb38bd.slice/crio-9cc4c282c9e0a53abd8b5254615b71e35bd7cba821c5895a1166b86769ee9a4f WatchSource:0}: Error finding container 9cc4c282c9e0a53abd8b5254615b71e35bd7cba821c5895a1166b86769ee9a4f: Status 404 returned error can't find the container with id 9cc4c282c9e0a53abd8b5254615b71e35bd7cba821c5895a1166b86769ee9a4f Mar 11 12:19:06 crc kubenswrapper[4816]: I0311 12:19:06.020845 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5d4754df76-xnl78"] Mar 11 12:19:06 crc kubenswrapper[4816]: I0311 12:19:06.167585 4816 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="ad047cd1-309a-401e-9fc6-cb1349614136" path="/var/lib/kubelet/pods/ad047cd1-309a-401e-9fc6-cb1349614136/volumes" Mar 11 12:19:06 crc kubenswrapper[4816]: I0311 12:19:06.169129 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5ffd6fb588-7hftz"] Mar 11 12:19:06 crc kubenswrapper[4816]: I0311 12:19:06.227793 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 11 12:19:06 crc kubenswrapper[4816]: W0311 12:19:06.234969 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8f76a92_4234_474b_bca2_f5d9cbbec8f2.slice/crio-965649e801e9747f00b8263a5feeac3e050a872f0198fc9e14a40e62b55571cf WatchSource:0}: Error finding container 965649e801e9747f00b8263a5feeac3e050a872f0198fc9e14a40e62b55571cf: Status 404 returned error can't find the container with id 965649e801e9747f00b8263a5feeac3e050a872f0198fc9e14a40e62b55571cf Mar 11 12:19:06 crc kubenswrapper[4816]: I0311 12:19:06.678847 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 11 12:19:06 crc kubenswrapper[4816]: I0311 12:19:06.711523 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58b85ccffc-7gcck"] Mar 11 12:19:06 crc kubenswrapper[4816]: W0311 12:19:06.756266 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f7f295b_c30d_49a7_b5fa_b1ae8f705589.slice/crio-22b1daa75682bd6ac40d3753e3d1220fc2183f782012b1a65bc963f4cb8ba7ec WatchSource:0}: Error finding container 22b1daa75682bd6ac40d3753e3d1220fc2183f782012b1a65bc963f4cb8ba7ec: Status 404 returned error can't find the container with id 22b1daa75682bd6ac40d3753e3d1220fc2183f782012b1a65bc963f4cb8ba7ec Mar 11 12:19:06 crc kubenswrapper[4816]: I0311 12:19:06.799547 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"1ebe3f2a-5719-412c-8803-15e1bec74523","Type":"ContainerStarted","Data":"56bec0a6969c2a35ea2359b8b9e2a0d4a80229380dac2fa1a5de2e56cab22e4a"} Mar 11 12:19:06 crc kubenswrapper[4816]: I0311 12:19:06.799818 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1ebe3f2a-5719-412c-8803-15e1bec74523" containerName="ceilometer-central-agent" containerID="cri-o://2e2e079065db719aeee528343ea5a717b2f18beff4bcac65acd806fa3d456edf" gracePeriod=30 Mar 11 12:19:06 crc kubenswrapper[4816]: I0311 12:19:06.800339 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 11 12:19:06 crc kubenswrapper[4816]: I0311 12:19:06.800713 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1ebe3f2a-5719-412c-8803-15e1bec74523" containerName="ceilometer-notification-agent" containerID="cri-o://785459bf5361b538fca731d5c9459763253d45826f0befba834e333e7e6a0dde" gracePeriod=30 Mar 11 12:19:06 crc kubenswrapper[4816]: I0311 12:19:06.800762 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1ebe3f2a-5719-412c-8803-15e1bec74523" containerName="sg-core" containerID="cri-o://fd9f9269933cdf626acdae81b166112cef4742071274667ac737f1fb43d6eaa0" gracePeriod=30 Mar 11 12:19:06 crc kubenswrapper[4816]: I0311 12:19:06.800812 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1ebe3f2a-5719-412c-8803-15e1bec74523" containerName="proxy-httpd" containerID="cri-o://56bec0a6969c2a35ea2359b8b9e2a0d4a80229380dac2fa1a5de2e56cab22e4a" gracePeriod=30 Mar 11 12:19:06 crc kubenswrapper[4816]: I0311 12:19:06.816636 4816 generic.go:334] "Generic (PLEG): container finished" podID="3bd40d51-3ead-4137-9b14-2a93f44f4166" containerID="976f0996d7f32cec3b2ed81142b0919faa0b23eb7e4ba00fa314a16f1166512f" exitCode=0 
Mar 11 12:19:06 crc kubenswrapper[4816]: I0311 12:19:06.816861 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fc46d7df7-2frp2" event={"ID":"3bd40d51-3ead-4137-9b14-2a93f44f4166","Type":"ContainerDied","Data":"976f0996d7f32cec3b2ed81142b0919faa0b23eb7e4ba00fa314a16f1166512f"} Mar 11 12:19:06 crc kubenswrapper[4816]: I0311 12:19:06.826326 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.272447595 podStartE2EDuration="54.826308036s" podCreationTimestamp="2026-03-11 12:18:12 +0000 UTC" firstStartedPulling="2026-03-11 12:18:14.841785745 +0000 UTC m=+1181.433049712" lastFinishedPulling="2026-03-11 12:19:05.395646186 +0000 UTC m=+1231.986910153" observedRunningTime="2026-03-11 12:19:06.821695334 +0000 UTC m=+1233.412959301" watchObservedRunningTime="2026-03-11 12:19:06.826308036 +0000 UTC m=+1233.417572003" Mar 11 12:19:06 crc kubenswrapper[4816]: I0311 12:19:06.837197 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d4754df76-xnl78" event={"ID":"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd","Type":"ContainerStarted","Data":"5ba8a8c2543ebb94e1b68f6aeb2566f2e416672e55badbfef9432d4a75b3a2bf"} Mar 11 12:19:06 crc kubenswrapper[4816]: I0311 12:19:06.837273 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d4754df76-xnl78" event={"ID":"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd","Type":"ContainerStarted","Data":"9cc4c282c9e0a53abd8b5254615b71e35bd7cba821c5895a1166b86769ee9a4f"} Mar 11 12:19:06 crc kubenswrapper[4816]: I0311 12:19:06.869136 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" event={"ID":"1f7f295b-c30d-49a7-b5fa-b1ae8f705589","Type":"ContainerStarted","Data":"22b1daa75682bd6ac40d3753e3d1220fc2183f782012b1a65bc963f4cb8ba7ec"} Mar 11 12:19:06 crc kubenswrapper[4816]: I0311 12:19:06.876026 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-api-0" event={"ID":"43eac2c3-bace-4682-b48e-f063d6653733","Type":"ContainerStarted","Data":"d1d101cb43433bc7eb7c833f258e91530ee7e5c09a0712cf4851d690643adb2a"} Mar 11 12:19:06 crc kubenswrapper[4816]: I0311 12:19:06.882515 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5ffd6fb588-7hftz" event={"ID":"7bd939d8-3b22-4496-acea-ac527f3e5149","Type":"ContainerStarted","Data":"584cd4107522305bdba692719070a92eec3324ee2da427663b64c0c877cbea0c"} Mar 11 12:19:06 crc kubenswrapper[4816]: I0311 12:19:06.887961 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a8f76a92-4234-474b-bca2-f5d9cbbec8f2","Type":"ContainerStarted","Data":"965649e801e9747f00b8263a5feeac3e050a872f0198fc9e14a40e62b55571cf"} Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.620269 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fc46d7df7-2frp2" Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.705684 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-config\") pod \"3bd40d51-3ead-4137-9b14-2a93f44f4166\" (UID: \"3bd40d51-3ead-4137-9b14-2a93f44f4166\") " Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.705744 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltmfx\" (UniqueName: \"kubernetes.io/projected/3bd40d51-3ead-4137-9b14-2a93f44f4166-kube-api-access-ltmfx\") pod \"3bd40d51-3ead-4137-9b14-2a93f44f4166\" (UID: \"3bd40d51-3ead-4137-9b14-2a93f44f4166\") " Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.705868 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-dns-svc\") pod \"3bd40d51-3ead-4137-9b14-2a93f44f4166\" (UID: 
\"3bd40d51-3ead-4137-9b14-2a93f44f4166\") " Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.705928 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-ovsdbserver-nb\") pod \"3bd40d51-3ead-4137-9b14-2a93f44f4166\" (UID: \"3bd40d51-3ead-4137-9b14-2a93f44f4166\") " Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.706038 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-ovsdbserver-sb\") pod \"3bd40d51-3ead-4137-9b14-2a93f44f4166\" (UID: \"3bd40d51-3ead-4137-9b14-2a93f44f4166\") " Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.706074 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-dns-swift-storage-0\") pod \"3bd40d51-3ead-4137-9b14-2a93f44f4166\" (UID: \"3bd40d51-3ead-4137-9b14-2a93f44f4166\") " Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.718128 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bd40d51-3ead-4137-9b14-2a93f44f4166-kube-api-access-ltmfx" (OuterVolumeSpecName: "kube-api-access-ltmfx") pod "3bd40d51-3ead-4137-9b14-2a93f44f4166" (UID: "3bd40d51-3ead-4137-9b14-2a93f44f4166"). InnerVolumeSpecName "kube-api-access-ltmfx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.809031 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.821133 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltmfx\" (UniqueName: \"kubernetes.io/projected/3bd40d51-3ead-4137-9b14-2a93f44f4166-kube-api-access-ltmfx\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.835695 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3bd40d51-3ead-4137-9b14-2a93f44f4166" (UID: "3bd40d51-3ead-4137-9b14-2a93f44f4166"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.881822 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3bd40d51-3ead-4137-9b14-2a93f44f4166" (UID: "3bd40d51-3ead-4137-9b14-2a93f44f4166"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.884944 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3bd40d51-3ead-4137-9b14-2a93f44f4166" (UID: "3bd40d51-3ead-4137-9b14-2a93f44f4166"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.910764 4816 generic.go:334] "Generic (PLEG): container finished" podID="1ebe3f2a-5719-412c-8803-15e1bec74523" containerID="56bec0a6969c2a35ea2359b8b9e2a0d4a80229380dac2fa1a5de2e56cab22e4a" exitCode=0 Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.910993 4816 generic.go:334] "Generic (PLEG): container finished" podID="1ebe3f2a-5719-412c-8803-15e1bec74523" containerID="fd9f9269933cdf626acdae81b166112cef4742071274667ac737f1fb43d6eaa0" exitCode=2 Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.911005 4816 generic.go:334] "Generic (PLEG): container finished" podID="1ebe3f2a-5719-412c-8803-15e1bec74523" containerID="2e2e079065db719aeee528343ea5a717b2f18beff4bcac65acd806fa3d456edf" exitCode=0 Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.910962 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ebe3f2a-5719-412c-8803-15e1bec74523","Type":"ContainerDied","Data":"56bec0a6969c2a35ea2359b8b9e2a0d4a80229380dac2fa1a5de2e56cab22e4a"} Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.911064 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ebe3f2a-5719-412c-8803-15e1bec74523","Type":"ContainerDied","Data":"fd9f9269933cdf626acdae81b166112cef4742071274667ac737f1fb43d6eaa0"} Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.911074 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ebe3f2a-5719-412c-8803-15e1bec74523","Type":"ContainerDied","Data":"2e2e079065db719aeee528343ea5a717b2f18beff4bcac65acd806fa3d456edf"} Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.913847 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fc46d7df7-2frp2" 
event={"ID":"3bd40d51-3ead-4137-9b14-2a93f44f4166","Type":"ContainerDied","Data":"e340351756a9ee01fd1961ba595c2cad8bbf26c5f081172dd9ef510e6ebc5cd5"} Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.913889 4816 scope.go:117] "RemoveContainer" containerID="976f0996d7f32cec3b2ed81142b0919faa0b23eb7e4ba00fa314a16f1166512f" Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.914049 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fc46d7df7-2frp2" Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.915884 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-config" (OuterVolumeSpecName: "config") pod "3bd40d51-3ead-4137-9b14-2a93f44f4166" (UID: "3bd40d51-3ead-4137-9b14-2a93f44f4166"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.917363 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d4754df76-xnl78" event={"ID":"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd","Type":"ContainerStarted","Data":"eaee8f2b001ecac77cd66b481deeba2cae3b59ceedc01017e976649a89d1fa8d"} Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.918371 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5d4754df76-xnl78" Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.918399 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5d4754df76-xnl78" Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.923018 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5ffd6fb588-7hftz" event={"ID":"7bd939d8-3b22-4496-acea-ac527f3e5149","Type":"ContainerStarted","Data":"3acd68e155620ecc4260fb5ba2dfe8af8d211b5066fc4c67c7f8658e47beb43f"} Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.923602 4816 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3bd40d51-3ead-4137-9b14-2a93f44f4166" (UID: "3bd40d51-3ead-4137-9b14-2a93f44f4166"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.928332 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.928370 4816 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.928387 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.928399 4816 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.928409 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.950690 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5d4754df76-xnl78" podStartSLOduration=3.950660793 podStartE2EDuration="3.950660793s" podCreationTimestamp="2026-03-11 12:19:04 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:19:07.940888024 +0000 UTC m=+1234.532151991" watchObservedRunningTime="2026-03-11 12:19:07.950660793 +0000 UTC m=+1234.541924760" Mar 11 12:19:08 crc kubenswrapper[4816]: I0311 12:19:08.287551 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fc46d7df7-2frp2"] Mar 11 12:19:08 crc kubenswrapper[4816]: I0311 12:19:08.287819 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fc46d7df7-2frp2"] Mar 11 12:19:08 crc kubenswrapper[4816]: I0311 12:19:08.863408 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" podUID="ad047cd1-309a-401e-9fc6-cb1349614136" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.148:5353: i/o timeout" Mar 11 12:19:08 crc kubenswrapper[4816]: I0311 12:19:08.939301 4816 generic.go:334] "Generic (PLEG): container finished" podID="1f7f295b-c30d-49a7-b5fa-b1ae8f705589" containerID="93f7e9a29f416a66c2cb4ed0a6e544aeef7946ff21347d285940dc6a7bb96603" exitCode=0 Mar 11 12:19:08 crc kubenswrapper[4816]: I0311 12:19:08.939366 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" event={"ID":"1f7f295b-c30d-49a7-b5fa-b1ae8f705589","Type":"ContainerDied","Data":"93f7e9a29f416a66c2cb4ed0a6e544aeef7946ff21347d285940dc6a7bb96603"} Mar 11 12:19:08 crc kubenswrapper[4816]: I0311 12:19:08.942414 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"43eac2c3-bace-4682-b48e-f063d6653733","Type":"ContainerStarted","Data":"346170c8c6b811872540539f7b2570fc326b6427186b6e7d7e167645153015dd"} Mar 11 12:19:09 crc kubenswrapper[4816]: I0311 12:19:09.514926 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:19:09 crc kubenswrapper[4816]: I0311 12:19:09.515461 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:19:09 crc kubenswrapper[4816]: I0311 12:19:09.515516 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:19:09 crc kubenswrapper[4816]: I0311 12:19:09.516478 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"92bc406893843c03ac9aa6138b10c838c501d62aa37baf4b9b92254baf796e96"} pod="openshift-machine-config-operator/machine-config-daemon-b4v82" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 12:19:09 crc kubenswrapper[4816]: I0311 12:19:09.516529 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" containerID="cri-o://92bc406893843c03ac9aa6138b10c838c501d62aa37baf4b9b92254baf796e96" gracePeriod=600 Mar 11 12:19:09 crc kubenswrapper[4816]: I0311 12:19:09.959678 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5ffd6fb588-7hftz" event={"ID":"7bd939d8-3b22-4496-acea-ac527f3e5149","Type":"ContainerStarted","Data":"6309388e250c5434fd6b39ddce96cacd594c9880dd57d2c9e89074cac30a961b"} Mar 11 12:19:09 crc kubenswrapper[4816]: I0311 12:19:09.961335 4816 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:19:09 crc kubenswrapper[4816]: I0311 12:19:09.961357 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:19:09 crc kubenswrapper[4816]: I0311 12:19:09.963420 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a8f76a92-4234-474b-bca2-f5d9cbbec8f2","Type":"ContainerStarted","Data":"cd4f48e197b603f62e0bf20ff6682b81e7b6286b2b2f453bd8e109136a0f4621"} Mar 11 12:19:09 crc kubenswrapper[4816]: I0311 12:19:09.967411 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-855897fd55-t7sfb" event={"ID":"b79e89c6-5f56-4439-ad63-a86259d4ed29","Type":"ContainerStarted","Data":"4e741a528a024acf7a27b5a7253bef28cff4a22ea41c625ba24158e8c7be76eb"} Mar 11 12:19:09 crc kubenswrapper[4816]: I0311 12:19:09.967450 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-855897fd55-t7sfb" event={"ID":"b79e89c6-5f56-4439-ad63-a86259d4ed29","Type":"ContainerStarted","Data":"f675def681ebf7bc955ad7437f5bae6532f22f4db4a832aa48a182650e749af2"} Mar 11 12:19:09 crc kubenswrapper[4816]: I0311 12:19:09.970988 4816 generic.go:334] "Generic (PLEG): container finished" podID="7fdff21c-644f-4443-a268-f98c91ea120a" containerID="92bc406893843c03ac9aa6138b10c838c501d62aa37baf4b9b92254baf796e96" exitCode=0 Mar 11 12:19:09 crc kubenswrapper[4816]: I0311 12:19:09.971458 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerDied","Data":"92bc406893843c03ac9aa6138b10c838c501d62aa37baf4b9b92254baf796e96"} Mar 11 12:19:09 crc kubenswrapper[4816]: I0311 12:19:09.971496 4816 scope.go:117] "RemoveContainer" containerID="13e7eed3f44dcb7bba59d21f6a1bb4bc9f4b869b7a25106a79ff8ceef1b9e507" Mar 11 12:19:09 crc kubenswrapper[4816]: 
I0311 12:19:09.983484 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5ffd6fb588-7hftz" podStartSLOduration=5.983466058 podStartE2EDuration="5.983466058s" podCreationTimestamp="2026-03-11 12:19:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:19:09.982970864 +0000 UTC m=+1236.574234841" watchObservedRunningTime="2026-03-11 12:19:09.983466058 +0000 UTC m=+1236.574730015" Mar 11 12:19:09 crc kubenswrapper[4816]: I0311 12:19:09.989800 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-59b4f4d478-5b797" event={"ID":"ddd535a1-7585-4cb7-94ec-f4b98b10be4a","Type":"ContainerStarted","Data":"a2fe652a36263402ff94fa1d4ec821be087bc6255f2da08fbe025571394de207"} Mar 11 12:19:09 crc kubenswrapper[4816]: I0311 12:19:09.989839 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-59b4f4d478-5b797" event={"ID":"ddd535a1-7585-4cb7-94ec-f4b98b10be4a","Type":"ContainerStarted","Data":"5cfae0145ad988b78f57674ae7aa14b5835657d9dac7b0c977c144c0d4304d85"} Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.000354 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" event={"ID":"1f7f295b-c30d-49a7-b5fa-b1ae8f705589","Type":"ContainerStarted","Data":"2e2f32cf352c18f7e4bac10b432260956461e7a4bef8cc47289dc42ec91bc8c2"} Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.002491 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.022372 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="43eac2c3-bace-4682-b48e-f063d6653733" containerName="cinder-api-log" 
containerID="cri-o://346170c8c6b811872540539f7b2570fc326b6427186b6e7d7e167645153015dd" gracePeriod=30 Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.022656 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"43eac2c3-bace-4682-b48e-f063d6653733","Type":"ContainerStarted","Data":"0dc3816fea03c51cbbb58023865a3dee996cbbc76475be49172b8d011f579193"} Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.023163 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.022704 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="43eac2c3-bace-4682-b48e-f063d6653733" containerName="cinder-api" containerID="cri-o://0dc3816fea03c51cbbb58023865a3dee996cbbc76475be49172b8d011f579193" gracePeriod=30 Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.044568 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-855897fd55-t7sfb" podStartSLOduration=3.327157953 podStartE2EDuration="7.044548663s" podCreationTimestamp="2026-03-11 12:19:03 +0000 UTC" firstStartedPulling="2026-03-11 12:19:05.305465149 +0000 UTC m=+1231.896729116" lastFinishedPulling="2026-03-11 12:19:09.022855839 +0000 UTC m=+1235.614119826" observedRunningTime="2026-03-11 12:19:10.020457535 +0000 UTC m=+1236.611721502" watchObservedRunningTime="2026-03-11 12:19:10.044548663 +0000 UTC m=+1236.635812630" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.067041 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-59b4f4d478-5b797" podStartSLOduration=3.354487383 podStartE2EDuration="7.066992004s" podCreationTimestamp="2026-03-11 12:19:03 +0000 UTC" firstStartedPulling="2026-03-11 12:19:05.310168493 +0000 UTC m=+1231.901432460" lastFinishedPulling="2026-03-11 12:19:09.022673114 +0000 UTC 
m=+1235.613937081" observedRunningTime="2026-03-11 12:19:10.042184986 +0000 UTC m=+1236.633448953" watchObservedRunningTime="2026-03-11 12:19:10.066992004 +0000 UTC m=+1236.658255961" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.166975 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bd40d51-3ead-4137-9b14-2a93f44f4166" path="/var/lib/kubelet/pods/3bd40d51-3ead-4137-9b14-2a93f44f4166/volumes" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.171767 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.171741748 podStartE2EDuration="5.171741748s" podCreationTimestamp="2026-03-11 12:19:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:19:10.143806169 +0000 UTC m=+1236.735070146" watchObservedRunningTime="2026-03-11 12:19:10.171741748 +0000 UTC m=+1236.763005715" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.175361 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" podStartSLOduration=5.17534163 podStartE2EDuration="5.17534163s" podCreationTimestamp="2026-03-11 12:19:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:19:10.101699816 +0000 UTC m=+1236.692963803" watchObservedRunningTime="2026-03-11 12:19:10.17534163 +0000 UTC m=+1236.766605597" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.375360 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-64b59f8d4-2vxd9"] Mar 11 12:19:10 crc kubenswrapper[4816]: E0311 12:19:10.376622 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bd40d51-3ead-4137-9b14-2a93f44f4166" containerName="init" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.376781 4816 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="3bd40d51-3ead-4137-9b14-2a93f44f4166" containerName="init" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.377271 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bd40d51-3ead-4137-9b14-2a93f44f4166" containerName="init" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.378767 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-64b59f8d4-2vxd9" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.385970 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.386558 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.423015 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-64b59f8d4-2vxd9"] Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.543848 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-combined-ca-bundle\") pod \"barbican-api-64b59f8d4-2vxd9\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " pod="openstack/barbican-api-64b59f8d4-2vxd9" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.543906 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-internal-tls-certs\") pod \"barbican-api-64b59f8d4-2vxd9\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " pod="openstack/barbican-api-64b59f8d4-2vxd9" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.544050 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-config-data\") pod \"barbican-api-64b59f8d4-2vxd9\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " pod="openstack/barbican-api-64b59f8d4-2vxd9" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.544128 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgm8j\" (UniqueName: \"kubernetes.io/projected/7795071e-2de0-43cb-b225-cfed54570d94-kube-api-access-mgm8j\") pod \"barbican-api-64b59f8d4-2vxd9\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " pod="openstack/barbican-api-64b59f8d4-2vxd9" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.544153 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-config-data-custom\") pod \"barbican-api-64b59f8d4-2vxd9\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " pod="openstack/barbican-api-64b59f8d4-2vxd9" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.544202 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-public-tls-certs\") pod \"barbican-api-64b59f8d4-2vxd9\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " pod="openstack/barbican-api-64b59f8d4-2vxd9" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.544222 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7795071e-2de0-43cb-b225-cfed54570d94-logs\") pod \"barbican-api-64b59f8d4-2vxd9\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " pod="openstack/barbican-api-64b59f8d4-2vxd9" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.646066 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-combined-ca-bundle\") pod \"barbican-api-64b59f8d4-2vxd9\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " pod="openstack/barbican-api-64b59f8d4-2vxd9" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.646126 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-internal-tls-certs\") pod \"barbican-api-64b59f8d4-2vxd9\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " pod="openstack/barbican-api-64b59f8d4-2vxd9" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.646226 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-config-data\") pod \"barbican-api-64b59f8d4-2vxd9\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " pod="openstack/barbican-api-64b59f8d4-2vxd9" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.646309 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgm8j\" (UniqueName: \"kubernetes.io/projected/7795071e-2de0-43cb-b225-cfed54570d94-kube-api-access-mgm8j\") pod \"barbican-api-64b59f8d4-2vxd9\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " pod="openstack/barbican-api-64b59f8d4-2vxd9" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.646330 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-config-data-custom\") pod \"barbican-api-64b59f8d4-2vxd9\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " pod="openstack/barbican-api-64b59f8d4-2vxd9" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.646376 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-public-tls-certs\") pod \"barbican-api-64b59f8d4-2vxd9\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " pod="openstack/barbican-api-64b59f8d4-2vxd9" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.646393 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7795071e-2de0-43cb-b225-cfed54570d94-logs\") pod \"barbican-api-64b59f8d4-2vxd9\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " pod="openstack/barbican-api-64b59f8d4-2vxd9" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.647033 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7795071e-2de0-43cb-b225-cfed54570d94-logs\") pod \"barbican-api-64b59f8d4-2vxd9\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " pod="openstack/barbican-api-64b59f8d4-2vxd9" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.659815 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-config-data-custom\") pod \"barbican-api-64b59f8d4-2vxd9\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " pod="openstack/barbican-api-64b59f8d4-2vxd9" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.660358 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-internal-tls-certs\") pod \"barbican-api-64b59f8d4-2vxd9\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " pod="openstack/barbican-api-64b59f8d4-2vxd9" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.660907 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-combined-ca-bundle\") pod 
\"barbican-api-64b59f8d4-2vxd9\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " pod="openstack/barbican-api-64b59f8d4-2vxd9" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.670170 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-public-tls-certs\") pod \"barbican-api-64b59f8d4-2vxd9\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " pod="openstack/barbican-api-64b59f8d4-2vxd9" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.685545 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-config-data\") pod \"barbican-api-64b59f8d4-2vxd9\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " pod="openstack/barbican-api-64b59f8d4-2vxd9" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.693836 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgm8j\" (UniqueName: \"kubernetes.io/projected/7795071e-2de0-43cb-b225-cfed54570d94-kube-api-access-mgm8j\") pod \"barbican-api-64b59f8d4-2vxd9\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " pod="openstack/barbican-api-64b59f8d4-2vxd9" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.743689 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.763004 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-64b59f8d4-2vxd9" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.852501 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ebe3f2a-5719-412c-8803-15e1bec74523-config-data\") pod \"1ebe3f2a-5719-412c-8803-15e1bec74523\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") " Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.852555 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ebe3f2a-5719-412c-8803-15e1bec74523-log-httpd\") pod \"1ebe3f2a-5719-412c-8803-15e1bec74523\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") " Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.852609 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnpcx\" (UniqueName: \"kubernetes.io/projected/1ebe3f2a-5719-412c-8803-15e1bec74523-kube-api-access-wnpcx\") pod \"1ebe3f2a-5719-412c-8803-15e1bec74523\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") " Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.852645 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ebe3f2a-5719-412c-8803-15e1bec74523-sg-core-conf-yaml\") pod \"1ebe3f2a-5719-412c-8803-15e1bec74523\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") " Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.852674 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ebe3f2a-5719-412c-8803-15e1bec74523-combined-ca-bundle\") pod \"1ebe3f2a-5719-412c-8803-15e1bec74523\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") " Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.852733 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/1ebe3f2a-5719-412c-8803-15e1bec74523-scripts\") pod \"1ebe3f2a-5719-412c-8803-15e1bec74523\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") " Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.852853 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ebe3f2a-5719-412c-8803-15e1bec74523-run-httpd\") pod \"1ebe3f2a-5719-412c-8803-15e1bec74523\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") " Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.853742 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ebe3f2a-5719-412c-8803-15e1bec74523-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1ebe3f2a-5719-412c-8803-15e1bec74523" (UID: "1ebe3f2a-5719-412c-8803-15e1bec74523"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.854520 4816 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ebe3f2a-5719-412c-8803-15e1bec74523-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.855193 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ebe3f2a-5719-412c-8803-15e1bec74523-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1ebe3f2a-5719-412c-8803-15e1bec74523" (UID: "1ebe3f2a-5719-412c-8803-15e1bec74523"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.865468 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ebe3f2a-5719-412c-8803-15e1bec74523-kube-api-access-wnpcx" (OuterVolumeSpecName: "kube-api-access-wnpcx") pod "1ebe3f2a-5719-412c-8803-15e1bec74523" (UID: "1ebe3f2a-5719-412c-8803-15e1bec74523"). InnerVolumeSpecName "kube-api-access-wnpcx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.865946 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ebe3f2a-5719-412c-8803-15e1bec74523-scripts" (OuterVolumeSpecName: "scripts") pod "1ebe3f2a-5719-412c-8803-15e1bec74523" (UID: "1ebe3f2a-5719-412c-8803-15e1bec74523"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.927984 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ebe3f2a-5719-412c-8803-15e1bec74523-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1ebe3f2a-5719-412c-8803-15e1bec74523" (UID: "1ebe3f2a-5719-412c-8803-15e1bec74523"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.956381 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnpcx\" (UniqueName: \"kubernetes.io/projected/1ebe3f2a-5719-412c-8803-15e1bec74523-kube-api-access-wnpcx\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.956429 4816 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ebe3f2a-5719-412c-8803-15e1bec74523-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.956441 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ebe3f2a-5719-412c-8803-15e1bec74523-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.956457 4816 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ebe3f2a-5719-412c-8803-15e1bec74523-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.960074 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ebe3f2a-5719-412c-8803-15e1bec74523-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1ebe3f2a-5719-412c-8803-15e1bec74523" (UID: "1ebe3f2a-5719-412c-8803-15e1bec74523"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.986417 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ebe3f2a-5719-412c-8803-15e1bec74523-config-data" (OuterVolumeSpecName: "config-data") pod "1ebe3f2a-5719-412c-8803-15e1bec74523" (UID: "1ebe3f2a-5719-412c-8803-15e1bec74523"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.032909 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerStarted","Data":"20e5352a1f18de3da65279dced0572d988bf4c64c45f769d6d0ae47f9c2cef9a"} Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.036025 4816 generic.go:334] "Generic (PLEG): container finished" podID="43eac2c3-bace-4682-b48e-f063d6653733" containerID="346170c8c6b811872540539f7b2570fc326b6427186b6e7d7e167645153015dd" exitCode=143 Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.036099 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"43eac2c3-bace-4682-b48e-f063d6653733","Type":"ContainerDied","Data":"346170c8c6b811872540539f7b2570fc326b6427186b6e7d7e167645153015dd"} Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.043623 4816 generic.go:334] "Generic (PLEG): container finished" podID="1ebe3f2a-5719-412c-8803-15e1bec74523" containerID="785459bf5361b538fca731d5c9459763253d45826f0befba834e333e7e6a0dde" exitCode=0 Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.044370 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ebe3f2a-5719-412c-8803-15e1bec74523","Type":"ContainerDied","Data":"785459bf5361b538fca731d5c9459763253d45826f0befba834e333e7e6a0dde"} Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.044421 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ebe3f2a-5719-412c-8803-15e1bec74523","Type":"ContainerDied","Data":"8ef644ece8e49d46c6b3d18fe5c5f96913f607e9b6a202c08e5f7ee442c27c93"} Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.044446 4816 scope.go:117] "RemoveContainer" containerID="56bec0a6969c2a35ea2359b8b9e2a0d4a80229380dac2fa1a5de2e56cab22e4a" Mar 11 12:19:11 crc 
kubenswrapper[4816]: I0311 12:19:11.044674 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.059081 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ebe3f2a-5719-412c-8803-15e1bec74523-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.059126 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ebe3f2a-5719-412c-8803-15e1bec74523-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.145491 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.158845 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.171066 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:19:11 crc kubenswrapper[4816]: E0311 12:19:11.171506 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ebe3f2a-5719-412c-8803-15e1bec74523" containerName="ceilometer-central-agent" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.171526 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ebe3f2a-5719-412c-8803-15e1bec74523" containerName="ceilometer-central-agent" Mar 11 12:19:11 crc kubenswrapper[4816]: E0311 12:19:11.171539 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ebe3f2a-5719-412c-8803-15e1bec74523" containerName="ceilometer-notification-agent" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.171546 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ebe3f2a-5719-412c-8803-15e1bec74523" containerName="ceilometer-notification-agent" Mar 11 12:19:11 crc 
kubenswrapper[4816]: E0311 12:19:11.171564 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ebe3f2a-5719-412c-8803-15e1bec74523" containerName="sg-core" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.171572 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ebe3f2a-5719-412c-8803-15e1bec74523" containerName="sg-core" Mar 11 12:19:11 crc kubenswrapper[4816]: E0311 12:19:11.171588 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ebe3f2a-5719-412c-8803-15e1bec74523" containerName="proxy-httpd" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.171594 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ebe3f2a-5719-412c-8803-15e1bec74523" containerName="proxy-httpd" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.171762 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ebe3f2a-5719-412c-8803-15e1bec74523" containerName="ceilometer-central-agent" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.171794 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ebe3f2a-5719-412c-8803-15e1bec74523" containerName="ceilometer-notification-agent" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.171812 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ebe3f2a-5719-412c-8803-15e1bec74523" containerName="sg-core" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.171823 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ebe3f2a-5719-412c-8803-15e1bec74523" containerName="proxy-httpd" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.173452 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.176375 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.177270 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.188672 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.265785 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/10e3f184-9109-4af7-8ca6-822379e0c513-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") " pod="openstack/ceilometer-0" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.265893 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsbml\" (UniqueName: \"kubernetes.io/projected/10e3f184-9109-4af7-8ca6-822379e0c513-kube-api-access-hsbml\") pod \"ceilometer-0\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") " pod="openstack/ceilometer-0" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.265940 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10e3f184-9109-4af7-8ca6-822379e0c513-config-data\") pod \"ceilometer-0\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") " pod="openstack/ceilometer-0" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.265983 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10e3f184-9109-4af7-8ca6-822379e0c513-scripts\") pod \"ceilometer-0\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") " 
pod="openstack/ceilometer-0" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.266011 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10e3f184-9109-4af7-8ca6-822379e0c513-log-httpd\") pod \"ceilometer-0\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") " pod="openstack/ceilometer-0" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.266066 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10e3f184-9109-4af7-8ca6-822379e0c513-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") " pod="openstack/ceilometer-0" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.266229 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10e3f184-9109-4af7-8ca6-822379e0c513-run-httpd\") pod \"ceilometer-0\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") " pod="openstack/ceilometer-0" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.367737 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/10e3f184-9109-4af7-8ca6-822379e0c513-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") " pod="openstack/ceilometer-0" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.367804 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsbml\" (UniqueName: \"kubernetes.io/projected/10e3f184-9109-4af7-8ca6-822379e0c513-kube-api-access-hsbml\") pod \"ceilometer-0\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") " pod="openstack/ceilometer-0" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.367828 4816 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10e3f184-9109-4af7-8ca6-822379e0c513-config-data\") pod \"ceilometer-0\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") " pod="openstack/ceilometer-0" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.367852 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10e3f184-9109-4af7-8ca6-822379e0c513-scripts\") pod \"ceilometer-0\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") " pod="openstack/ceilometer-0" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.367871 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10e3f184-9109-4af7-8ca6-822379e0c513-log-httpd\") pod \"ceilometer-0\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") " pod="openstack/ceilometer-0" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.367903 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10e3f184-9109-4af7-8ca6-822379e0c513-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") " pod="openstack/ceilometer-0" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.367986 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10e3f184-9109-4af7-8ca6-822379e0c513-run-httpd\") pod \"ceilometer-0\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") " pod="openstack/ceilometer-0" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.368535 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10e3f184-9109-4af7-8ca6-822379e0c513-run-httpd\") pod \"ceilometer-0\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") " pod="openstack/ceilometer-0" Mar 11 12:19:11 crc kubenswrapper[4816]: 
I0311 12:19:11.369260 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10e3f184-9109-4af7-8ca6-822379e0c513-log-httpd\") pod \"ceilometer-0\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") " pod="openstack/ceilometer-0" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.375714 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10e3f184-9109-4af7-8ca6-822379e0c513-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") " pod="openstack/ceilometer-0" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.376788 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10e3f184-9109-4af7-8ca6-822379e0c513-config-data\") pod \"ceilometer-0\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") " pod="openstack/ceilometer-0" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.377049 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/10e3f184-9109-4af7-8ca6-822379e0c513-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") " pod="openstack/ceilometer-0" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.377869 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10e3f184-9109-4af7-8ca6-822379e0c513-scripts\") pod \"ceilometer-0\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") " pod="openstack/ceilometer-0" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.404963 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsbml\" (UniqueName: \"kubernetes.io/projected/10e3f184-9109-4af7-8ca6-822379e0c513-kube-api-access-hsbml\") pod \"ceilometer-0\" (UID: 
\"10e3f184-9109-4af7-8ca6-822379e0c513\") " pod="openstack/ceilometer-0" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.516798 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.895117 4816 scope.go:117] "RemoveContainer" containerID="fd9f9269933cdf626acdae81b166112cef4742071274667ac737f1fb43d6eaa0" Mar 11 12:19:12 crc kubenswrapper[4816]: I0311 12:19:12.025529 4816 scope.go:117] "RemoveContainer" containerID="785459bf5361b538fca731d5c9459763253d45826f0befba834e333e7e6a0dde" Mar 11 12:19:12 crc kubenswrapper[4816]: I0311 12:19:12.140843 4816 scope.go:117] "RemoveContainer" containerID="2e2e079065db719aeee528343ea5a717b2f18beff4bcac65acd806fa3d456edf" Mar 11 12:19:12 crc kubenswrapper[4816]: I0311 12:19:12.187163 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ebe3f2a-5719-412c-8803-15e1bec74523" path="/var/lib/kubelet/pods/1ebe3f2a-5719-412c-8803-15e1bec74523/volumes" Mar 11 12:19:12 crc kubenswrapper[4816]: I0311 12:19:12.187915 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a8f76a92-4234-474b-bca2-f5d9cbbec8f2","Type":"ContainerStarted","Data":"e6365a9b05307ab0d08f2be43b5fd09d02f68def8b266305137ad18cede98778"} Mar 11 12:19:12 crc kubenswrapper[4816]: I0311 12:19:12.234496 4816 scope.go:117] "RemoveContainer" containerID="56bec0a6969c2a35ea2359b8b9e2a0d4a80229380dac2fa1a5de2e56cab22e4a" Mar 11 12:19:12 crc kubenswrapper[4816]: E0311 12:19:12.242755 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56bec0a6969c2a35ea2359b8b9e2a0d4a80229380dac2fa1a5de2e56cab22e4a\": container with ID starting with 56bec0a6969c2a35ea2359b8b9e2a0d4a80229380dac2fa1a5de2e56cab22e4a not found: ID does not exist" containerID="56bec0a6969c2a35ea2359b8b9e2a0d4a80229380dac2fa1a5de2e56cab22e4a" Mar 11 12:19:12 crc 
kubenswrapper[4816]: I0311 12:19:12.242813 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56bec0a6969c2a35ea2359b8b9e2a0d4a80229380dac2fa1a5de2e56cab22e4a"} err="failed to get container status \"56bec0a6969c2a35ea2359b8b9e2a0d4a80229380dac2fa1a5de2e56cab22e4a\": rpc error: code = NotFound desc = could not find container \"56bec0a6969c2a35ea2359b8b9e2a0d4a80229380dac2fa1a5de2e56cab22e4a\": container with ID starting with 56bec0a6969c2a35ea2359b8b9e2a0d4a80229380dac2fa1a5de2e56cab22e4a not found: ID does not exist" Mar 11 12:19:12 crc kubenswrapper[4816]: I0311 12:19:12.242886 4816 scope.go:117] "RemoveContainer" containerID="fd9f9269933cdf626acdae81b166112cef4742071274667ac737f1fb43d6eaa0" Mar 11 12:19:12 crc kubenswrapper[4816]: E0311 12:19:12.243183 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd9f9269933cdf626acdae81b166112cef4742071274667ac737f1fb43d6eaa0\": container with ID starting with fd9f9269933cdf626acdae81b166112cef4742071274667ac737f1fb43d6eaa0 not found: ID does not exist" containerID="fd9f9269933cdf626acdae81b166112cef4742071274667ac737f1fb43d6eaa0" Mar 11 12:19:12 crc kubenswrapper[4816]: I0311 12:19:12.243205 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd9f9269933cdf626acdae81b166112cef4742071274667ac737f1fb43d6eaa0"} err="failed to get container status \"fd9f9269933cdf626acdae81b166112cef4742071274667ac737f1fb43d6eaa0\": rpc error: code = NotFound desc = could not find container \"fd9f9269933cdf626acdae81b166112cef4742071274667ac737f1fb43d6eaa0\": container with ID starting with fd9f9269933cdf626acdae81b166112cef4742071274667ac737f1fb43d6eaa0 not found: ID does not exist" Mar 11 12:19:12 crc kubenswrapper[4816]: I0311 12:19:12.243225 4816 scope.go:117] "RemoveContainer" containerID="785459bf5361b538fca731d5c9459763253d45826f0befba834e333e7e6a0dde" Mar 11 
12:19:12 crc kubenswrapper[4816]: E0311 12:19:12.243422 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"785459bf5361b538fca731d5c9459763253d45826f0befba834e333e7e6a0dde\": container with ID starting with 785459bf5361b538fca731d5c9459763253d45826f0befba834e333e7e6a0dde not found: ID does not exist" containerID="785459bf5361b538fca731d5c9459763253d45826f0befba834e333e7e6a0dde" Mar 11 12:19:12 crc kubenswrapper[4816]: I0311 12:19:12.243474 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"785459bf5361b538fca731d5c9459763253d45826f0befba834e333e7e6a0dde"} err="failed to get container status \"785459bf5361b538fca731d5c9459763253d45826f0befba834e333e7e6a0dde\": rpc error: code = NotFound desc = could not find container \"785459bf5361b538fca731d5c9459763253d45826f0befba834e333e7e6a0dde\": container with ID starting with 785459bf5361b538fca731d5c9459763253d45826f0befba834e333e7e6a0dde not found: ID does not exist" Mar 11 12:19:12 crc kubenswrapper[4816]: I0311 12:19:12.243499 4816 scope.go:117] "RemoveContainer" containerID="2e2e079065db719aeee528343ea5a717b2f18beff4bcac65acd806fa3d456edf" Mar 11 12:19:12 crc kubenswrapper[4816]: E0311 12:19:12.244822 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e2e079065db719aeee528343ea5a717b2f18beff4bcac65acd806fa3d456edf\": container with ID starting with 2e2e079065db719aeee528343ea5a717b2f18beff4bcac65acd806fa3d456edf not found: ID does not exist" containerID="2e2e079065db719aeee528343ea5a717b2f18beff4bcac65acd806fa3d456edf" Mar 11 12:19:12 crc kubenswrapper[4816]: I0311 12:19:12.244841 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e2e079065db719aeee528343ea5a717b2f18beff4bcac65acd806fa3d456edf"} err="failed to get container status 
\"2e2e079065db719aeee528343ea5a717b2f18beff4bcac65acd806fa3d456edf\": rpc error: code = NotFound desc = could not find container \"2e2e079065db719aeee528343ea5a717b2f18beff4bcac65acd806fa3d456edf\": container with ID starting with 2e2e079065db719aeee528343ea5a717b2f18beff4bcac65acd806fa3d456edf not found: ID does not exist" Mar 11 12:19:12 crc kubenswrapper[4816]: I0311 12:19:12.473692 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.760421674 podStartE2EDuration="8.473666712s" podCreationTimestamp="2026-03-11 12:19:04 +0000 UTC" firstStartedPulling="2026-03-11 12:19:06.241188346 +0000 UTC m=+1232.832452313" lastFinishedPulling="2026-03-11 12:19:08.954433384 +0000 UTC m=+1235.545697351" observedRunningTime="2026-03-11 12:19:12.197001767 +0000 UTC m=+1238.788265734" watchObservedRunningTime="2026-03-11 12:19:12.473666712 +0000 UTC m=+1239.064930679" Mar 11 12:19:12 crc kubenswrapper[4816]: I0311 12:19:12.474580 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-64b59f8d4-2vxd9"] Mar 11 12:19:12 crc kubenswrapper[4816]: I0311 12:19:12.636834 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:19:12 crc kubenswrapper[4816]: W0311 12:19:12.648760 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10e3f184_9109_4af7_8ca6_822379e0c513.slice/crio-6d281821384131e14e507eeaf976f8558feb01527e87cd3779946b65388e3bc7 WatchSource:0}: Error finding container 6d281821384131e14e507eeaf976f8558feb01527e87cd3779946b65388e3bc7: Status 404 returned error can't find the container with id 6d281821384131e14e507eeaf976f8558feb01527e87cd3779946b65388e3bc7 Mar 11 12:19:12 crc kubenswrapper[4816]: I0311 12:19:12.973569 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:19:13 crc 
kubenswrapper[4816]: I0311 12:19:13.169143 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-64b59f8d4-2vxd9" event={"ID":"7795071e-2de0-43cb-b225-cfed54570d94","Type":"ContainerStarted","Data":"5e19f1840cfd8f7623e64404579f814579ee6602ca765f964613a90342b26cc2"} Mar 11 12:19:13 crc kubenswrapper[4816]: I0311 12:19:13.169197 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-64b59f8d4-2vxd9" event={"ID":"7795071e-2de0-43cb-b225-cfed54570d94","Type":"ContainerStarted","Data":"8dc2306ac32e5d795143d562064f5d8e129c4815490ca1bada6d8509ddcc5240"} Mar 11 12:19:13 crc kubenswrapper[4816]: I0311 12:19:13.169212 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-64b59f8d4-2vxd9" event={"ID":"7795071e-2de0-43cb-b225-cfed54570d94","Type":"ContainerStarted","Data":"9de7e47c0f14568909f59552b05e938af6254c4c9840ec07004683a8c3fa16e2"} Mar 11 12:19:13 crc kubenswrapper[4816]: I0311 12:19:13.169277 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-64b59f8d4-2vxd9" Mar 11 12:19:13 crc kubenswrapper[4816]: I0311 12:19:13.169310 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-64b59f8d4-2vxd9" Mar 11 12:19:13 crc kubenswrapper[4816]: I0311 12:19:13.175456 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"10e3f184-9109-4af7-8ca6-822379e0c513","Type":"ContainerStarted","Data":"6d281821384131e14e507eeaf976f8558feb01527e87cd3779946b65388e3bc7"} Mar 11 12:19:13 crc kubenswrapper[4816]: I0311 12:19:13.202157 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-64b59f8d4-2vxd9" podStartSLOduration=3.202115717 podStartE2EDuration="3.202115717s" podCreationTimestamp="2026-03-11 12:19:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-11 12:19:13.200325836 +0000 UTC m=+1239.791589833" watchObservedRunningTime="2026-03-11 12:19:13.202115717 +0000 UTC m=+1239.793379684" Mar 11 12:19:14 crc kubenswrapper[4816]: I0311 12:19:14.195849 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"10e3f184-9109-4af7-8ca6-822379e0c513","Type":"ContainerStarted","Data":"5ea55c5fdec26a804e311808a0dab722dc704515cc19343dfae8f51e1980dcdf"} Mar 11 12:19:14 crc kubenswrapper[4816]: I0311 12:19:14.197489 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"10e3f184-9109-4af7-8ca6-822379e0c513","Type":"ContainerStarted","Data":"64d1dc2db1be47dc15a33d606bf556173a42132151a3b69e28ce73757040e831"} Mar 11 12:19:15 crc kubenswrapper[4816]: I0311 12:19:15.208498 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"10e3f184-9109-4af7-8ca6-822379e0c513","Type":"ContainerStarted","Data":"892ed54ab6b1b8e78f2c10457a1ac792f459dfcc72db435ed64164634c50c4f4"} Mar 11 12:19:15 crc kubenswrapper[4816]: I0311 12:19:15.382627 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 11 12:19:15 crc kubenswrapper[4816]: I0311 12:19:15.617449 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" Mar 11 12:19:15 crc kubenswrapper[4816]: I0311 12:19:15.729115 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d67d65cb9-2w84f"] Mar 11 12:19:15 crc kubenswrapper[4816]: I0311 12:19:15.729372 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f" podUID="b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c" containerName="dnsmasq-dns" containerID="cri-o://ec1bad5250db6f7f6e52f756f22dc65681cf048e7705d228c0b2aebf5f68f5e9" gracePeriod=10 Mar 11 12:19:16 crc kubenswrapper[4816]: I0311 
12:19:16.092757 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 11 12:19:16 crc kubenswrapper[4816]: I0311 12:19:16.243498 4816 generic.go:334] "Generic (PLEG): container finished" podID="b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c" containerID="ec1bad5250db6f7f6e52f756f22dc65681cf048e7705d228c0b2aebf5f68f5e9" exitCode=0 Mar 11 12:19:16 crc kubenswrapper[4816]: I0311 12:19:16.243563 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f" event={"ID":"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c","Type":"ContainerDied","Data":"ec1bad5250db6f7f6e52f756f22dc65681cf048e7705d228c0b2aebf5f68f5e9"} Mar 11 12:19:16 crc kubenswrapper[4816]: I0311 12:19:16.307792 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 11 12:19:16 crc kubenswrapper[4816]: I0311 12:19:16.563352 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f" Mar 11 12:19:16 crc kubenswrapper[4816]: I0311 12:19:16.708385 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-dns-swift-storage-0\") pod \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\" (UID: \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\") " Mar 11 12:19:16 crc kubenswrapper[4816]: I0311 12:19:16.708468 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-config\") pod \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\" (UID: \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\") " Mar 11 12:19:16 crc kubenswrapper[4816]: I0311 12:19:16.708549 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-dns-svc\") pod 
\"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\" (UID: \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\") " Mar 11 12:19:16 crc kubenswrapper[4816]: I0311 12:19:16.708644 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqnqd\" (UniqueName: \"kubernetes.io/projected/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-kube-api-access-mqnqd\") pod \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\" (UID: \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\") " Mar 11 12:19:16 crc kubenswrapper[4816]: I0311 12:19:16.708682 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-ovsdbserver-nb\") pod \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\" (UID: \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\") " Mar 11 12:19:16 crc kubenswrapper[4816]: I0311 12:19:16.708740 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-ovsdbserver-sb\") pod \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\" (UID: \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\") " Mar 11 12:19:16 crc kubenswrapper[4816]: I0311 12:19:16.752707 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-kube-api-access-mqnqd" (OuterVolumeSpecName: "kube-api-access-mqnqd") pod "b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c" (UID: "b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c"). InnerVolumeSpecName "kube-api-access-mqnqd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:19:16 crc kubenswrapper[4816]: I0311 12:19:16.813173 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqnqd\" (UniqueName: \"kubernetes.io/projected/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-kube-api-access-mqnqd\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:16 crc kubenswrapper[4816]: I0311 12:19:16.824146 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c" (UID: "b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:19:16 crc kubenswrapper[4816]: I0311 12:19:16.835857 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-config" (OuterVolumeSpecName: "config") pod "b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c" (UID: "b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:19:16 crc kubenswrapper[4816]: I0311 12:19:16.851988 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c" (UID: "b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:19:16 crc kubenswrapper[4816]: I0311 12:19:16.883813 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c" (UID: "b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:19:16 crc kubenswrapper[4816]: I0311 12:19:16.896723 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c" (UID: "b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:19:16 crc kubenswrapper[4816]: I0311 12:19:16.907312 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5d4754df76-xnl78" Mar 11 12:19:16 crc kubenswrapper[4816]: I0311 12:19:16.914726 4816 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:16 crc kubenswrapper[4816]: I0311 12:19:16.914754 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:16 crc kubenswrapper[4816]: I0311 12:19:16.914764 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:16 crc kubenswrapper[4816]: I0311 12:19:16.914775 4816 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:16 crc kubenswrapper[4816]: I0311 12:19:16.914786 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-config\") 
on node \"crc\" DevicePath \"\"" Mar 11 12:19:17 crc kubenswrapper[4816]: I0311 12:19:17.067588 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5d4754df76-xnl78" Mar 11 12:19:17 crc kubenswrapper[4816]: I0311 12:19:17.255143 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"10e3f184-9109-4af7-8ca6-822379e0c513","Type":"ContainerStarted","Data":"b0f2ba98772ce0d4c1de918f6b5eca0c46d92b8201207a117fc19c82c71e70f3"} Mar 11 12:19:17 crc kubenswrapper[4816]: I0311 12:19:17.256334 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 11 12:19:17 crc kubenswrapper[4816]: I0311 12:19:17.265487 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f" Mar 11 12:19:17 crc kubenswrapper[4816]: I0311 12:19:17.265895 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f" event={"ID":"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c","Type":"ContainerDied","Data":"88fbfecb363955a5809ac97d8f060b762763c43823be00b756985a39bffcbe7e"} Mar 11 12:19:17 crc kubenswrapper[4816]: I0311 12:19:17.265937 4816 scope.go:117] "RemoveContainer" containerID="ec1bad5250db6f7f6e52f756f22dc65681cf048e7705d228c0b2aebf5f68f5e9" Mar 11 12:19:17 crc kubenswrapper[4816]: I0311 12:19:17.266154 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="a8f76a92-4234-474b-bca2-f5d9cbbec8f2" containerName="cinder-scheduler" containerID="cri-o://cd4f48e197b603f62e0bf20ff6682b81e7b6286b2b2f453bd8e109136a0f4621" gracePeriod=30 Mar 11 12:19:17 crc kubenswrapper[4816]: I0311 12:19:17.266419 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="a8f76a92-4234-474b-bca2-f5d9cbbec8f2" containerName="probe" 
containerID="cri-o://e6365a9b05307ab0d08f2be43b5fd09d02f68def8b266305137ad18cede98778" gracePeriod=30 Mar 11 12:19:17 crc kubenswrapper[4816]: I0311 12:19:17.310635 4816 scope.go:117] "RemoveContainer" containerID="f365943c3bcd25f6e7decbae194b1841bae20d3a5ca1816848e3f40bb39b1c41" Mar 11 12:19:17 crc kubenswrapper[4816]: I0311 12:19:17.320087 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.215760558 podStartE2EDuration="6.320061203s" podCreationTimestamp="2026-03-11 12:19:11 +0000 UTC" firstStartedPulling="2026-03-11 12:19:12.657058893 +0000 UTC m=+1239.248322860" lastFinishedPulling="2026-03-11 12:19:16.761359538 +0000 UTC m=+1243.352623505" observedRunningTime="2026-03-11 12:19:17.300606777 +0000 UTC m=+1243.891870744" watchObservedRunningTime="2026-03-11 12:19:17.320061203 +0000 UTC m=+1243.911325170" Mar 11 12:19:17 crc kubenswrapper[4816]: I0311 12:19:17.341153 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d67d65cb9-2w84f"] Mar 11 12:19:17 crc kubenswrapper[4816]: I0311 12:19:17.372197 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d67d65cb9-2w84f"] Mar 11 12:19:17 crc kubenswrapper[4816]: I0311 12:19:17.661593 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-9df8757bb-rzb52" Mar 11 12:19:17 crc kubenswrapper[4816]: I0311 12:19:17.933976 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-64584d7649-mb6k8"] Mar 11 12:19:17 crc kubenswrapper[4816]: I0311 12:19:17.934421 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-64584d7649-mb6k8" podUID="bd930e1b-a508-4a64-8825-9800b8010d59" containerName="neutron-api" containerID="cri-o://06ebd4a2da9305c5f9303396efc2a80f0ef4ae2462b9e8b47545883c85f3c658" gracePeriod=30 Mar 11 12:19:17 crc kubenswrapper[4816]: I0311 12:19:17.934496 4816 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-64584d7649-mb6k8" podUID="bd930e1b-a508-4a64-8825-9800b8010d59" containerName="neutron-httpd" containerID="cri-o://ed06a5d04ea24da7b7022266f3b93adfbbc7a80293e5752545ee9f6add12458d" gracePeriod=30 Mar 11 12:19:17 crc kubenswrapper[4816]: I0311 12:19:17.947493 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-64584d7649-mb6k8" podUID="bd930e1b-a508-4a64-8825-9800b8010d59" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.156:9696/\": EOF" Mar 11 12:19:17 crc kubenswrapper[4816]: I0311 12:19:17.992963 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6867c6dbc5-lzgfd"] Mar 11 12:19:17 crc kubenswrapper[4816]: E0311 12:19:17.993834 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c" containerName="init" Mar 11 12:19:17 crc kubenswrapper[4816]: I0311 12:19:17.993859 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c" containerName="init" Mar 11 12:19:17 crc kubenswrapper[4816]: E0311 12:19:17.993880 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c" containerName="dnsmasq-dns" Mar 11 12:19:17 crc kubenswrapper[4816]: I0311 12:19:17.993890 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c" containerName="dnsmasq-dns" Mar 11 12:19:17 crc kubenswrapper[4816]: I0311 12:19:17.994769 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c" containerName="dnsmasq-dns" Mar 11 12:19:17 crc kubenswrapper[4816]: I0311 12:19:17.996476 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6867c6dbc5-lzgfd" Mar 11 12:19:18 crc kubenswrapper[4816]: I0311 12:19:18.017003 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6867c6dbc5-lzgfd"] Mar 11 12:19:18 crc kubenswrapper[4816]: I0311 12:19:18.141069 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c" path="/var/lib/kubelet/pods/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c/volumes" Mar 11 12:19:18 crc kubenswrapper[4816]: I0311 12:19:18.144548 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-public-tls-certs\") pod \"neutron-6867c6dbc5-lzgfd\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " pod="openstack/neutron-6867c6dbc5-lzgfd" Mar 11 12:19:18 crc kubenswrapper[4816]: I0311 12:19:18.144625 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-httpd-config\") pod \"neutron-6867c6dbc5-lzgfd\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " pod="openstack/neutron-6867c6dbc5-lzgfd" Mar 11 12:19:18 crc kubenswrapper[4816]: I0311 12:19:18.144718 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7fr9\" (UniqueName: \"kubernetes.io/projected/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-kube-api-access-x7fr9\") pod \"neutron-6867c6dbc5-lzgfd\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " pod="openstack/neutron-6867c6dbc5-lzgfd" Mar 11 12:19:18 crc kubenswrapper[4816]: I0311 12:19:18.144748 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-combined-ca-bundle\") pod \"neutron-6867c6dbc5-lzgfd\" 
(UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " pod="openstack/neutron-6867c6dbc5-lzgfd" Mar 11 12:19:18 crc kubenswrapper[4816]: I0311 12:19:18.144772 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-config\") pod \"neutron-6867c6dbc5-lzgfd\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " pod="openstack/neutron-6867c6dbc5-lzgfd" Mar 11 12:19:18 crc kubenswrapper[4816]: I0311 12:19:18.144799 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-ovndb-tls-certs\") pod \"neutron-6867c6dbc5-lzgfd\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " pod="openstack/neutron-6867c6dbc5-lzgfd" Mar 11 12:19:18 crc kubenswrapper[4816]: I0311 12:19:18.144830 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-internal-tls-certs\") pod \"neutron-6867c6dbc5-lzgfd\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " pod="openstack/neutron-6867c6dbc5-lzgfd" Mar 11 12:19:18 crc kubenswrapper[4816]: I0311 12:19:18.246885 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7fr9\" (UniqueName: \"kubernetes.io/projected/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-kube-api-access-x7fr9\") pod \"neutron-6867c6dbc5-lzgfd\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " pod="openstack/neutron-6867c6dbc5-lzgfd" Mar 11 12:19:18 crc kubenswrapper[4816]: I0311 12:19:18.246951 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-combined-ca-bundle\") pod \"neutron-6867c6dbc5-lzgfd\" (UID: 
\"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " pod="openstack/neutron-6867c6dbc5-lzgfd" Mar 11 12:19:18 crc kubenswrapper[4816]: I0311 12:19:18.247012 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-config\") pod \"neutron-6867c6dbc5-lzgfd\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " pod="openstack/neutron-6867c6dbc5-lzgfd" Mar 11 12:19:18 crc kubenswrapper[4816]: I0311 12:19:18.247047 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-ovndb-tls-certs\") pod \"neutron-6867c6dbc5-lzgfd\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " pod="openstack/neutron-6867c6dbc5-lzgfd" Mar 11 12:19:18 crc kubenswrapper[4816]: I0311 12:19:18.247295 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-internal-tls-certs\") pod \"neutron-6867c6dbc5-lzgfd\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " pod="openstack/neutron-6867c6dbc5-lzgfd" Mar 11 12:19:18 crc kubenswrapper[4816]: I0311 12:19:18.247410 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-public-tls-certs\") pod \"neutron-6867c6dbc5-lzgfd\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " pod="openstack/neutron-6867c6dbc5-lzgfd" Mar 11 12:19:18 crc kubenswrapper[4816]: I0311 12:19:18.247450 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-httpd-config\") pod \"neutron-6867c6dbc5-lzgfd\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " pod="openstack/neutron-6867c6dbc5-lzgfd" Mar 11 12:19:18 crc 
kubenswrapper[4816]: I0311 12:19:18.257377 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-config\") pod \"neutron-6867c6dbc5-lzgfd\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " pod="openstack/neutron-6867c6dbc5-lzgfd" Mar 11 12:19:18 crc kubenswrapper[4816]: I0311 12:19:18.257494 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-internal-tls-certs\") pod \"neutron-6867c6dbc5-lzgfd\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " pod="openstack/neutron-6867c6dbc5-lzgfd" Mar 11 12:19:18 crc kubenswrapper[4816]: I0311 12:19:18.257596 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-combined-ca-bundle\") pod \"neutron-6867c6dbc5-lzgfd\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " pod="openstack/neutron-6867c6dbc5-lzgfd" Mar 11 12:19:18 crc kubenswrapper[4816]: I0311 12:19:18.259181 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-ovndb-tls-certs\") pod \"neutron-6867c6dbc5-lzgfd\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " pod="openstack/neutron-6867c6dbc5-lzgfd" Mar 11 12:19:18 crc kubenswrapper[4816]: I0311 12:19:18.259299 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-public-tls-certs\") pod \"neutron-6867c6dbc5-lzgfd\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " pod="openstack/neutron-6867c6dbc5-lzgfd" Mar 11 12:19:18 crc kubenswrapper[4816]: I0311 12:19:18.261416 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" 
(UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-httpd-config\") pod \"neutron-6867c6dbc5-lzgfd\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " pod="openstack/neutron-6867c6dbc5-lzgfd" Mar 11 12:19:18 crc kubenswrapper[4816]: I0311 12:19:18.271214 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7fr9\" (UniqueName: \"kubernetes.io/projected/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-kube-api-access-x7fr9\") pod \"neutron-6867c6dbc5-lzgfd\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " pod="openstack/neutron-6867c6dbc5-lzgfd" Mar 11 12:19:18 crc kubenswrapper[4816]: I0311 12:19:18.324476 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6867c6dbc5-lzgfd" Mar 11 12:19:18 crc kubenswrapper[4816]: I0311 12:19:18.710157 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.100929 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6867c6dbc5-lzgfd"] Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.303693 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6867c6dbc5-lzgfd" event={"ID":"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46","Type":"ContainerStarted","Data":"10129169327e9c40582f9c635a8d87b021f99cc78ac017f7e4f16f40942456bc"} Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.306724 4816 generic.go:334] "Generic (PLEG): container finished" podID="a8f76a92-4234-474b-bca2-f5d9cbbec8f2" containerID="e6365a9b05307ab0d08f2be43b5fd09d02f68def8b266305137ad18cede98778" exitCode=0 Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.306783 4816 generic.go:334] "Generic (PLEG): container finished" podID="a8f76a92-4234-474b-bca2-f5d9cbbec8f2" containerID="cd4f48e197b603f62e0bf20ff6682b81e7b6286b2b2f453bd8e109136a0f4621" exitCode=0 Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.306832 4816 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a8f76a92-4234-474b-bca2-f5d9cbbec8f2","Type":"ContainerDied","Data":"e6365a9b05307ab0d08f2be43b5fd09d02f68def8b266305137ad18cede98778"} Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.306862 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a8f76a92-4234-474b-bca2-f5d9cbbec8f2","Type":"ContainerDied","Data":"cd4f48e197b603f62e0bf20ff6682b81e7b6286b2b2f453bd8e109136a0f4621"} Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.309873 4816 generic.go:334] "Generic (PLEG): container finished" podID="bd930e1b-a508-4a64-8825-9800b8010d59" containerID="ed06a5d04ea24da7b7022266f3b93adfbbc7a80293e5752545ee9f6add12458d" exitCode=0 Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.309931 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-64584d7649-mb6k8" event={"ID":"bd930e1b-a508-4a64-8825-9800b8010d59","Type":"ContainerDied","Data":"ed06a5d04ea24da7b7022266f3b93adfbbc7a80293e5752545ee9f6add12458d"} Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.759373 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.785290 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z98gf\" (UniqueName: \"kubernetes.io/projected/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-kube-api-access-z98gf\") pod \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\" (UID: \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\") " Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.785357 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-combined-ca-bundle\") pod \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\" (UID: \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\") " Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.785385 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-etc-machine-id\") pod \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\" (UID: \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\") " Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.785431 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-config-data\") pod \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\" (UID: \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\") " Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.785460 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-config-data-custom\") pod \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\" (UID: \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\") " Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.785488 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-scripts\") pod \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\" (UID: \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\") " Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.791234 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-scripts" (OuterVolumeSpecName: "scripts") pod "a8f76a92-4234-474b-bca2-f5d9cbbec8f2" (UID: "a8f76a92-4234-474b-bca2-f5d9cbbec8f2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.791778 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a8f76a92-4234-474b-bca2-f5d9cbbec8f2" (UID: "a8f76a92-4234-474b-bca2-f5d9cbbec8f2"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.795831 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-kube-api-access-z98gf" (OuterVolumeSpecName: "kube-api-access-z98gf") pod "a8f76a92-4234-474b-bca2-f5d9cbbec8f2" (UID: "a8f76a92-4234-474b-bca2-f5d9cbbec8f2"). InnerVolumeSpecName "kube-api-access-z98gf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.802299 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a8f76a92-4234-474b-bca2-f5d9cbbec8f2" (UID: "a8f76a92-4234-474b-bca2-f5d9cbbec8f2"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.878558 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a8f76a92-4234-474b-bca2-f5d9cbbec8f2" (UID: "a8f76a92-4234-474b-bca2-f5d9cbbec8f2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.887402 4816 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.887452 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.887463 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z98gf\" (UniqueName: \"kubernetes.io/projected/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-kube-api-access-z98gf\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.887476 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.887485 4816 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.924405 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-config-data" (OuterVolumeSpecName: "config-data") pod "a8f76a92-4234-474b-bca2-f5d9cbbec8f2" (UID: "a8f76a92-4234-474b-bca2-f5d9cbbec8f2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.989049 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.327724 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6867c6dbc5-lzgfd" event={"ID":"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46","Type":"ContainerStarted","Data":"d70e65be881ec74becf9f1d8a8c457e2fd9c5cbaed1d9869af0f09ff05b4fe7d"} Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.327850 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6867c6dbc5-lzgfd" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.327871 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6867c6dbc5-lzgfd" event={"ID":"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46","Type":"ContainerStarted","Data":"fc6e871db4cf3ccf1c16a2df0831b957437d80b5ab1f40dfb74553759defd035"} Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.342550 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a8f76a92-4234-474b-bca2-f5d9cbbec8f2","Type":"ContainerDied","Data":"965649e801e9747f00b8263a5feeac3e050a872f0198fc9e14a40e62b55571cf"} Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.342646 4816 scope.go:117] "RemoveContainer" containerID="e6365a9b05307ab0d08f2be43b5fd09d02f68def8b266305137ad18cede98778" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.342648 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.363557 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6867c6dbc5-lzgfd" podStartSLOduration=3.363537587 podStartE2EDuration="3.363537587s" podCreationTimestamp="2026-03-11 12:19:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:19:20.358772151 +0000 UTC m=+1246.950036128" watchObservedRunningTime="2026-03-11 12:19:20.363537587 +0000 UTC m=+1246.954801544" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.364943 4816 generic.go:334] "Generic (PLEG): container finished" podID="bd930e1b-a508-4a64-8825-9800b8010d59" containerID="06ebd4a2da9305c5f9303396efc2a80f0ef4ae2462b9e8b47545883c85f3c658" exitCode=0 Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.365038 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-64584d7649-mb6k8" event={"ID":"bd930e1b-a508-4a64-8825-9800b8010d59","Type":"ContainerDied","Data":"06ebd4a2da9305c5f9303396efc2a80f0ef4ae2462b9e8b47545883c85f3c658"} Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.418719 4816 scope.go:117] "RemoveContainer" containerID="cd4f48e197b603f62e0bf20ff6682b81e7b6286b2b2f453bd8e109136a0f4621" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.431939 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.439472 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.466712 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 11 12:19:20 crc kubenswrapper[4816]: E0311 12:19:20.467844 4816 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a8f76a92-4234-474b-bca2-f5d9cbbec8f2" containerName="cinder-scheduler" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.467871 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8f76a92-4234-474b-bca2-f5d9cbbec8f2" containerName="cinder-scheduler" Mar 11 12:19:20 crc kubenswrapper[4816]: E0311 12:19:20.467910 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8f76a92-4234-474b-bca2-f5d9cbbec8f2" containerName="probe" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.468419 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8f76a92-4234-474b-bca2-f5d9cbbec8f2" containerName="probe" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.469143 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8f76a92-4234-474b-bca2-f5d9cbbec8f2" containerName="probe" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.469203 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8f76a92-4234-474b-bca2-f5d9cbbec8f2" containerName="cinder-scheduler" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.473032 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.480691 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.491265 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.610238 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/594ad696-b727-4153-979f-d32ccdc1fe83-config-data\") pod \"cinder-scheduler-0\" (UID: \"594ad696-b727-4153-979f-d32ccdc1fe83\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.610303 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/594ad696-b727-4153-979f-d32ccdc1fe83-scripts\") pod \"cinder-scheduler-0\" (UID: \"594ad696-b727-4153-979f-d32ccdc1fe83\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.610384 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/594ad696-b727-4153-979f-d32ccdc1fe83-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"594ad696-b727-4153-979f-d32ccdc1fe83\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.610404 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594ad696-b727-4153-979f-d32ccdc1fe83-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"594ad696-b727-4153-979f-d32ccdc1fe83\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.610436 4816 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmj7z\" (UniqueName: \"kubernetes.io/projected/594ad696-b727-4153-979f-d32ccdc1fe83-kube-api-access-bmj7z\") pod \"cinder-scheduler-0\" (UID: \"594ad696-b727-4153-979f-d32ccdc1fe83\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.610474 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/594ad696-b727-4153-979f-d32ccdc1fe83-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"594ad696-b727-4153-979f-d32ccdc1fe83\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.628364 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-64584d7649-mb6k8" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.711763 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/594ad696-b727-4153-979f-d32ccdc1fe83-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"594ad696-b727-4153-979f-d32ccdc1fe83\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.711809 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594ad696-b727-4153-979f-d32ccdc1fe83-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"594ad696-b727-4153-979f-d32ccdc1fe83\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.711856 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmj7z\" (UniqueName: \"kubernetes.io/projected/594ad696-b727-4153-979f-d32ccdc1fe83-kube-api-access-bmj7z\") pod \"cinder-scheduler-0\" (UID: 
\"594ad696-b727-4153-979f-d32ccdc1fe83\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.711898 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/594ad696-b727-4153-979f-d32ccdc1fe83-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"594ad696-b727-4153-979f-d32ccdc1fe83\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.711973 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/594ad696-b727-4153-979f-d32ccdc1fe83-config-data\") pod \"cinder-scheduler-0\" (UID: \"594ad696-b727-4153-979f-d32ccdc1fe83\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.711997 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/594ad696-b727-4153-979f-d32ccdc1fe83-scripts\") pod \"cinder-scheduler-0\" (UID: \"594ad696-b727-4153-979f-d32ccdc1fe83\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.713201 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/594ad696-b727-4153-979f-d32ccdc1fe83-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"594ad696-b727-4153-979f-d32ccdc1fe83\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.721894 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594ad696-b727-4153-979f-d32ccdc1fe83-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"594ad696-b727-4153-979f-d32ccdc1fe83\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.725543 4816 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/594ad696-b727-4153-979f-d32ccdc1fe83-scripts\") pod \"cinder-scheduler-0\" (UID: \"594ad696-b727-4153-979f-d32ccdc1fe83\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.726657 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/594ad696-b727-4153-979f-d32ccdc1fe83-config-data\") pod \"cinder-scheduler-0\" (UID: \"594ad696-b727-4153-979f-d32ccdc1fe83\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.732224 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/594ad696-b727-4153-979f-d32ccdc1fe83-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"594ad696-b727-4153-979f-d32ccdc1fe83\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.732881 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmj7z\" (UniqueName: \"kubernetes.io/projected/594ad696-b727-4153-979f-d32ccdc1fe83-kube-api-access-bmj7z\") pod \"cinder-scheduler-0\" (UID: \"594ad696-b727-4153-979f-d32ccdc1fe83\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.812962 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-internal-tls-certs\") pod \"bd930e1b-a508-4a64-8825-9800b8010d59\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.813046 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-public-tls-certs\") pod 
\"bd930e1b-a508-4a64-8825-9800b8010d59\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.813065 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6v7x\" (UniqueName: \"kubernetes.io/projected/bd930e1b-a508-4a64-8825-9800b8010d59-kube-api-access-w6v7x\") pod \"bd930e1b-a508-4a64-8825-9800b8010d59\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.813136 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-combined-ca-bundle\") pod \"bd930e1b-a508-4a64-8825-9800b8010d59\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.813168 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-config\") pod \"bd930e1b-a508-4a64-8825-9800b8010d59\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.813190 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-ovndb-tls-certs\") pod \"bd930e1b-a508-4a64-8825-9800b8010d59\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.813322 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-httpd-config\") pod \"bd930e1b-a508-4a64-8825-9800b8010d59\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.823482 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "bd930e1b-a508-4a64-8825-9800b8010d59" (UID: "bd930e1b-a508-4a64-8825-9800b8010d59"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.843468 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd930e1b-a508-4a64-8825-9800b8010d59-kube-api-access-w6v7x" (OuterVolumeSpecName: "kube-api-access-w6v7x") pod "bd930e1b-a508-4a64-8825-9800b8010d59" (UID: "bd930e1b-a508-4a64-8825-9800b8010d59"). InnerVolumeSpecName "kube-api-access-w6v7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.892497 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd930e1b-a508-4a64-8825-9800b8010d59" (UID: "bd930e1b-a508-4a64-8825-9800b8010d59"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.897981 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-config" (OuterVolumeSpecName: "config") pod "bd930e1b-a508-4a64-8825-9800b8010d59" (UID: "bd930e1b-a508-4a64-8825-9800b8010d59"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.909533 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "bd930e1b-a508-4a64-8825-9800b8010d59" (UID: "bd930e1b-a508-4a64-8825-9800b8010d59"). 
InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.910297 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "bd930e1b-a508-4a64-8825-9800b8010d59" (UID: "bd930e1b-a508-4a64-8825-9800b8010d59"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.916103 4816 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.916142 4816 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.916155 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6v7x\" (UniqueName: \"kubernetes.io/projected/bd930e1b-a508-4a64-8825-9800b8010d59-kube-api-access-w6v7x\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.916168 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.916181 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.916193 4816 reconciler_common.go:293] "Volume detached for 
volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.917078 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.918726 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "bd930e1b-a508-4a64-8825-9800b8010d59" (UID: "bd930e1b-a508-4a64-8825-9800b8010d59"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:21 crc kubenswrapper[4816]: I0311 12:19:21.018392 4816 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:21 crc kubenswrapper[4816]: I0311 12:19:21.379385 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-64584d7649-mb6k8" event={"ID":"bd930e1b-a508-4a64-8825-9800b8010d59","Type":"ContainerDied","Data":"0829aed87a841d8c87b4f741cc407293d8e591d9e8b4c02e21e8a61c30445d1f"} Mar 11 12:19:21 crc kubenswrapper[4816]: I0311 12:19:21.379545 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-64584d7649-mb6k8" Mar 11 12:19:21 crc kubenswrapper[4816]: I0311 12:19:21.379716 4816 scope.go:117] "RemoveContainer" containerID="ed06a5d04ea24da7b7022266f3b93adfbbc7a80293e5752545ee9f6add12458d" Mar 11 12:19:21 crc kubenswrapper[4816]: I0311 12:19:21.415154 4816 scope.go:117] "RemoveContainer" containerID="06ebd4a2da9305c5f9303396efc2a80f0ef4ae2462b9e8b47545883c85f3c658" Mar 11 12:19:21 crc kubenswrapper[4816]: I0311 12:19:21.417347 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-64584d7649-mb6k8"] Mar 11 12:19:21 crc kubenswrapper[4816]: I0311 12:19:21.429828 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-64584d7649-mb6k8"] Mar 11 12:19:21 crc kubenswrapper[4816]: I0311 12:19:21.502190 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 11 12:19:22 crc kubenswrapper[4816]: I0311 12:19:22.150628 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8f76a92-4234-474b-bca2-f5d9cbbec8f2" path="/var/lib/kubelet/pods/a8f76a92-4234-474b-bca2-f5d9cbbec8f2/volumes" Mar 11 12:19:22 crc kubenswrapper[4816]: I0311 12:19:22.152130 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd930e1b-a508-4a64-8825-9800b8010d59" path="/var/lib/kubelet/pods/bd930e1b-a508-4a64-8825-9800b8010d59/volumes" Mar 11 12:19:22 crc kubenswrapper[4816]: I0311 12:19:22.401613 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"594ad696-b727-4153-979f-d32ccdc1fe83","Type":"ContainerStarted","Data":"b969ee005f965c2a4f02537599e354572cbc91b2ebbe38115a382a8ec4f6b2ac"} Mar 11 12:19:22 crc kubenswrapper[4816]: I0311 12:19:22.401666 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"594ad696-b727-4153-979f-d32ccdc1fe83","Type":"ContainerStarted","Data":"883c96453eeb3dc398341c2c3b80a740484d91dd773b0fcfe0237a4112b6097a"} Mar 11 12:19:22 crc kubenswrapper[4816]: I0311 12:19:22.608312 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-64b59f8d4-2vxd9" Mar 11 12:19:22 crc kubenswrapper[4816]: I0311 12:19:22.666730 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-64b59f8d4-2vxd9" Mar 11 12:19:22 crc kubenswrapper[4816]: I0311 12:19:22.743355 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5d4754df76-xnl78"] Mar 11 12:19:22 crc kubenswrapper[4816]: I0311 12:19:22.743623 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5d4754df76-xnl78" podUID="ba61db44-272d-4f1c-b3c6-d3fe1edb38bd" containerName="barbican-api-log" containerID="cri-o://5ba8a8c2543ebb94e1b68f6aeb2566f2e416672e55badbfef9432d4a75b3a2bf" gracePeriod=30 Mar 11 12:19:22 crc kubenswrapper[4816]: I0311 12:19:22.744272 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5d4754df76-xnl78" podUID="ba61db44-272d-4f1c-b3c6-d3fe1edb38bd" containerName="barbican-api" containerID="cri-o://eaee8f2b001ecac77cd66b481deeba2cae3b59ceedc01017e976649a89d1fa8d" gracePeriod=30 Mar 11 12:19:23 crc kubenswrapper[4816]: I0311 12:19:23.430942 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"594ad696-b727-4153-979f-d32ccdc1fe83","Type":"ContainerStarted","Data":"4e77bdf5f0e95052069948c26d832a542a6227e380d1cfa3a0483957659bccc8"} Mar 11 12:19:23 crc kubenswrapper[4816]: I0311 12:19:23.442151 4816 generic.go:334] "Generic (PLEG): container finished" podID="ba61db44-272d-4f1c-b3c6-d3fe1edb38bd" containerID="5ba8a8c2543ebb94e1b68f6aeb2566f2e416672e55badbfef9432d4a75b3a2bf" exitCode=143 Mar 11 12:19:23 crc 
kubenswrapper[4816]: I0311 12:19:23.442223 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d4754df76-xnl78" event={"ID":"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd","Type":"ContainerDied","Data":"5ba8a8c2543ebb94e1b68f6aeb2566f2e416672e55badbfef9432d4a75b3a2bf"} Mar 11 12:19:23 crc kubenswrapper[4816]: I0311 12:19:23.460202 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.460170239 podStartE2EDuration="3.460170239s" podCreationTimestamp="2026-03-11 12:19:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:19:23.447810946 +0000 UTC m=+1250.039074933" watchObservedRunningTime="2026-03-11 12:19:23.460170239 +0000 UTC m=+1250.051434196" Mar 11 12:19:25 crc kubenswrapper[4816]: I0311 12:19:25.917949 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 11 12:19:26 crc kubenswrapper[4816]: I0311 12:19:26.178182 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5d4754df76-xnl78" podUID="ba61db44-272d-4f1c-b3c6-d3fe1edb38bd" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": read tcp 10.217.0.2:42518->10.217.0.161:9311: read: connection reset by peer" Mar 11 12:19:26 crc kubenswrapper[4816]: I0311 12:19:26.178198 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5d4754df76-xnl78" podUID="ba61db44-272d-4f1c-b3c6-d3fe1edb38bd" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": read tcp 10.217.0.2:42516->10.217.0.161:9311: read: connection reset by peer" Mar 11 12:19:26 crc kubenswrapper[4816]: I0311 12:19:26.519597 4816 generic.go:334] "Generic (PLEG): container finished" podID="ba61db44-272d-4f1c-b3c6-d3fe1edb38bd" 
containerID="eaee8f2b001ecac77cd66b481deeba2cae3b59ceedc01017e976649a89d1fa8d" exitCode=0 Mar 11 12:19:26 crc kubenswrapper[4816]: I0311 12:19:26.519667 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d4754df76-xnl78" event={"ID":"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd","Type":"ContainerDied","Data":"eaee8f2b001ecac77cd66b481deeba2cae3b59ceedc01017e976649a89d1fa8d"} Mar 11 12:19:26 crc kubenswrapper[4816]: I0311 12:19:26.540093 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:19:26 crc kubenswrapper[4816]: I0311 12:19:26.727724 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5d4754df76-xnl78" Mar 11 12:19:26 crc kubenswrapper[4816]: I0311 12:19:26.767053 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-logs\") pod \"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd\" (UID: \"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd\") " Mar 11 12:19:26 crc kubenswrapper[4816]: I0311 12:19:26.767105 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-config-data-custom\") pod \"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd\" (UID: \"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd\") " Mar 11 12:19:26 crc kubenswrapper[4816]: I0311 12:19:26.767141 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-config-data\") pod \"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd\" (UID: \"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd\") " Mar 11 12:19:26 crc kubenswrapper[4816]: I0311 12:19:26.767204 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-combined-ca-bundle\") pod \"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd\" (UID: \"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd\") " Mar 11 12:19:26 crc kubenswrapper[4816]: I0311 12:19:26.767714 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-logs" (OuterVolumeSpecName: "logs") pod "ba61db44-272d-4f1c-b3c6-d3fe1edb38bd" (UID: "ba61db44-272d-4f1c-b3c6-d3fe1edb38bd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:19:26 crc kubenswrapper[4816]: I0311 12:19:26.768111 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzbbt\" (UniqueName: \"kubernetes.io/projected/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-kube-api-access-fzbbt\") pod \"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd\" (UID: \"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd\") " Mar 11 12:19:26 crc kubenswrapper[4816]: I0311 12:19:26.768420 4816 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-logs\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:26 crc kubenswrapper[4816]: I0311 12:19:26.791130 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-kube-api-access-fzbbt" (OuterVolumeSpecName: "kube-api-access-fzbbt") pod "ba61db44-272d-4f1c-b3c6-d3fe1edb38bd" (UID: "ba61db44-272d-4f1c-b3c6-d3fe1edb38bd"). InnerVolumeSpecName "kube-api-access-fzbbt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:19:26 crc kubenswrapper[4816]: I0311 12:19:26.791339 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ba61db44-272d-4f1c-b3c6-d3fe1edb38bd" (UID: "ba61db44-272d-4f1c-b3c6-d3fe1edb38bd"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:26 crc kubenswrapper[4816]: I0311 12:19:26.820785 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba61db44-272d-4f1c-b3c6-d3fe1edb38bd" (UID: "ba61db44-272d-4f1c-b3c6-d3fe1edb38bd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:26 crc kubenswrapper[4816]: I0311 12:19:26.829767 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-config-data" (OuterVolumeSpecName: "config-data") pod "ba61db44-272d-4f1c-b3c6-d3fe1edb38bd" (UID: "ba61db44-272d-4f1c-b3c6-d3fe1edb38bd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:26 crc kubenswrapper[4816]: I0311 12:19:26.870208 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:26 crc kubenswrapper[4816]: I0311 12:19:26.870236 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzbbt\" (UniqueName: \"kubernetes.io/projected/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-kube-api-access-fzbbt\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:26 crc kubenswrapper[4816]: I0311 12:19:26.870267 4816 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:26 crc kubenswrapper[4816]: I0311 12:19:26.870278 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:27 crc kubenswrapper[4816]: I0311 12:19:27.530201 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d4754df76-xnl78" event={"ID":"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd","Type":"ContainerDied","Data":"9cc4c282c9e0a53abd8b5254615b71e35bd7cba821c5895a1166b86769ee9a4f"} Mar 11 12:19:27 crc kubenswrapper[4816]: I0311 12:19:27.530339 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5d4754df76-xnl78" Mar 11 12:19:27 crc kubenswrapper[4816]: I0311 12:19:27.531222 4816 scope.go:117] "RemoveContainer" containerID="eaee8f2b001ecac77cd66b481deeba2cae3b59ceedc01017e976649a89d1fa8d" Mar 11 12:19:27 crc kubenswrapper[4816]: I0311 12:19:27.570057 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5d4754df76-xnl78"] Mar 11 12:19:27 crc kubenswrapper[4816]: I0311 12:19:27.573767 4816 scope.go:117] "RemoveContainer" containerID="5ba8a8c2543ebb94e1b68f6aeb2566f2e416672e55badbfef9432d4a75b3a2bf" Mar 11 12:19:27 crc kubenswrapper[4816]: I0311 12:19:27.581165 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5d4754df76-xnl78"] Mar 11 12:19:28 crc kubenswrapper[4816]: I0311 12:19:28.155418 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba61db44-272d-4f1c-b3c6-d3fe1edb38bd" path="/var/lib/kubelet/pods/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd/volumes" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.110987 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 11 12:19:30 crc kubenswrapper[4816]: E0311 12:19:30.111900 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd930e1b-a508-4a64-8825-9800b8010d59" containerName="neutron-api" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.111920 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd930e1b-a508-4a64-8825-9800b8010d59" containerName="neutron-api" Mar 11 12:19:30 crc kubenswrapper[4816]: E0311 12:19:30.111944 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd930e1b-a508-4a64-8825-9800b8010d59" containerName="neutron-httpd" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.111954 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd930e1b-a508-4a64-8825-9800b8010d59" containerName="neutron-httpd" Mar 11 12:19:30 crc kubenswrapper[4816]: E0311 
12:19:30.111968 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba61db44-272d-4f1c-b3c6-d3fe1edb38bd" containerName="barbican-api" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.111975 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba61db44-272d-4f1c-b3c6-d3fe1edb38bd" containerName="barbican-api" Mar 11 12:19:30 crc kubenswrapper[4816]: E0311 12:19:30.112011 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba61db44-272d-4f1c-b3c6-d3fe1edb38bd" containerName="barbican-api-log" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.112020 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba61db44-272d-4f1c-b3c6-d3fe1edb38bd" containerName="barbican-api-log" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.112241 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd930e1b-a508-4a64-8825-9800b8010d59" containerName="neutron-httpd" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.112284 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba61db44-272d-4f1c-b3c6-d3fe1edb38bd" containerName="barbican-api-log" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.112298 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba61db44-272d-4f1c-b3c6-d3fe1edb38bd" containerName="barbican-api" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.112320 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd930e1b-a508-4a64-8825-9800b8010d59" containerName="neutron-api" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.113099 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.117719 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.117858 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-rwjj4" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.117972 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.127066 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.239979 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/502b3843-8246-4715-9735-dfc0336caacb-combined-ca-bundle\") pod \"openstackclient\" (UID: \"502b3843-8246-4715-9735-dfc0336caacb\") " pod="openstack/openstackclient" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.240106 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhzf2\" (UniqueName: \"kubernetes.io/projected/502b3843-8246-4715-9735-dfc0336caacb-kube-api-access-xhzf2\") pod \"openstackclient\" (UID: \"502b3843-8246-4715-9735-dfc0336caacb\") " pod="openstack/openstackclient" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.240241 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/502b3843-8246-4715-9735-dfc0336caacb-openstack-config\") pod \"openstackclient\" (UID: \"502b3843-8246-4715-9735-dfc0336caacb\") " pod="openstack/openstackclient" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.240378 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/502b3843-8246-4715-9735-dfc0336caacb-openstack-config-secret\") pod \"openstackclient\" (UID: \"502b3843-8246-4715-9735-dfc0336caacb\") " pod="openstack/openstackclient" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.342747 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/502b3843-8246-4715-9735-dfc0336caacb-openstack-config-secret\") pod \"openstackclient\" (UID: \"502b3843-8246-4715-9735-dfc0336caacb\") " pod="openstack/openstackclient" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.342897 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/502b3843-8246-4715-9735-dfc0336caacb-combined-ca-bundle\") pod \"openstackclient\" (UID: \"502b3843-8246-4715-9735-dfc0336caacb\") " pod="openstack/openstackclient" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.342959 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhzf2\" (UniqueName: \"kubernetes.io/projected/502b3843-8246-4715-9735-dfc0336caacb-kube-api-access-xhzf2\") pod \"openstackclient\" (UID: \"502b3843-8246-4715-9735-dfc0336caacb\") " pod="openstack/openstackclient" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.343050 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/502b3843-8246-4715-9735-dfc0336caacb-openstack-config\") pod \"openstackclient\" (UID: \"502b3843-8246-4715-9735-dfc0336caacb\") " pod="openstack/openstackclient" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.344074 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/502b3843-8246-4715-9735-dfc0336caacb-openstack-config\") pod \"openstackclient\" (UID: \"502b3843-8246-4715-9735-dfc0336caacb\") " pod="openstack/openstackclient" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.351138 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/502b3843-8246-4715-9735-dfc0336caacb-openstack-config-secret\") pod \"openstackclient\" (UID: \"502b3843-8246-4715-9735-dfc0336caacb\") " pod="openstack/openstackclient" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.351332 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/502b3843-8246-4715-9735-dfc0336caacb-combined-ca-bundle\") pod \"openstackclient\" (UID: \"502b3843-8246-4715-9735-dfc0336caacb\") " pod="openstack/openstackclient" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.362512 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhzf2\" (UniqueName: \"kubernetes.io/projected/502b3843-8246-4715-9735-dfc0336caacb-kube-api-access-xhzf2\") pod \"openstackclient\" (UID: \"502b3843-8246-4715-9735-dfc0336caacb\") " pod="openstack/openstackclient" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.447080 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.963061 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 11 12:19:31 crc kubenswrapper[4816]: I0311 12:19:31.207937 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 11 12:19:31 crc kubenswrapper[4816]: I0311 12:19:31.591001 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"502b3843-8246-4715-9735-dfc0336caacb","Type":"ContainerStarted","Data":"8e4a3fdbe3614d064cc8bdff8752cfb65321a17270a649131b41201fcc4fda91"} Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.671663 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-6c5b6658f-tdgsh"] Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.675230 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.676932 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.680007 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.680038 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.691110 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6c5b6658f-tdgsh"] Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.750973 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3e6d90d2-e7e3-4245-b3a6-042621e01a67-etc-swift\") pod \"swift-proxy-6c5b6658f-tdgsh\" (UID: 
\"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.751031 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e6d90d2-e7e3-4245-b3a6-042621e01a67-log-httpd\") pod \"swift-proxy-6c5b6658f-tdgsh\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.751061 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e6d90d2-e7e3-4245-b3a6-042621e01a67-run-httpd\") pod \"swift-proxy-6c5b6658f-tdgsh\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.751092 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e6d90d2-e7e3-4245-b3a6-042621e01a67-internal-tls-certs\") pod \"swift-proxy-6c5b6658f-tdgsh\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.751162 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e6d90d2-e7e3-4245-b3a6-042621e01a67-config-data\") pod \"swift-proxy-6c5b6658f-tdgsh\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.751200 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddtnh\" (UniqueName: \"kubernetes.io/projected/3e6d90d2-e7e3-4245-b3a6-042621e01a67-kube-api-access-ddtnh\") pod 
\"swift-proxy-6c5b6658f-tdgsh\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.751241 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e6d90d2-e7e3-4245-b3a6-042621e01a67-combined-ca-bundle\") pod \"swift-proxy-6c5b6658f-tdgsh\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.751275 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e6d90d2-e7e3-4245-b3a6-042621e01a67-public-tls-certs\") pod \"swift-proxy-6c5b6658f-tdgsh\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.853003 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddtnh\" (UniqueName: \"kubernetes.io/projected/3e6d90d2-e7e3-4245-b3a6-042621e01a67-kube-api-access-ddtnh\") pod \"swift-proxy-6c5b6658f-tdgsh\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.853082 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e6d90d2-e7e3-4245-b3a6-042621e01a67-combined-ca-bundle\") pod \"swift-proxy-6c5b6658f-tdgsh\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.853106 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e6d90d2-e7e3-4245-b3a6-042621e01a67-public-tls-certs\") pod 
\"swift-proxy-6c5b6658f-tdgsh\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.853163 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3e6d90d2-e7e3-4245-b3a6-042621e01a67-etc-swift\") pod \"swift-proxy-6c5b6658f-tdgsh\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.853191 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e6d90d2-e7e3-4245-b3a6-042621e01a67-log-httpd\") pod \"swift-proxy-6c5b6658f-tdgsh\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.853214 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e6d90d2-e7e3-4245-b3a6-042621e01a67-run-httpd\") pod \"swift-proxy-6c5b6658f-tdgsh\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.853283 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e6d90d2-e7e3-4245-b3a6-042621e01a67-internal-tls-certs\") pod \"swift-proxy-6c5b6658f-tdgsh\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.853349 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e6d90d2-e7e3-4245-b3a6-042621e01a67-config-data\") pod \"swift-proxy-6c5b6658f-tdgsh\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " 
pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.855218 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e6d90d2-e7e3-4245-b3a6-042621e01a67-log-httpd\") pod \"swift-proxy-6c5b6658f-tdgsh\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.855679 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e6d90d2-e7e3-4245-b3a6-042621e01a67-run-httpd\") pod \"swift-proxy-6c5b6658f-tdgsh\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.861726 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e6d90d2-e7e3-4245-b3a6-042621e01a67-config-data\") pod \"swift-proxy-6c5b6658f-tdgsh\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.862840 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3e6d90d2-e7e3-4245-b3a6-042621e01a67-etc-swift\") pod \"swift-proxy-6c5b6658f-tdgsh\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.863311 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e6d90d2-e7e3-4245-b3a6-042621e01a67-internal-tls-certs\") pod \"swift-proxy-6c5b6658f-tdgsh\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.863510 4816 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e6d90d2-e7e3-4245-b3a6-042621e01a67-combined-ca-bundle\") pod \"swift-proxy-6c5b6658f-tdgsh\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.868130 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e6d90d2-e7e3-4245-b3a6-042621e01a67-public-tls-certs\") pod \"swift-proxy-6c5b6658f-tdgsh\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.871557 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddtnh\" (UniqueName: \"kubernetes.io/projected/3e6d90d2-e7e3-4245-b3a6-042621e01a67-kube-api-access-ddtnh\") pod \"swift-proxy-6c5b6658f-tdgsh\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.951147 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:19:35 crc kubenswrapper[4816]: I0311 12:19:35.003465 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:35 crc kubenswrapper[4816]: I0311 12:19:35.690879 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6c5b6658f-tdgsh"] Mar 11 12:19:36 crc kubenswrapper[4816]: I0311 12:19:36.598805 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:19:36 crc kubenswrapper[4816]: I0311 12:19:36.599552 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="10e3f184-9109-4af7-8ca6-822379e0c513" containerName="ceilometer-central-agent" containerID="cri-o://64d1dc2db1be47dc15a33d606bf556173a42132151a3b69e28ce73757040e831" gracePeriod=30 Mar 11 12:19:36 crc kubenswrapper[4816]: I0311 12:19:36.599674 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="10e3f184-9109-4af7-8ca6-822379e0c513" containerName="sg-core" containerID="cri-o://892ed54ab6b1b8e78f2c10457a1ac792f459dfcc72db435ed64164634c50c4f4" gracePeriod=30 Mar 11 12:19:36 crc kubenswrapper[4816]: I0311 12:19:36.599674 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="10e3f184-9109-4af7-8ca6-822379e0c513" containerName="proxy-httpd" containerID="cri-o://b0f2ba98772ce0d4c1de918f6b5eca0c46d92b8201207a117fc19c82c71e70f3" gracePeriod=30 Mar 11 12:19:36 crc kubenswrapper[4816]: I0311 12:19:36.599692 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="10e3f184-9109-4af7-8ca6-822379e0c513" containerName="ceilometer-notification-agent" containerID="cri-o://5ea55c5fdec26a804e311808a0dab722dc704515cc19343dfae8f51e1980dcdf" gracePeriod=30 Mar 11 12:19:36 crc kubenswrapper[4816]: I0311 12:19:36.614894 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="10e3f184-9109-4af7-8ca6-822379e0c513" 
containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.167:3000/\": EOF" Mar 11 12:19:37 crc kubenswrapper[4816]: I0311 12:19:37.660165 4816 generic.go:334] "Generic (PLEG): container finished" podID="10e3f184-9109-4af7-8ca6-822379e0c513" containerID="b0f2ba98772ce0d4c1de918f6b5eca0c46d92b8201207a117fc19c82c71e70f3" exitCode=0 Mar 11 12:19:37 crc kubenswrapper[4816]: I0311 12:19:37.660480 4816 generic.go:334] "Generic (PLEG): container finished" podID="10e3f184-9109-4af7-8ca6-822379e0c513" containerID="892ed54ab6b1b8e78f2c10457a1ac792f459dfcc72db435ed64164634c50c4f4" exitCode=2 Mar 11 12:19:37 crc kubenswrapper[4816]: I0311 12:19:37.660490 4816 generic.go:334] "Generic (PLEG): container finished" podID="10e3f184-9109-4af7-8ca6-822379e0c513" containerID="5ea55c5fdec26a804e311808a0dab722dc704515cc19343dfae8f51e1980dcdf" exitCode=0 Mar 11 12:19:37 crc kubenswrapper[4816]: I0311 12:19:37.660499 4816 generic.go:334] "Generic (PLEG): container finished" podID="10e3f184-9109-4af7-8ca6-822379e0c513" containerID="64d1dc2db1be47dc15a33d606bf556173a42132151a3b69e28ce73757040e831" exitCode=0 Mar 11 12:19:37 crc kubenswrapper[4816]: I0311 12:19:37.660265 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"10e3f184-9109-4af7-8ca6-822379e0c513","Type":"ContainerDied","Data":"b0f2ba98772ce0d4c1de918f6b5eca0c46d92b8201207a117fc19c82c71e70f3"} Mar 11 12:19:37 crc kubenswrapper[4816]: I0311 12:19:37.660546 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"10e3f184-9109-4af7-8ca6-822379e0c513","Type":"ContainerDied","Data":"892ed54ab6b1b8e78f2c10457a1ac792f459dfcc72db435ed64164634c50c4f4"} Mar 11 12:19:37 crc kubenswrapper[4816]: I0311 12:19:37.660565 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"10e3f184-9109-4af7-8ca6-822379e0c513","Type":"ContainerDied","Data":"5ea55c5fdec26a804e311808a0dab722dc704515cc19343dfae8f51e1980dcdf"} Mar 11 12:19:37 crc kubenswrapper[4816]: I0311 12:19:37.660575 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"10e3f184-9109-4af7-8ca6-822379e0c513","Type":"ContainerDied","Data":"64d1dc2db1be47dc15a33d606bf556173a42132151a3b69e28ce73757040e831"} Mar 11 12:19:40 crc kubenswrapper[4816]: I0311 12:19:40.694332 4816 generic.go:334] "Generic (PLEG): container finished" podID="43eac2c3-bace-4682-b48e-f063d6653733" containerID="0dc3816fea03c51cbbb58023865a3dee996cbbc76475be49172b8d011f579193" exitCode=137 Mar 11 12:19:40 crc kubenswrapper[4816]: I0311 12:19:40.694957 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"43eac2c3-bace-4682-b48e-f063d6653733","Type":"ContainerDied","Data":"0dc3816fea03c51cbbb58023865a3dee996cbbc76475be49172b8d011f579193"} Mar 11 12:19:40 crc kubenswrapper[4816]: I0311 12:19:40.739366 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="43eac2c3-bace-4682-b48e-f063d6653733" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.165:8776/healthcheck\": dial tcp 10.217.0.165:8776: connect: connection refused" Mar 11 12:19:41 crc kubenswrapper[4816]: W0311 12:19:41.442868 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e6d90d2_e7e3_4245_b3a6_042621e01a67.slice/crio-78085f7a145fe8f236757523ada7ae443e7b3ab85638d5063fc54c7855365882 WatchSource:0}: Error finding container 78085f7a145fe8f236757523ada7ae443e7b3ab85638d5063fc54c7855365882: Status 404 returned error can't find the container with id 78085f7a145fe8f236757523ada7ae443e7b3ab85638d5063fc54c7855365882 Mar 11 12:19:41 crc kubenswrapper[4816]: I0311 12:19:41.518198 4816 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack/ceilometer-0" podUID="10e3f184-9109-4af7-8ca6-822379e0c513" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.167:3000/\": dial tcp 10.217.0.167:3000: connect: connection refused" Mar 11 12:19:41 crc kubenswrapper[4816]: I0311 12:19:41.713596 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6c5b6658f-tdgsh" event={"ID":"3e6d90d2-e7e3-4245-b3a6-042621e01a67","Type":"ContainerStarted","Data":"78085f7a145fe8f236757523ada7ae443e7b3ab85638d5063fc54c7855365882"} Mar 11 12:19:41 crc kubenswrapper[4816]: I0311 12:19:41.809313 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 11 12:19:41 crc kubenswrapper[4816]: I0311 12:19:41.938478 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43eac2c3-bace-4682-b48e-f063d6653733-combined-ca-bundle\") pod \"43eac2c3-bace-4682-b48e-f063d6653733\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " Mar 11 12:19:41 crc kubenswrapper[4816]: I0311 12:19:41.938645 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/43eac2c3-bace-4682-b48e-f063d6653733-config-data-custom\") pod \"43eac2c3-bace-4682-b48e-f063d6653733\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " Mar 11 12:19:41 crc kubenswrapper[4816]: I0311 12:19:41.938753 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/43eac2c3-bace-4682-b48e-f063d6653733-etc-machine-id\") pod \"43eac2c3-bace-4682-b48e-f063d6653733\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " Mar 11 12:19:41 crc kubenswrapper[4816]: I0311 12:19:41.938902 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwzct\" (UniqueName: 
\"kubernetes.io/projected/43eac2c3-bace-4682-b48e-f063d6653733-kube-api-access-vwzct\") pod \"43eac2c3-bace-4682-b48e-f063d6653733\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " Mar 11 12:19:41 crc kubenswrapper[4816]: I0311 12:19:41.938996 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43eac2c3-bace-4682-b48e-f063d6653733-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "43eac2c3-bace-4682-b48e-f063d6653733" (UID: "43eac2c3-bace-4682-b48e-f063d6653733"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:19:41 crc kubenswrapper[4816]: I0311 12:19:41.939114 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43eac2c3-bace-4682-b48e-f063d6653733-scripts\") pod \"43eac2c3-bace-4682-b48e-f063d6653733\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " Mar 11 12:19:41 crc kubenswrapper[4816]: I0311 12:19:41.940556 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43eac2c3-bace-4682-b48e-f063d6653733-config-data\") pod \"43eac2c3-bace-4682-b48e-f063d6653733\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " Mar 11 12:19:41 crc kubenswrapper[4816]: I0311 12:19:41.940717 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43eac2c3-bace-4682-b48e-f063d6653733-logs\") pod \"43eac2c3-bace-4682-b48e-f063d6653733\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " Mar 11 12:19:41 crc kubenswrapper[4816]: I0311 12:19:41.942062 4816 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/43eac2c3-bace-4682-b48e-f063d6653733-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:41 crc kubenswrapper[4816]: I0311 12:19:41.943096 4816 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43eac2c3-bace-4682-b48e-f063d6653733-logs" (OuterVolumeSpecName: "logs") pod "43eac2c3-bace-4682-b48e-f063d6653733" (UID: "43eac2c3-bace-4682-b48e-f063d6653733"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:19:41 crc kubenswrapper[4816]: I0311 12:19:41.946197 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43eac2c3-bace-4682-b48e-f063d6653733-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "43eac2c3-bace-4682-b48e-f063d6653733" (UID: "43eac2c3-bace-4682-b48e-f063d6653733"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:41 crc kubenswrapper[4816]: I0311 12:19:41.946262 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43eac2c3-bace-4682-b48e-f063d6653733-kube-api-access-vwzct" (OuterVolumeSpecName: "kube-api-access-vwzct") pod "43eac2c3-bace-4682-b48e-f063d6653733" (UID: "43eac2c3-bace-4682-b48e-f063d6653733"). InnerVolumeSpecName "kube-api-access-vwzct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:19:41 crc kubenswrapper[4816]: I0311 12:19:41.954678 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43eac2c3-bace-4682-b48e-f063d6653733-scripts" (OuterVolumeSpecName: "scripts") pod "43eac2c3-bace-4682-b48e-f063d6653733" (UID: "43eac2c3-bace-4682-b48e-f063d6653733"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:41 crc kubenswrapper[4816]: I0311 12:19:41.960019 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:19:41 crc kubenswrapper[4816]: I0311 12:19:41.980434 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43eac2c3-bace-4682-b48e-f063d6653733-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "43eac2c3-bace-4682-b48e-f063d6653733" (UID: "43eac2c3-bace-4682-b48e-f063d6653733"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.021771 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43eac2c3-bace-4682-b48e-f063d6653733-config-data" (OuterVolumeSpecName: "config-data") pod "43eac2c3-bace-4682-b48e-f063d6653733" (UID: "43eac2c3-bace-4682-b48e-f063d6653733"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.043903 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43eac2c3-bace-4682-b48e-f063d6653733-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.043945 4816 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/43eac2c3-bace-4682-b48e-f063d6653733-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.043956 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwzct\" (UniqueName: \"kubernetes.io/projected/43eac2c3-bace-4682-b48e-f063d6653733-kube-api-access-vwzct\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.043972 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43eac2c3-bace-4682-b48e-f063d6653733-scripts\") on node \"crc\" 
DevicePath \"\"" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.043982 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43eac2c3-bace-4682-b48e-f063d6653733-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.043995 4816 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43eac2c3-bace-4682-b48e-f063d6653733-logs\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.145606 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10e3f184-9109-4af7-8ca6-822379e0c513-log-httpd\") pod \"10e3f184-9109-4af7-8ca6-822379e0c513\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") " Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.145712 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10e3f184-9109-4af7-8ca6-822379e0c513-run-httpd\") pod \"10e3f184-9109-4af7-8ca6-822379e0c513\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") " Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.145804 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsbml\" (UniqueName: \"kubernetes.io/projected/10e3f184-9109-4af7-8ca6-822379e0c513-kube-api-access-hsbml\") pod \"10e3f184-9109-4af7-8ca6-822379e0c513\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") " Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.145965 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10e3f184-9109-4af7-8ca6-822379e0c513-config-data\") pod \"10e3f184-9109-4af7-8ca6-822379e0c513\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") " Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.146013 4816 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/10e3f184-9109-4af7-8ca6-822379e0c513-sg-core-conf-yaml\") pod \"10e3f184-9109-4af7-8ca6-822379e0c513\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") " Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.146043 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10e3f184-9109-4af7-8ca6-822379e0c513-scripts\") pod \"10e3f184-9109-4af7-8ca6-822379e0c513\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") " Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.146107 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10e3f184-9109-4af7-8ca6-822379e0c513-combined-ca-bundle\") pod \"10e3f184-9109-4af7-8ca6-822379e0c513\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") " Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.147906 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10e3f184-9109-4af7-8ca6-822379e0c513-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "10e3f184-9109-4af7-8ca6-822379e0c513" (UID: "10e3f184-9109-4af7-8ca6-822379e0c513"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.151496 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10e3f184-9109-4af7-8ca6-822379e0c513-kube-api-access-hsbml" (OuterVolumeSpecName: "kube-api-access-hsbml") pod "10e3f184-9109-4af7-8ca6-822379e0c513" (UID: "10e3f184-9109-4af7-8ca6-822379e0c513"). InnerVolumeSpecName "kube-api-access-hsbml". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.154184 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10e3f184-9109-4af7-8ca6-822379e0c513-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "10e3f184-9109-4af7-8ca6-822379e0c513" (UID: "10e3f184-9109-4af7-8ca6-822379e0c513"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.155362 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10e3f184-9109-4af7-8ca6-822379e0c513-scripts" (OuterVolumeSpecName: "scripts") pod "10e3f184-9109-4af7-8ca6-822379e0c513" (UID: "10e3f184-9109-4af7-8ca6-822379e0c513"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.196433 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10e3f184-9109-4af7-8ca6-822379e0c513-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "10e3f184-9109-4af7-8ca6-822379e0c513" (UID: "10e3f184-9109-4af7-8ca6-822379e0c513"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.230791 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10e3f184-9109-4af7-8ca6-822379e0c513-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "10e3f184-9109-4af7-8ca6-822379e0c513" (UID: "10e3f184-9109-4af7-8ca6-822379e0c513"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.248313 4816 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10e3f184-9109-4af7-8ca6-822379e0c513-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.248356 4816 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10e3f184-9109-4af7-8ca6-822379e0c513-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.248369 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hsbml\" (UniqueName: \"kubernetes.io/projected/10e3f184-9109-4af7-8ca6-822379e0c513-kube-api-access-hsbml\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.248384 4816 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/10e3f184-9109-4af7-8ca6-822379e0c513-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.248394 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10e3f184-9109-4af7-8ca6-822379e0c513-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.248406 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10e3f184-9109-4af7-8ca6-822379e0c513-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.255797 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10e3f184-9109-4af7-8ca6-822379e0c513-config-data" (OuterVolumeSpecName: "config-data") pod "10e3f184-9109-4af7-8ca6-822379e0c513" (UID: "10e3f184-9109-4af7-8ca6-822379e0c513"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.350750 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10e3f184-9109-4af7-8ca6-822379e0c513-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.732982 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6c5b6658f-tdgsh" event={"ID":"3e6d90d2-e7e3-4245-b3a6-042621e01a67","Type":"ContainerStarted","Data":"526e39d56a3ef06aabde599a52928183d785fb0defd865027d97973b83934000"} Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.733054 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6c5b6658f-tdgsh" event={"ID":"3e6d90d2-e7e3-4245-b3a6-042621e01a67","Type":"ContainerStarted","Data":"ea5c353eabccdde33e08d88c70444e4944a8f2019a7db074ae615e6ef96ee3ff"} Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.733111 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.733140 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.738658 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"43eac2c3-bace-4682-b48e-f063d6653733","Type":"ContainerDied","Data":"d1d101cb43433bc7eb7c833f258e91530ee7e5c09a0712cf4851d690643adb2a"} Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.738722 4816 scope.go:117] "RemoveContainer" containerID="0dc3816fea03c51cbbb58023865a3dee996cbbc76475be49172b8d011f579193" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.738929 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.753419 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"502b3843-8246-4715-9735-dfc0336caacb","Type":"ContainerStarted","Data":"fd6533a10f6d22b4d1d7a2a73ad8cc4591438b77aefeced48dbf3b4526cf28f0"} Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.758743 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"10e3f184-9109-4af7-8ca6-822379e0c513","Type":"ContainerDied","Data":"6d281821384131e14e507eeaf976f8558feb01527e87cd3779946b65388e3bc7"} Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.758785 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.767992 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-6c5b6658f-tdgsh" podStartSLOduration=8.767969468 podStartE2EDuration="8.767969468s" podCreationTimestamp="2026-03-11 12:19:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:19:42.762115191 +0000 UTC m=+1269.353379158" watchObservedRunningTime="2026-03-11 12:19:42.767969468 +0000 UTC m=+1269.359233435" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.783706 4816 scope.go:117] "RemoveContainer" containerID="346170c8c6b811872540539f7b2570fc326b6427186b6e7d7e167645153015dd" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.833738 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.250539474 podStartE2EDuration="12.833714517s" podCreationTimestamp="2026-03-11 12:19:30 +0000 UTC" firstStartedPulling="2026-03-11 12:19:30.970530209 +0000 UTC m=+1257.561794176" lastFinishedPulling="2026-03-11 12:19:41.553705252 +0000 
UTC m=+1268.144969219" observedRunningTime="2026-03-11 12:19:42.795886106 +0000 UTC m=+1269.387150113" watchObservedRunningTime="2026-03-11 12:19:42.833714517 +0000 UTC m=+1269.424978484" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.841764 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.854242 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.857747 4816 scope.go:117] "RemoveContainer" containerID="b0f2ba98772ce0d4c1de918f6b5eca0c46d92b8201207a117fc19c82c71e70f3" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.873524 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 11 12:19:42 crc kubenswrapper[4816]: E0311 12:19:42.874078 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43eac2c3-bace-4682-b48e-f063d6653733" containerName="cinder-api-log" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.874102 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="43eac2c3-bace-4682-b48e-f063d6653733" containerName="cinder-api-log" Mar 11 12:19:42 crc kubenswrapper[4816]: E0311 12:19:42.874114 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43eac2c3-bace-4682-b48e-f063d6653733" containerName="cinder-api" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.874122 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="43eac2c3-bace-4682-b48e-f063d6653733" containerName="cinder-api" Mar 11 12:19:42 crc kubenswrapper[4816]: E0311 12:19:42.874166 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10e3f184-9109-4af7-8ca6-822379e0c513" containerName="ceilometer-notification-agent" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.874173 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="10e3f184-9109-4af7-8ca6-822379e0c513" 
containerName="ceilometer-notification-agent" Mar 11 12:19:42 crc kubenswrapper[4816]: E0311 12:19:42.874184 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10e3f184-9109-4af7-8ca6-822379e0c513" containerName="ceilometer-central-agent" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.874190 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="10e3f184-9109-4af7-8ca6-822379e0c513" containerName="ceilometer-central-agent" Mar 11 12:19:42 crc kubenswrapper[4816]: E0311 12:19:42.874200 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10e3f184-9109-4af7-8ca6-822379e0c513" containerName="sg-core" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.874207 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="10e3f184-9109-4af7-8ca6-822379e0c513" containerName="sg-core" Mar 11 12:19:42 crc kubenswrapper[4816]: E0311 12:19:42.874223 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10e3f184-9109-4af7-8ca6-822379e0c513" containerName="proxy-httpd" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.874230 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="10e3f184-9109-4af7-8ca6-822379e0c513" containerName="proxy-httpd" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.874431 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="43eac2c3-bace-4682-b48e-f063d6653733" containerName="cinder-api" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.874446 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="10e3f184-9109-4af7-8ca6-822379e0c513" containerName="proxy-httpd" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.874457 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="10e3f184-9109-4af7-8ca6-822379e0c513" containerName="sg-core" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.874468 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="10e3f184-9109-4af7-8ca6-822379e0c513" 
containerName="ceilometer-notification-agent" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.874480 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="10e3f184-9109-4af7-8ca6-822379e0c513" containerName="ceilometer-central-agent" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.874491 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="43eac2c3-bace-4682-b48e-f063d6653733" containerName="cinder-api-log" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.875624 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.877861 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.878030 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.880749 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.896771 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.906338 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.917901 4816 scope.go:117] "RemoveContainer" containerID="892ed54ab6b1b8e78f2c10457a1ac792f459dfcc72db435ed64164634c50c4f4" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.954657 4816 scope.go:117] "RemoveContainer" containerID="5ea55c5fdec26a804e311808a0dab722dc704515cc19343dfae8f51e1980dcdf" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.960731 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.964325 
4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-scripts\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.964470 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-config-data-custom\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.964502 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.964573 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-config-data\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.964627 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c94c19c-3ccb-43cc-ab41-92baa3141f73-logs\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.964730 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/1c94c19c-3ccb-43cc-ab41-92baa3141f73-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.964767 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.964829 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-public-tls-certs\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.964859 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5svs\" (UniqueName: \"kubernetes.io/projected/1c94c19c-3ccb-43cc-ab41-92baa3141f73-kube-api-access-g5svs\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.979401 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.994812 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.995325 4816 scope.go:117] "RemoveContainer" containerID="64d1dc2db1be47dc15a33d606bf556173a42132151a3b69e28ce73757040e831" Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.998161 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:42.998846 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.002529 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.069107 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-config-data-custom\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0" Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.069218 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0" Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.069446 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-config-data\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0" Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.069470 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/1c94c19c-3ccb-43cc-ab41-92baa3141f73-logs\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0" Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.069514 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1c94c19c-3ccb-43cc-ab41-92baa3141f73-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0" Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.069539 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0" Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.069556 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-public-tls-certs\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0" Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.069583 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5svs\" (UniqueName: \"kubernetes.io/projected/1c94c19c-3ccb-43cc-ab41-92baa3141f73-kube-api-access-g5svs\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0" Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.069668 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-scripts\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0" Mar 11 12:19:43 crc kubenswrapper[4816]: 
I0311 12:19:43.070313 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1c94c19c-3ccb-43cc-ab41-92baa3141f73-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0" Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.070687 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c94c19c-3ccb-43cc-ab41-92baa3141f73-logs\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0" Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.077068 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-scripts\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0" Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.079546 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-public-tls-certs\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0" Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.093329 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-config-data-custom\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0" Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.102341 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " 
pod="openstack/cinder-api-0" Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.102696 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0" Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.103553 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5svs\" (UniqueName: \"kubernetes.io/projected/1c94c19c-3ccb-43cc-ab41-92baa3141f73-kube-api-access-g5svs\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0" Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.103635 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-config-data\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0" Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.174946 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe1c6061-c54b-4bd7-bcff-1a0047599189-run-httpd\") pod \"ceilometer-0\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " pod="openstack/ceilometer-0" Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.175026 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe1c6061-c54b-4bd7-bcff-1a0047599189-scripts\") pod \"ceilometer-0\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " pod="openstack/ceilometer-0" Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.175125 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/fe1c6061-c54b-4bd7-bcff-1a0047599189-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " pod="openstack/ceilometer-0" Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.175156 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe1c6061-c54b-4bd7-bcff-1a0047599189-config-data\") pod \"ceilometer-0\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " pod="openstack/ceilometer-0" Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.175216 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe1c6061-c54b-4bd7-bcff-1a0047599189-log-httpd\") pod \"ceilometer-0\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " pod="openstack/ceilometer-0" Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.175283 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh28p\" (UniqueName: \"kubernetes.io/projected/fe1c6061-c54b-4bd7-bcff-1a0047599189-kube-api-access-gh28p\") pod \"ceilometer-0\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " pod="openstack/ceilometer-0" Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.175313 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe1c6061-c54b-4bd7-bcff-1a0047599189-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " pod="openstack/ceilometer-0" Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.203485 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.276647 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe1c6061-c54b-4bd7-bcff-1a0047599189-run-httpd\") pod \"ceilometer-0\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " pod="openstack/ceilometer-0" Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.276691 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe1c6061-c54b-4bd7-bcff-1a0047599189-scripts\") pod \"ceilometer-0\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " pod="openstack/ceilometer-0" Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.276767 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe1c6061-c54b-4bd7-bcff-1a0047599189-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " pod="openstack/ceilometer-0" Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.276783 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe1c6061-c54b-4bd7-bcff-1a0047599189-config-data\") pod \"ceilometer-0\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " pod="openstack/ceilometer-0" Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.276822 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe1c6061-c54b-4bd7-bcff-1a0047599189-log-httpd\") pod \"ceilometer-0\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " pod="openstack/ceilometer-0" Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.276854 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh28p\" (UniqueName: 
\"kubernetes.io/projected/fe1c6061-c54b-4bd7-bcff-1a0047599189-kube-api-access-gh28p\") pod \"ceilometer-0\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " pod="openstack/ceilometer-0" Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.276874 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe1c6061-c54b-4bd7-bcff-1a0047599189-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " pod="openstack/ceilometer-0" Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.281981 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe1c6061-c54b-4bd7-bcff-1a0047599189-run-httpd\") pod \"ceilometer-0\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " pod="openstack/ceilometer-0" Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.282332 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe1c6061-c54b-4bd7-bcff-1a0047599189-log-httpd\") pod \"ceilometer-0\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " pod="openstack/ceilometer-0" Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.282402 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe1c6061-c54b-4bd7-bcff-1a0047599189-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " pod="openstack/ceilometer-0" Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.287870 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe1c6061-c54b-4bd7-bcff-1a0047599189-config-data\") pod \"ceilometer-0\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " pod="openstack/ceilometer-0" Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.288287 4816 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe1c6061-c54b-4bd7-bcff-1a0047599189-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " pod="openstack/ceilometer-0" Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.288616 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe1c6061-c54b-4bd7-bcff-1a0047599189-scripts\") pod \"ceilometer-0\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " pod="openstack/ceilometer-0" Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.312083 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh28p\" (UniqueName: \"kubernetes.io/projected/fe1c6061-c54b-4bd7-bcff-1a0047599189-kube-api-access-gh28p\") pod \"ceilometer-0\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " pod="openstack/ceilometer-0" Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.312644 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.837951 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 11 12:19:43 crc kubenswrapper[4816]: W0311 12:19:43.849490 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c94c19c_3ccb_43cc_ab41_92baa3141f73.slice/crio-601d8bfb1ac6479d4e58832dfee18035d25eae3e88360d11ef1513118c0bd2f3 WatchSource:0}: Error finding container 601d8bfb1ac6479d4e58832dfee18035d25eae3e88360d11ef1513118c0bd2f3: Status 404 returned error can't find the container with id 601d8bfb1ac6479d4e58832dfee18035d25eae3e88360d11ef1513118c0bd2f3 Mar 11 12:19:43 crc kubenswrapper[4816]: W0311 12:19:43.949670 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe1c6061_c54b_4bd7_bcff_1a0047599189.slice/crio-bf43034a6d989e03fbb9afda66ffef8a89a7703f9ab28abf0ae391e957eb6554 WatchSource:0}: Error finding container bf43034a6d989e03fbb9afda66ffef8a89a7703f9ab28abf0ae391e957eb6554: Status 404 returned error can't find the container with id bf43034a6d989e03fbb9afda66ffef8a89a7703f9ab28abf0ae391e957eb6554 Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.951791 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.101461 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-zv62x"] Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.102757 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-zv62x" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.115699 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-zv62x"] Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.186540 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10e3f184-9109-4af7-8ca6-822379e0c513" path="/var/lib/kubelet/pods/10e3f184-9109-4af7-8ca6-822379e0c513/volumes" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.188052 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43eac2c3-bace-4682-b48e-f063d6653733" path="/var/lib/kubelet/pods/43eac2c3-bace-4682-b48e-f063d6653733/volumes" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.201030 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0e0ff63-3d12-4174-9341-ceb21109e000-operator-scripts\") pod \"nova-api-db-create-zv62x\" (UID: \"a0e0ff63-3d12-4174-9341-ceb21109e000\") " pod="openstack/nova-api-db-create-zv62x" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.201104 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpzw2\" (UniqueName: \"kubernetes.io/projected/a0e0ff63-3d12-4174-9341-ceb21109e000-kube-api-access-lpzw2\") pod \"nova-api-db-create-zv62x\" (UID: \"a0e0ff63-3d12-4174-9341-ceb21109e000\") " pod="openstack/nova-api-db-create-zv62x" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.245549 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-4z7mr"] Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.247179 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-4z7mr"
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.258187 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-4z7mr"]
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.303424 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0e0ff63-3d12-4174-9341-ceb21109e000-operator-scripts\") pod \"nova-api-db-create-zv62x\" (UID: \"a0e0ff63-3d12-4174-9341-ceb21109e000\") " pod="openstack/nova-api-db-create-zv62x"
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.303483 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpzw2\" (UniqueName: \"kubernetes.io/projected/a0e0ff63-3d12-4174-9341-ceb21109e000-kube-api-access-lpzw2\") pod \"nova-api-db-create-zv62x\" (UID: \"a0e0ff63-3d12-4174-9341-ceb21109e000\") " pod="openstack/nova-api-db-create-zv62x"
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.305330 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0e0ff63-3d12-4174-9341-ceb21109e000-operator-scripts\") pod \"nova-api-db-create-zv62x\" (UID: \"a0e0ff63-3d12-4174-9341-ceb21109e000\") " pod="openstack/nova-api-db-create-zv62x"
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.312081 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-91ce-account-create-update-n8mz8"]
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.313497 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-91ce-account-create-update-n8mz8"
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.319232 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.332165 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpzw2\" (UniqueName: \"kubernetes.io/projected/a0e0ff63-3d12-4174-9341-ceb21109e000-kube-api-access-lpzw2\") pod \"nova-api-db-create-zv62x\" (UID: \"a0e0ff63-3d12-4174-9341-ceb21109e000\") " pod="openstack/nova-api-db-create-zv62x"
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.334683 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-91ce-account-create-update-n8mz8"]
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.406133 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ec4faaf-e219-4b01-b3b9-0d6757a38154-operator-scripts\") pod \"nova-cell0-db-create-4z7mr\" (UID: \"1ec4faaf-e219-4b01-b3b9-0d6757a38154\") " pod="openstack/nova-cell0-db-create-4z7mr"
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.406188 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35fe8af0-2f02-4d81-ae03-9d399900494c-operator-scripts\") pod \"nova-api-91ce-account-create-update-n8mz8\" (UID: \"35fe8af0-2f02-4d81-ae03-9d399900494c\") " pod="openstack/nova-api-91ce-account-create-update-n8mz8"
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.406229 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-952tt\" (UniqueName: \"kubernetes.io/projected/35fe8af0-2f02-4d81-ae03-9d399900494c-kube-api-access-952tt\") pod \"nova-api-91ce-account-create-update-n8mz8\" (UID: \"35fe8af0-2f02-4d81-ae03-9d399900494c\") " pod="openstack/nova-api-91ce-account-create-update-n8mz8"
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.406277 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrnr5\" (UniqueName: \"kubernetes.io/projected/1ec4faaf-e219-4b01-b3b9-0d6757a38154-kube-api-access-vrnr5\") pod \"nova-cell0-db-create-4z7mr\" (UID: \"1ec4faaf-e219-4b01-b3b9-0d6757a38154\") " pod="openstack/nova-cell0-db-create-4z7mr"
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.432403 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-txccq"]
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.434001 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-txccq"
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.448095 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-txccq"]
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.459898 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-zv62x"
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.508055 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ec4faaf-e219-4b01-b3b9-0d6757a38154-operator-scripts\") pod \"nova-cell0-db-create-4z7mr\" (UID: \"1ec4faaf-e219-4b01-b3b9-0d6757a38154\") " pod="openstack/nova-cell0-db-create-4z7mr"
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.508124 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35fe8af0-2f02-4d81-ae03-9d399900494c-operator-scripts\") pod \"nova-api-91ce-account-create-update-n8mz8\" (UID: \"35fe8af0-2f02-4d81-ae03-9d399900494c\") " pod="openstack/nova-api-91ce-account-create-update-n8mz8"
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.508170 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-952tt\" (UniqueName: \"kubernetes.io/projected/35fe8af0-2f02-4d81-ae03-9d399900494c-kube-api-access-952tt\") pod \"nova-api-91ce-account-create-update-n8mz8\" (UID: \"35fe8af0-2f02-4d81-ae03-9d399900494c\") " pod="openstack/nova-api-91ce-account-create-update-n8mz8"
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.508203 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrnr5\" (UniqueName: \"kubernetes.io/projected/1ec4faaf-e219-4b01-b3b9-0d6757a38154-kube-api-access-vrnr5\") pod \"nova-cell0-db-create-4z7mr\" (UID: \"1ec4faaf-e219-4b01-b3b9-0d6757a38154\") " pod="openstack/nova-cell0-db-create-4z7mr"
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.508238 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c9952da-6281-45f2-8b45-30caa27b8d39-operator-scripts\") pod \"nova-cell1-db-create-txccq\" (UID: \"7c9952da-6281-45f2-8b45-30caa27b8d39\") " pod="openstack/nova-cell1-db-create-txccq"
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.508319 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb6tl\" (UniqueName: \"kubernetes.io/projected/7c9952da-6281-45f2-8b45-30caa27b8d39-kube-api-access-sb6tl\") pod \"nova-cell1-db-create-txccq\" (UID: \"7c9952da-6281-45f2-8b45-30caa27b8d39\") " pod="openstack/nova-cell1-db-create-txccq"
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.512621 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ec4faaf-e219-4b01-b3b9-0d6757a38154-operator-scripts\") pod \"nova-cell0-db-create-4z7mr\" (UID: \"1ec4faaf-e219-4b01-b3b9-0d6757a38154\") " pod="openstack/nova-cell0-db-create-4z7mr"
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.513534 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35fe8af0-2f02-4d81-ae03-9d399900494c-operator-scripts\") pod \"nova-api-91ce-account-create-update-n8mz8\" (UID: \"35fe8af0-2f02-4d81-ae03-9d399900494c\") " pod="openstack/nova-api-91ce-account-create-update-n8mz8"
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.525616 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-53ba-account-create-update-2vf2k"]
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.527275 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-53ba-account-create-update-2vf2k"
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.532058 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.543162 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrnr5\" (UniqueName: \"kubernetes.io/projected/1ec4faaf-e219-4b01-b3b9-0d6757a38154-kube-api-access-vrnr5\") pod \"nova-cell0-db-create-4z7mr\" (UID: \"1ec4faaf-e219-4b01-b3b9-0d6757a38154\") " pod="openstack/nova-cell0-db-create-4z7mr"
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.545500 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-952tt\" (UniqueName: \"kubernetes.io/projected/35fe8af0-2f02-4d81-ae03-9d399900494c-kube-api-access-952tt\") pod \"nova-api-91ce-account-create-update-n8mz8\" (UID: \"35fe8af0-2f02-4d81-ae03-9d399900494c\") " pod="openstack/nova-api-91ce-account-create-update-n8mz8"
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.572181 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-53ba-account-create-update-2vf2k"]
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.579387 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-4z7mr"
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.610149 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c9952da-6281-45f2-8b45-30caa27b8d39-operator-scripts\") pod \"nova-cell1-db-create-txccq\" (UID: \"7c9952da-6281-45f2-8b45-30caa27b8d39\") " pod="openstack/nova-cell1-db-create-txccq"
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.610216 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fee1eb20-6fbe-4e59-a434-54c2e8a6165d-operator-scripts\") pod \"nova-cell0-53ba-account-create-update-2vf2k\" (UID: \"fee1eb20-6fbe-4e59-a434-54c2e8a6165d\") " pod="openstack/nova-cell0-53ba-account-create-update-2vf2k"
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.610277 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sb6tl\" (UniqueName: \"kubernetes.io/projected/7c9952da-6281-45f2-8b45-30caa27b8d39-kube-api-access-sb6tl\") pod \"nova-cell1-db-create-txccq\" (UID: \"7c9952da-6281-45f2-8b45-30caa27b8d39\") " pod="openstack/nova-cell1-db-create-txccq"
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.610335 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjhm8\" (UniqueName: \"kubernetes.io/projected/fee1eb20-6fbe-4e59-a434-54c2e8a6165d-kube-api-access-zjhm8\") pod \"nova-cell0-53ba-account-create-update-2vf2k\" (UID: \"fee1eb20-6fbe-4e59-a434-54c2e8a6165d\") " pod="openstack/nova-cell0-53ba-account-create-update-2vf2k"
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.610998 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c9952da-6281-45f2-8b45-30caa27b8d39-operator-scripts\") pod \"nova-cell1-db-create-txccq\" (UID: \"7c9952da-6281-45f2-8b45-30caa27b8d39\") " pod="openstack/nova-cell1-db-create-txccq"
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.640850 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-91ce-account-create-update-n8mz8"
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.641873 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb6tl\" (UniqueName: \"kubernetes.io/projected/7c9952da-6281-45f2-8b45-30caa27b8d39-kube-api-access-sb6tl\") pod \"nova-cell1-db-create-txccq\" (UID: \"7c9952da-6281-45f2-8b45-30caa27b8d39\") " pod="openstack/nova-cell1-db-create-txccq"
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.714173 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjhm8\" (UniqueName: \"kubernetes.io/projected/fee1eb20-6fbe-4e59-a434-54c2e8a6165d-kube-api-access-zjhm8\") pod \"nova-cell0-53ba-account-create-update-2vf2k\" (UID: \"fee1eb20-6fbe-4e59-a434-54c2e8a6165d\") " pod="openstack/nova-cell0-53ba-account-create-update-2vf2k"
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.714318 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fee1eb20-6fbe-4e59-a434-54c2e8a6165d-operator-scripts\") pod \"nova-cell0-53ba-account-create-update-2vf2k\" (UID: \"fee1eb20-6fbe-4e59-a434-54c2e8a6165d\") " pod="openstack/nova-cell0-53ba-account-create-update-2vf2k"
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.715449 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fee1eb20-6fbe-4e59-a434-54c2e8a6165d-operator-scripts\") pod \"nova-cell0-53ba-account-create-update-2vf2k\" (UID: \"fee1eb20-6fbe-4e59-a434-54c2e8a6165d\") " pod="openstack/nova-cell0-53ba-account-create-update-2vf2k"
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.730457 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-377b-account-create-update-gb4b2"]
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.731929 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-377b-account-create-update-gb4b2"
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.743002 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.746444 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-377b-account-create-update-gb4b2"]
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.761849 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-txccq"
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.784953 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjhm8\" (UniqueName: \"kubernetes.io/projected/fee1eb20-6fbe-4e59-a434-54c2e8a6165d-kube-api-access-zjhm8\") pod \"nova-cell0-53ba-account-create-update-2vf2k\" (UID: \"fee1eb20-6fbe-4e59-a434-54c2e8a6165d\") " pod="openstack/nova-cell0-53ba-account-create-update-2vf2k"
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.817279 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwq9l\" (UniqueName: \"kubernetes.io/projected/403fec7f-c194-4bdd-a620-34aefb5d677c-kube-api-access-dwq9l\") pod \"nova-cell1-377b-account-create-update-gb4b2\" (UID: \"403fec7f-c194-4bdd-a620-34aefb5d677c\") " pod="openstack/nova-cell1-377b-account-create-update-gb4b2"
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.817415 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/403fec7f-c194-4bdd-a620-34aefb5d677c-operator-scripts\") pod \"nova-cell1-377b-account-create-update-gb4b2\" (UID: \"403fec7f-c194-4bdd-a620-34aefb5d677c\") " pod="openstack/nova-cell1-377b-account-create-update-gb4b2"
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.835274 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1c94c19c-3ccb-43cc-ab41-92baa3141f73","Type":"ContainerStarted","Data":"601d8bfb1ac6479d4e58832dfee18035d25eae3e88360d11ef1513118c0bd2f3"}
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.860114 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe1c6061-c54b-4bd7-bcff-1a0047599189","Type":"ContainerStarted","Data":"bf43034a6d989e03fbb9afda66ffef8a89a7703f9ab28abf0ae391e957eb6554"}
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.919195 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/403fec7f-c194-4bdd-a620-34aefb5d677c-operator-scripts\") pod \"nova-cell1-377b-account-create-update-gb4b2\" (UID: \"403fec7f-c194-4bdd-a620-34aefb5d677c\") " pod="openstack/nova-cell1-377b-account-create-update-gb4b2"
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.919386 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwq9l\" (UniqueName: \"kubernetes.io/projected/403fec7f-c194-4bdd-a620-34aefb5d677c-kube-api-access-dwq9l\") pod \"nova-cell1-377b-account-create-update-gb4b2\" (UID: \"403fec7f-c194-4bdd-a620-34aefb5d677c\") " pod="openstack/nova-cell1-377b-account-create-update-gb4b2"
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.920568 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/403fec7f-c194-4bdd-a620-34aefb5d677c-operator-scripts\") pod \"nova-cell1-377b-account-create-update-gb4b2\" (UID: \"403fec7f-c194-4bdd-a620-34aefb5d677c\") " pod="openstack/nova-cell1-377b-account-create-update-gb4b2"
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.921155 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-53ba-account-create-update-2vf2k"
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.943501 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwq9l\" (UniqueName: \"kubernetes.io/projected/403fec7f-c194-4bdd-a620-34aefb5d677c-kube-api-access-dwq9l\") pod \"nova-cell1-377b-account-create-update-gb4b2\" (UID: \"403fec7f-c194-4bdd-a620-34aefb5d677c\") " pod="openstack/nova-cell1-377b-account-create-update-gb4b2"
Mar 11 12:19:45 crc kubenswrapper[4816]: I0311 12:19:45.125475 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-377b-account-create-update-gb4b2"
Mar 11 12:19:45 crc kubenswrapper[4816]: I0311 12:19:45.130183 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-zv62x"]
Mar 11 12:19:45 crc kubenswrapper[4816]: I0311 12:19:45.286688 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-4z7mr"]
Mar 11 12:19:45 crc kubenswrapper[4816]: W0311 12:19:45.305952 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ec4faaf_e219_4b01_b3b9_0d6757a38154.slice/crio-2f56abcf45077723001cd758c3b00fda8dcf2b28cf4f89c65baa6a6b4cfb7a38 WatchSource:0}: Error finding container 2f56abcf45077723001cd758c3b00fda8dcf2b28cf4f89c65baa6a6b4cfb7a38: Status 404 returned error can't find the container with id 2f56abcf45077723001cd758c3b00fda8dcf2b28cf4f89c65baa6a6b4cfb7a38
Mar 11 12:19:45 crc kubenswrapper[4816]: I0311 12:19:45.359509 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-91ce-account-create-update-n8mz8"]
Mar 11 12:19:45 crc kubenswrapper[4816]: I0311 12:19:45.454891 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-txccq"]
Mar 11 12:19:45 crc kubenswrapper[4816]: W0311 12:19:45.585427 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c9952da_6281_45f2_8b45_30caa27b8d39.slice/crio-98ee6dc5ae584921adf996e86778c6141a6b8e7df5c376e7181885193ecb1399 WatchSource:0}: Error finding container 98ee6dc5ae584921adf996e86778c6141a6b8e7df5c376e7181885193ecb1399: Status 404 returned error can't find the container with id 98ee6dc5ae584921adf996e86778c6141a6b8e7df5c376e7181885193ecb1399
Mar 11 12:19:45 crc kubenswrapper[4816]: I0311 12:19:45.631959 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-53ba-account-create-update-2vf2k"]
Mar 11 12:19:45 crc kubenswrapper[4816]: I0311 12:19:45.744548 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-377b-account-create-update-gb4b2"]
Mar 11 12:19:45 crc kubenswrapper[4816]: W0311 12:19:45.776154 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod403fec7f_c194_4bdd_a620_34aefb5d677c.slice/crio-0429ff5f884160565a9cda58cf166816738777ca43ea5e47ebac9e8d47d354ad WatchSource:0}: Error finding container 0429ff5f884160565a9cda58cf166816738777ca43ea5e47ebac9e8d47d354ad: Status 404 returned error can't find the container with id 0429ff5f884160565a9cda58cf166816738777ca43ea5e47ebac9e8d47d354ad
Mar 11 12:19:45 crc kubenswrapper[4816]: I0311 12:19:45.883557 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-377b-account-create-update-gb4b2" event={"ID":"403fec7f-c194-4bdd-a620-34aefb5d677c","Type":"ContainerStarted","Data":"0429ff5f884160565a9cda58cf166816738777ca43ea5e47ebac9e8d47d354ad"}
Mar 11 12:19:45 crc kubenswrapper[4816]: I0311 12:19:45.889179 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-txccq" event={"ID":"7c9952da-6281-45f2-8b45-30caa27b8d39","Type":"ContainerStarted","Data":"98ee6dc5ae584921adf996e86778c6141a6b8e7df5c376e7181885193ecb1399"}
Mar 11 12:19:45 crc kubenswrapper[4816]: I0311 12:19:45.896737 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1c94c19c-3ccb-43cc-ab41-92baa3141f73","Type":"ContainerStarted","Data":"691c4f9d45de04f6bb32f82d9d22154b130edce7e7b8b75479f100df834dbbad"}
Mar 11 12:19:45 crc kubenswrapper[4816]: I0311 12:19:45.918471 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-91ce-account-create-update-n8mz8" event={"ID":"35fe8af0-2f02-4d81-ae03-9d399900494c","Type":"ContainerStarted","Data":"0f5d94dc5d9bb04750c9b3d2e89fcd5a5d6e20ec2f4a19899cb047b2927291a4"}
Mar 11 12:19:45 crc kubenswrapper[4816]: I0311 12:19:45.918532 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-91ce-account-create-update-n8mz8" event={"ID":"35fe8af0-2f02-4d81-ae03-9d399900494c","Type":"ContainerStarted","Data":"0cb78380f29d4d52692e33442f665655f581626d9173b5bf157abd8b1bb91034"}
Mar 11 12:19:45 crc kubenswrapper[4816]: I0311 12:19:45.928129 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-53ba-account-create-update-2vf2k" event={"ID":"fee1eb20-6fbe-4e59-a434-54c2e8a6165d","Type":"ContainerStarted","Data":"34decf19ef7bac29bb5073f92442b233e2f6b57a40f58b92a1d00cba4bde5c43"}
Mar 11 12:19:45 crc kubenswrapper[4816]: I0311 12:19:45.938740 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-91ce-account-create-update-n8mz8" podStartSLOduration=1.938721559 podStartE2EDuration="1.938721559s" podCreationTimestamp="2026-03-11 12:19:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:19:45.936681641 +0000 UTC m=+1272.527945608" watchObservedRunningTime="2026-03-11 12:19:45.938721559 +0000 UTC m=+1272.529985526"
Mar 11 12:19:45 crc kubenswrapper[4816]: I0311 12:19:45.939080 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe1c6061-c54b-4bd7-bcff-1a0047599189","Type":"ContainerStarted","Data":"17d113833184841a8ffdf30f6cf9f881d5ed7cf51f87234b41d0a9c017bb1de8"}
Mar 11 12:19:45 crc kubenswrapper[4816]: I0311 12:19:45.954053 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-4z7mr" event={"ID":"1ec4faaf-e219-4b01-b3b9-0d6757a38154","Type":"ContainerStarted","Data":"fd551dbdb54bb8de807a245da392a1fc03bca9f397e581b66063faafeaf38a5f"}
Mar 11 12:19:45 crc kubenswrapper[4816]: I0311 12:19:45.954432 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-4z7mr" event={"ID":"1ec4faaf-e219-4b01-b3b9-0d6757a38154","Type":"ContainerStarted","Data":"2f56abcf45077723001cd758c3b00fda8dcf2b28cf4f89c65baa6a6b4cfb7a38"}
Mar 11 12:19:45 crc kubenswrapper[4816]: I0311 12:19:45.957880 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-zv62x" event={"ID":"a0e0ff63-3d12-4174-9341-ceb21109e000","Type":"ContainerStarted","Data":"cbfbf586e19291c8ee373bf860029353b3f56429b7bf9015d736b9982aa4797f"}
Mar 11 12:19:45 crc kubenswrapper[4816]: I0311 12:19:45.958001 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-zv62x" event={"ID":"a0e0ff63-3d12-4174-9341-ceb21109e000","Type":"ContainerStarted","Data":"671c06c98bca09ed2d0cbf96fe51a512291245bc7bc97f794c21bccc6c6c997a"}
Mar 11 12:19:45 crc kubenswrapper[4816]: I0311 12:19:45.988099 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-4z7mr" podStartSLOduration=1.988078869 podStartE2EDuration="1.988078869s" podCreationTimestamp="2026-03-11 12:19:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:19:45.975059787 +0000 UTC m=+1272.566323754" watchObservedRunningTime="2026-03-11 12:19:45.988078869 +0000 UTC m=+1272.579342836"
Mar 11 12:19:46 crc kubenswrapper[4816]: I0311 12:19:46.166049 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 11 12:19:46 crc kubenswrapper[4816]: I0311 12:19:46.974017 4816 generic.go:334] "Generic (PLEG): container finished" podID="35fe8af0-2f02-4d81-ae03-9d399900494c" containerID="0f5d94dc5d9bb04750c9b3d2e89fcd5a5d6e20ec2f4a19899cb047b2927291a4" exitCode=0
Mar 11 12:19:46 crc kubenswrapper[4816]: I0311 12:19:46.974115 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-91ce-account-create-update-n8mz8" event={"ID":"35fe8af0-2f02-4d81-ae03-9d399900494c","Type":"ContainerDied","Data":"0f5d94dc5d9bb04750c9b3d2e89fcd5a5d6e20ec2f4a19899cb047b2927291a4"}
Mar 11 12:19:46 crc kubenswrapper[4816]: I0311 12:19:46.976175 4816 generic.go:334] "Generic (PLEG): container finished" podID="fee1eb20-6fbe-4e59-a434-54c2e8a6165d" containerID="2ba25af6bbe93bf77e8ed2bed1866df9a0d1cdcadbd32ffc70070db8155b1914" exitCode=0
Mar 11 12:19:46 crc kubenswrapper[4816]: I0311 12:19:46.976267 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-53ba-account-create-update-2vf2k" event={"ID":"fee1eb20-6fbe-4e59-a434-54c2e8a6165d","Type":"ContainerDied","Data":"2ba25af6bbe93bf77e8ed2bed1866df9a0d1cdcadbd32ffc70070db8155b1914"}
Mar 11 12:19:46 crc kubenswrapper[4816]: I0311 12:19:46.978652 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe1c6061-c54b-4bd7-bcff-1a0047599189","Type":"ContainerStarted","Data":"967e47ce92a092604089ebcbd14d83152d557d03654e565fcc9256588c7b557c"}
Mar 11 12:19:46 crc kubenswrapper[4816]: I0311 12:19:46.980665 4816 generic.go:334] "Generic (PLEG): container finished" podID="1ec4faaf-e219-4b01-b3b9-0d6757a38154" containerID="fd551dbdb54bb8de807a245da392a1fc03bca9f397e581b66063faafeaf38a5f" exitCode=0
Mar 11 12:19:46 crc kubenswrapper[4816]: I0311 12:19:46.980713 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-4z7mr" event={"ID":"1ec4faaf-e219-4b01-b3b9-0d6757a38154","Type":"ContainerDied","Data":"fd551dbdb54bb8de807a245da392a1fc03bca9f397e581b66063faafeaf38a5f"}
Mar 11 12:19:46 crc kubenswrapper[4816]: I0311 12:19:46.982752 4816 generic.go:334] "Generic (PLEG): container finished" podID="a0e0ff63-3d12-4174-9341-ceb21109e000" containerID="cbfbf586e19291c8ee373bf860029353b3f56429b7bf9015d736b9982aa4797f" exitCode=0
Mar 11 12:19:46 crc kubenswrapper[4816]: I0311 12:19:46.982794 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-zv62x" event={"ID":"a0e0ff63-3d12-4174-9341-ceb21109e000","Type":"ContainerDied","Data":"cbfbf586e19291c8ee373bf860029353b3f56429b7bf9015d736b9982aa4797f"}
Mar 11 12:19:46 crc kubenswrapper[4816]: I0311 12:19:46.986106 4816 generic.go:334] "Generic (PLEG): container finished" podID="403fec7f-c194-4bdd-a620-34aefb5d677c" containerID="090174f400ae3d182bc1e17d475eb20c26198249c703a798e2b253812bea946b" exitCode=0
Mar 11 12:19:46 crc kubenswrapper[4816]: I0311 12:19:46.986163 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-377b-account-create-update-gb4b2" event={"ID":"403fec7f-c194-4bdd-a620-34aefb5d677c","Type":"ContainerDied","Data":"090174f400ae3d182bc1e17d475eb20c26198249c703a798e2b253812bea946b"}
Mar 11 12:19:46 crc kubenswrapper[4816]: I0311 12:19:46.987343 4816 generic.go:334] "Generic (PLEG): container finished" podID="7c9952da-6281-45f2-8b45-30caa27b8d39" containerID="6a15e8693d1f25cf8eeefb7b013bbcd57f9676d5cee6b31111e7f71f5ea2e5ca" exitCode=0
Mar 11 12:19:46 crc kubenswrapper[4816]: I0311 12:19:46.987382 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-txccq" event={"ID":"7c9952da-6281-45f2-8b45-30caa27b8d39","Type":"ContainerDied","Data":"6a15e8693d1f25cf8eeefb7b013bbcd57f9676d5cee6b31111e7f71f5ea2e5ca"}
Mar 11 12:19:47 crc kubenswrapper[4816]: I0311 12:19:47.000215 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1c94c19c-3ccb-43cc-ab41-92baa3141f73","Type":"ContainerStarted","Data":"c04dc0a2663851eac8a9c1faccfd79cf6c27fbce470c4ad0b7499358caea8a06"}
Mar 11 12:19:47 crc kubenswrapper[4816]: I0311 12:19:47.000561 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Mar 11 12:19:47 crc kubenswrapper[4816]: I0311 12:19:47.207930 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.207911215 podStartE2EDuration="5.207911215s" podCreationTimestamp="2026-03-11 12:19:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:19:47.15070207 +0000 UTC m=+1273.741966037" watchObservedRunningTime="2026-03-11 12:19:47.207911215 +0000 UTC m=+1273.799175172"
Mar 11 12:19:47 crc kubenswrapper[4816]: I0311 12:19:47.665837 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-zv62x"
Mar 11 12:19:47 crc kubenswrapper[4816]: I0311 12:19:47.742195 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0e0ff63-3d12-4174-9341-ceb21109e000-operator-scripts\") pod \"a0e0ff63-3d12-4174-9341-ceb21109e000\" (UID: \"a0e0ff63-3d12-4174-9341-ceb21109e000\") "
Mar 11 12:19:47 crc kubenswrapper[4816]: I0311 12:19:47.742278 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpzw2\" (UniqueName: \"kubernetes.io/projected/a0e0ff63-3d12-4174-9341-ceb21109e000-kube-api-access-lpzw2\") pod \"a0e0ff63-3d12-4174-9341-ceb21109e000\" (UID: \"a0e0ff63-3d12-4174-9341-ceb21109e000\") "
Mar 11 12:19:47 crc kubenswrapper[4816]: I0311 12:19:47.742838 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0e0ff63-3d12-4174-9341-ceb21109e000-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a0e0ff63-3d12-4174-9341-ceb21109e000" (UID: "a0e0ff63-3d12-4174-9341-ceb21109e000"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 12:19:47 crc kubenswrapper[4816]: I0311 12:19:47.749790 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0e0ff63-3d12-4174-9341-ceb21109e000-kube-api-access-lpzw2" (OuterVolumeSpecName: "kube-api-access-lpzw2") pod "a0e0ff63-3d12-4174-9341-ceb21109e000" (UID: "a0e0ff63-3d12-4174-9341-ceb21109e000"). InnerVolumeSpecName "kube-api-access-lpzw2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:19:47 crc kubenswrapper[4816]: I0311 12:19:47.843817 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0e0ff63-3d12-4174-9341-ceb21109e000-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 11 12:19:47 crc kubenswrapper[4816]: I0311 12:19:47.843865 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpzw2\" (UniqueName: \"kubernetes.io/projected/a0e0ff63-3d12-4174-9341-ceb21109e000-kube-api-access-lpzw2\") on node \"crc\" DevicePath \"\""
Mar 11 12:19:48 crc kubenswrapper[4816]: I0311 12:19:48.013052 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-zv62x" event={"ID":"a0e0ff63-3d12-4174-9341-ceb21109e000","Type":"ContainerDied","Data":"671c06c98bca09ed2d0cbf96fe51a512291245bc7bc97f794c21bccc6c6c997a"}
Mar 11 12:19:48 crc kubenswrapper[4816]: I0311 12:19:48.013113 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="671c06c98bca09ed2d0cbf96fe51a512291245bc7bc97f794c21bccc6c6c997a"
Mar 11 12:19:48 crc kubenswrapper[4816]: I0311 12:19:48.014451 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-zv62x"
Mar 11 12:19:48 crc kubenswrapper[4816]: I0311 12:19:48.016338 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe1c6061-c54b-4bd7-bcff-1a0047599189","Type":"ContainerStarted","Data":"0d74adf6865fc807216c0f784e71de5b04800932ef37cf1367ed2f124fa6bff6"}
Mar 11 12:19:48 crc kubenswrapper[4816]: I0311 12:19:48.352716 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6867c6dbc5-lzgfd"
Mar 11 12:19:48 crc kubenswrapper[4816]: I0311 12:19:48.436883 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-9df8757bb-rzb52"]
Mar 11 12:19:48 crc kubenswrapper[4816]: I0311 12:19:48.437307 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-9df8757bb-rzb52" podUID="68498f16-b5c3-4960-8565-7ae628fc3122" containerName="neutron-api" containerID="cri-o://385c6a6a7483bf3ffb2a31553a973012c1161303ce29917595a5f314788786f7" gracePeriod=30
Mar 11 12:19:48 crc kubenswrapper[4816]: I0311 12:19:48.437732 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-9df8757bb-rzb52" podUID="68498f16-b5c3-4960-8565-7ae628fc3122" containerName="neutron-httpd" containerID="cri-o://3a4b8f5199cb2db96176f7d26ac1288036fcf9dd3deb012c7c6cb2bd6febc6c2" gracePeriod=30
Mar 11 12:19:48 crc kubenswrapper[4816]: I0311 12:19:48.535621 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-txccq"
Mar 11 12:19:48 crc kubenswrapper[4816]: I0311 12:19:48.560115 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c9952da-6281-45f2-8b45-30caa27b8d39-operator-scripts\") pod \"7c9952da-6281-45f2-8b45-30caa27b8d39\" (UID: \"7c9952da-6281-45f2-8b45-30caa27b8d39\") "
Mar 11 12:19:48 crc kubenswrapper[4816]: I0311 12:19:48.560179 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6tl\" (UniqueName: \"kubernetes.io/projected/7c9952da-6281-45f2-8b45-30caa27b8d39-kube-api-access-sb6tl\") pod \"7c9952da-6281-45f2-8b45-30caa27b8d39\" (UID: \"7c9952da-6281-45f2-8b45-30caa27b8d39\") "
Mar 11 12:19:48 crc kubenswrapper[4816]: I0311 12:19:48.561501 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c9952da-6281-45f2-8b45-30caa27b8d39-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7c9952da-6281-45f2-8b45-30caa27b8d39" (UID: "7c9952da-6281-45f2-8b45-30caa27b8d39"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 12:19:48 crc kubenswrapper[4816]: I0311 12:19:48.568338 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c9952da-6281-45f2-8b45-30caa27b8d39-kube-api-access-sb6tl" (OuterVolumeSpecName: "kube-api-access-sb6tl") pod "7c9952da-6281-45f2-8b45-30caa27b8d39" (UID: "7c9952da-6281-45f2-8b45-30caa27b8d39"). InnerVolumeSpecName "kube-api-access-sb6tl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:19:48 crc kubenswrapper[4816]: I0311 12:19:48.666161 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c9952da-6281-45f2-8b45-30caa27b8d39-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 11 12:19:48 crc kubenswrapper[4816]: I0311 12:19:48.666197 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6tl\" (UniqueName: \"kubernetes.io/projected/7c9952da-6281-45f2-8b45-30caa27b8d39-kube-api-access-sb6tl\") on node \"crc\" DevicePath \"\""
Mar 11 12:19:48 crc kubenswrapper[4816]: I0311 12:19:48.963864 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-4z7mr"
Mar 11 12:19:48 crc kubenswrapper[4816]: I0311 12:19:48.976118 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-91ce-account-create-update-n8mz8"
Mar 11 12:19:48 crc kubenswrapper[4816]: I0311 12:19:48.984306 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-53ba-account-create-update-2vf2k"
Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.009888 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-377b-account-create-update-gb4b2"
Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.054679 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-53ba-account-create-update-2vf2k" event={"ID":"fee1eb20-6fbe-4e59-a434-54c2e8a6165d","Type":"ContainerDied","Data":"34decf19ef7bac29bb5073f92442b233e2f6b57a40f58b92a1d00cba4bde5c43"}
Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.054730 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34decf19ef7bac29bb5073f92442b233e2f6b57a40f58b92a1d00cba4bde5c43"
Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.054807 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-53ba-account-create-update-2vf2k"
Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.079420 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-4z7mr"
Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.079914 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ec4faaf-e219-4b01-b3b9-0d6757a38154-operator-scripts\") pod \"1ec4faaf-e219-4b01-b3b9-0d6757a38154\" (UID: \"1ec4faaf-e219-4b01-b3b9-0d6757a38154\") "
Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.080070 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-4z7mr" event={"ID":"1ec4faaf-e219-4b01-b3b9-0d6757a38154","Type":"ContainerDied","Data":"2f56abcf45077723001cd758c3b00fda8dcf2b28cf4f89c65baa6a6b4cfb7a38"}
Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.080113 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f56abcf45077723001cd758c3b00fda8dcf2b28cf4f89c65baa6a6b4cfb7a38"
Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.080317 4816
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrnr5\" (UniqueName: \"kubernetes.io/projected/1ec4faaf-e219-4b01-b3b9-0d6757a38154-kube-api-access-vrnr5\") pod \"1ec4faaf-e219-4b01-b3b9-0d6757a38154\" (UID: \"1ec4faaf-e219-4b01-b3b9-0d6757a38154\") " Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.082433 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ec4faaf-e219-4b01-b3b9-0d6757a38154-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1ec4faaf-e219-4b01-b3b9-0d6757a38154" (UID: "1ec4faaf-e219-4b01-b3b9-0d6757a38154"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.090956 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-377b-account-create-update-gb4b2" event={"ID":"403fec7f-c194-4bdd-a620-34aefb5d677c","Type":"ContainerDied","Data":"0429ff5f884160565a9cda58cf166816738777ca43ea5e47ebac9e8d47d354ad"} Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.091018 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0429ff5f884160565a9cda58cf166816738777ca43ea5e47ebac9e8d47d354ad" Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.091148 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-377b-account-create-update-gb4b2" Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.093152 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ec4faaf-e219-4b01-b3b9-0d6757a38154-kube-api-access-vrnr5" (OuterVolumeSpecName: "kube-api-access-vrnr5") pod "1ec4faaf-e219-4b01-b3b9-0d6757a38154" (UID: "1ec4faaf-e219-4b01-b3b9-0d6757a38154"). InnerVolumeSpecName "kube-api-access-vrnr5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.099068 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-txccq" event={"ID":"7c9952da-6281-45f2-8b45-30caa27b8d39","Type":"ContainerDied","Data":"98ee6dc5ae584921adf996e86778c6141a6b8e7df5c376e7181885193ecb1399"} Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.099120 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98ee6dc5ae584921adf996e86778c6141a6b8e7df5c376e7181885193ecb1399" Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.099227 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-txccq" Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.105505 4816 generic.go:334] "Generic (PLEG): container finished" podID="68498f16-b5c3-4960-8565-7ae628fc3122" containerID="3a4b8f5199cb2db96176f7d26ac1288036fcf9dd3deb012c7c6cb2bd6febc6c2" exitCode=0 Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.105623 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9df8757bb-rzb52" event={"ID":"68498f16-b5c3-4960-8565-7ae628fc3122","Type":"ContainerDied","Data":"3a4b8f5199cb2db96176f7d26ac1288036fcf9dd3deb012c7c6cb2bd6febc6c2"} Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.107224 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-91ce-account-create-update-n8mz8" event={"ID":"35fe8af0-2f02-4d81-ae03-9d399900494c","Type":"ContainerDied","Data":"0cb78380f29d4d52692e33442f665655f581626d9173b5bf157abd8b1bb91034"} Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.107282 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0cb78380f29d4d52692e33442f665655f581626d9173b5bf157abd8b1bb91034" Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.107346 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-91ce-account-create-update-n8mz8" Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.182215 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/403fec7f-c194-4bdd-a620-34aefb5d677c-operator-scripts\") pod \"403fec7f-c194-4bdd-a620-34aefb5d677c\" (UID: \"403fec7f-c194-4bdd-a620-34aefb5d677c\") " Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.182475 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fee1eb20-6fbe-4e59-a434-54c2e8a6165d-operator-scripts\") pod \"fee1eb20-6fbe-4e59-a434-54c2e8a6165d\" (UID: \"fee1eb20-6fbe-4e59-a434-54c2e8a6165d\") " Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.182507 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35fe8af0-2f02-4d81-ae03-9d399900494c-operator-scripts\") pod \"35fe8af0-2f02-4d81-ae03-9d399900494c\" (UID: \"35fe8af0-2f02-4d81-ae03-9d399900494c\") " Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.182571 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjhm8\" (UniqueName: \"kubernetes.io/projected/fee1eb20-6fbe-4e59-a434-54c2e8a6165d-kube-api-access-zjhm8\") pod \"fee1eb20-6fbe-4e59-a434-54c2e8a6165d\" (UID: \"fee1eb20-6fbe-4e59-a434-54c2e8a6165d\") " Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.182705 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwq9l\" (UniqueName: \"kubernetes.io/projected/403fec7f-c194-4bdd-a620-34aefb5d677c-kube-api-access-dwq9l\") pod \"403fec7f-c194-4bdd-a620-34aefb5d677c\" (UID: \"403fec7f-c194-4bdd-a620-34aefb5d677c\") " Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.182728 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-952tt\" (UniqueName: \"kubernetes.io/projected/35fe8af0-2f02-4d81-ae03-9d399900494c-kube-api-access-952tt\") pod \"35fe8af0-2f02-4d81-ae03-9d399900494c\" (UID: \"35fe8af0-2f02-4d81-ae03-9d399900494c\") " Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.183035 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35fe8af0-2f02-4d81-ae03-9d399900494c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "35fe8af0-2f02-4d81-ae03-9d399900494c" (UID: "35fe8af0-2f02-4d81-ae03-9d399900494c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.183071 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fee1eb20-6fbe-4e59-a434-54c2e8a6165d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fee1eb20-6fbe-4e59-a434-54c2e8a6165d" (UID: "fee1eb20-6fbe-4e59-a434-54c2e8a6165d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.183115 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/403fec7f-c194-4bdd-a620-34aefb5d677c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "403fec7f-c194-4bdd-a620-34aefb5d677c" (UID: "403fec7f-c194-4bdd-a620-34aefb5d677c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.183492 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/403fec7f-c194-4bdd-a620-34aefb5d677c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.183600 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrnr5\" (UniqueName: \"kubernetes.io/projected/1ec4faaf-e219-4b01-b3b9-0d6757a38154-kube-api-access-vrnr5\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.183621 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ec4faaf-e219-4b01-b3b9-0d6757a38154-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.183631 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fee1eb20-6fbe-4e59-a434-54c2e8a6165d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.183642 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35fe8af0-2f02-4d81-ae03-9d399900494c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.187178 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fee1eb20-6fbe-4e59-a434-54c2e8a6165d-kube-api-access-zjhm8" (OuterVolumeSpecName: "kube-api-access-zjhm8") pod "fee1eb20-6fbe-4e59-a434-54c2e8a6165d" (UID: "fee1eb20-6fbe-4e59-a434-54c2e8a6165d"). InnerVolumeSpecName "kube-api-access-zjhm8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.187226 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/403fec7f-c194-4bdd-a620-34aefb5d677c-kube-api-access-dwq9l" (OuterVolumeSpecName: "kube-api-access-dwq9l") pod "403fec7f-c194-4bdd-a620-34aefb5d677c" (UID: "403fec7f-c194-4bdd-a620-34aefb5d677c"). InnerVolumeSpecName "kube-api-access-dwq9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.188526 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35fe8af0-2f02-4d81-ae03-9d399900494c-kube-api-access-952tt" (OuterVolumeSpecName: "kube-api-access-952tt") pod "35fe8af0-2f02-4d81-ae03-9d399900494c" (UID: "35fe8af0-2f02-4d81-ae03-9d399900494c"). InnerVolumeSpecName "kube-api-access-952tt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.286135 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwq9l\" (UniqueName: \"kubernetes.io/projected/403fec7f-c194-4bdd-a620-34aefb5d677c-kube-api-access-dwq9l\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.286175 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-952tt\" (UniqueName: \"kubernetes.io/projected/35fe8af0-2f02-4d81-ae03-9d399900494c-kube-api-access-952tt\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.286187 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjhm8\" (UniqueName: \"kubernetes.io/projected/fee1eb20-6fbe-4e59-a434-54c2e8a6165d-kube-api-access-zjhm8\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:50 crc kubenswrapper[4816]: I0311 12:19:50.017154 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:50 crc kubenswrapper[4816]: I0311 12:19:50.025259 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:50 crc kubenswrapper[4816]: I0311 12:19:50.124541 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe1c6061-c54b-4bd7-bcff-1a0047599189","Type":"ContainerStarted","Data":"e19bf99d556ade6a51b374ef26b342f20bf8f351b160b3dc04c013402591b75a"} Mar 11 12:19:50 crc kubenswrapper[4816]: I0311 12:19:50.124791 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fe1c6061-c54b-4bd7-bcff-1a0047599189" containerName="ceilometer-central-agent" containerID="cri-o://17d113833184841a8ffdf30f6cf9f881d5ed7cf51f87234b41d0a9c017bb1de8" gracePeriod=30 Mar 11 12:19:50 crc kubenswrapper[4816]: I0311 12:19:50.124829 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fe1c6061-c54b-4bd7-bcff-1a0047599189" containerName="proxy-httpd" containerID="cri-o://e19bf99d556ade6a51b374ef26b342f20bf8f351b160b3dc04c013402591b75a" gracePeriod=30 Mar 11 12:19:50 crc kubenswrapper[4816]: I0311 12:19:50.124914 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fe1c6061-c54b-4bd7-bcff-1a0047599189" containerName="sg-core" containerID="cri-o://0d74adf6865fc807216c0f784e71de5b04800932ef37cf1367ed2f124fa6bff6" gracePeriod=30 Mar 11 12:19:50 crc kubenswrapper[4816]: I0311 12:19:50.124973 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fe1c6061-c54b-4bd7-bcff-1a0047599189" containerName="ceilometer-notification-agent" containerID="cri-o://967e47ce92a092604089ebcbd14d83152d557d03654e565fcc9256588c7b557c" gracePeriod=30 Mar 11 12:19:50 crc kubenswrapper[4816]: I0311 12:19:50.179741 4816 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.980110999 podStartE2EDuration="8.179710721s" podCreationTimestamp="2026-03-11 12:19:42 +0000 UTC" firstStartedPulling="2026-03-11 12:19:43.952563467 +0000 UTC m=+1270.543827434" lastFinishedPulling="2026-03-11 12:19:49.152163179 +0000 UTC m=+1275.743427156" observedRunningTime="2026-03-11 12:19:50.166276667 +0000 UTC m=+1276.757540634" watchObservedRunningTime="2026-03-11 12:19:50.179710721 +0000 UTC m=+1276.770974688" Mar 11 12:19:50 crc kubenswrapper[4816]: I0311 12:19:50.490482 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-64584d7649-mb6k8" podUID="bd930e1b-a508-4a64-8825-9800b8010d59" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.156:9696/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 11 12:19:51 crc kubenswrapper[4816]: I0311 12:19:51.152619 4816 generic.go:334] "Generic (PLEG): container finished" podID="fe1c6061-c54b-4bd7-bcff-1a0047599189" containerID="e19bf99d556ade6a51b374ef26b342f20bf8f351b160b3dc04c013402591b75a" exitCode=0 Mar 11 12:19:51 crc kubenswrapper[4816]: I0311 12:19:51.152671 4816 generic.go:334] "Generic (PLEG): container finished" podID="fe1c6061-c54b-4bd7-bcff-1a0047599189" containerID="0d74adf6865fc807216c0f784e71de5b04800932ef37cf1367ed2f124fa6bff6" exitCode=2 Mar 11 12:19:51 crc kubenswrapper[4816]: I0311 12:19:51.152685 4816 generic.go:334] "Generic (PLEG): container finished" podID="fe1c6061-c54b-4bd7-bcff-1a0047599189" containerID="967e47ce92a092604089ebcbd14d83152d557d03654e565fcc9256588c7b557c" exitCode=0 Mar 11 12:19:51 crc kubenswrapper[4816]: I0311 12:19:51.152714 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"fe1c6061-c54b-4bd7-bcff-1a0047599189","Type":"ContainerDied","Data":"e19bf99d556ade6a51b374ef26b342f20bf8f351b160b3dc04c013402591b75a"} Mar 11 12:19:51 crc kubenswrapper[4816]: I0311 12:19:51.152750 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe1c6061-c54b-4bd7-bcff-1a0047599189","Type":"ContainerDied","Data":"0d74adf6865fc807216c0f784e71de5b04800932ef37cf1367ed2f124fa6bff6"} Mar 11 12:19:51 crc kubenswrapper[4816]: I0311 12:19:51.152765 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe1c6061-c54b-4bd7-bcff-1a0047599189","Type":"ContainerDied","Data":"967e47ce92a092604089ebcbd14d83152d557d03654e565fcc9256588c7b557c"} Mar 11 12:19:52 crc kubenswrapper[4816]: I0311 12:19:52.631188 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:19:52 crc kubenswrapper[4816]: I0311 12:19:52.670180 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe1c6061-c54b-4bd7-bcff-1a0047599189-config-data\") pod \"fe1c6061-c54b-4bd7-bcff-1a0047599189\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " Mar 11 12:19:52 crc kubenswrapper[4816]: I0311 12:19:52.670273 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe1c6061-c54b-4bd7-bcff-1a0047599189-log-httpd\") pod \"fe1c6061-c54b-4bd7-bcff-1a0047599189\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " Mar 11 12:19:52 crc kubenswrapper[4816]: I0311 12:19:52.670394 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe1c6061-c54b-4bd7-bcff-1a0047599189-scripts\") pod \"fe1c6061-c54b-4bd7-bcff-1a0047599189\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " Mar 11 12:19:52 crc kubenswrapper[4816]: I0311 
12:19:52.670439 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe1c6061-c54b-4bd7-bcff-1a0047599189-sg-core-conf-yaml\") pod \"fe1c6061-c54b-4bd7-bcff-1a0047599189\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " Mar 11 12:19:52 crc kubenswrapper[4816]: I0311 12:19:52.670563 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gh28p\" (UniqueName: \"kubernetes.io/projected/fe1c6061-c54b-4bd7-bcff-1a0047599189-kube-api-access-gh28p\") pod \"fe1c6061-c54b-4bd7-bcff-1a0047599189\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " Mar 11 12:19:52 crc kubenswrapper[4816]: I0311 12:19:52.670673 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe1c6061-c54b-4bd7-bcff-1a0047599189-combined-ca-bundle\") pod \"fe1c6061-c54b-4bd7-bcff-1a0047599189\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " Mar 11 12:19:52 crc kubenswrapper[4816]: I0311 12:19:52.670726 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe1c6061-c54b-4bd7-bcff-1a0047599189-run-httpd\") pod \"fe1c6061-c54b-4bd7-bcff-1a0047599189\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " Mar 11 12:19:52 crc kubenswrapper[4816]: I0311 12:19:52.671142 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe1c6061-c54b-4bd7-bcff-1a0047599189-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fe1c6061-c54b-4bd7-bcff-1a0047599189" (UID: "fe1c6061-c54b-4bd7-bcff-1a0047599189"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:19:52 crc kubenswrapper[4816]: I0311 12:19:52.671782 4816 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe1c6061-c54b-4bd7-bcff-1a0047599189-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:52 crc kubenswrapper[4816]: I0311 12:19:52.679458 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe1c6061-c54b-4bd7-bcff-1a0047599189-kube-api-access-gh28p" (OuterVolumeSpecName: "kube-api-access-gh28p") pod "fe1c6061-c54b-4bd7-bcff-1a0047599189" (UID: "fe1c6061-c54b-4bd7-bcff-1a0047599189"). InnerVolumeSpecName "kube-api-access-gh28p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:19:52 crc kubenswrapper[4816]: I0311 12:19:52.684789 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe1c6061-c54b-4bd7-bcff-1a0047599189-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fe1c6061-c54b-4bd7-bcff-1a0047599189" (UID: "fe1c6061-c54b-4bd7-bcff-1a0047599189"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:19:52 crc kubenswrapper[4816]: I0311 12:19:52.688594 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe1c6061-c54b-4bd7-bcff-1a0047599189-scripts" (OuterVolumeSpecName: "scripts") pod "fe1c6061-c54b-4bd7-bcff-1a0047599189" (UID: "fe1c6061-c54b-4bd7-bcff-1a0047599189"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:52 crc kubenswrapper[4816]: I0311 12:19:52.739780 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe1c6061-c54b-4bd7-bcff-1a0047599189-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fe1c6061-c54b-4bd7-bcff-1a0047599189" (UID: "fe1c6061-c54b-4bd7-bcff-1a0047599189"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:52 crc kubenswrapper[4816]: I0311 12:19:52.774324 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe1c6061-c54b-4bd7-bcff-1a0047599189-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:52 crc kubenswrapper[4816]: I0311 12:19:52.774363 4816 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe1c6061-c54b-4bd7-bcff-1a0047599189-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:52 crc kubenswrapper[4816]: I0311 12:19:52.774375 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gh28p\" (UniqueName: \"kubernetes.io/projected/fe1c6061-c54b-4bd7-bcff-1a0047599189-kube-api-access-gh28p\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:52 crc kubenswrapper[4816]: I0311 12:19:52.774388 4816 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe1c6061-c54b-4bd7-bcff-1a0047599189-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:52 crc kubenswrapper[4816]: I0311 12:19:52.777951 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe1c6061-c54b-4bd7-bcff-1a0047599189-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe1c6061-c54b-4bd7-bcff-1a0047599189" (UID: "fe1c6061-c54b-4bd7-bcff-1a0047599189"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:52 crc kubenswrapper[4816]: I0311 12:19:52.789349 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe1c6061-c54b-4bd7-bcff-1a0047599189-config-data" (OuterVolumeSpecName: "config-data") pod "fe1c6061-c54b-4bd7-bcff-1a0047599189" (UID: "fe1c6061-c54b-4bd7-bcff-1a0047599189"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:52 crc kubenswrapper[4816]: I0311 12:19:52.877340 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe1c6061-c54b-4bd7-bcff-1a0047599189-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:52 crc kubenswrapper[4816]: I0311 12:19:52.877732 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe1c6061-c54b-4bd7-bcff-1a0047599189-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.175956 4816 generic.go:334] "Generic (PLEG): container finished" podID="fe1c6061-c54b-4bd7-bcff-1a0047599189" containerID="17d113833184841a8ffdf30f6cf9f881d5ed7cf51f87234b41d0a9c017bb1de8" exitCode=0 Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.176030 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe1c6061-c54b-4bd7-bcff-1a0047599189","Type":"ContainerDied","Data":"17d113833184841a8ffdf30f6cf9f881d5ed7cf51f87234b41d0a9c017bb1de8"} Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.176087 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe1c6061-c54b-4bd7-bcff-1a0047599189","Type":"ContainerDied","Data":"bf43034a6d989e03fbb9afda66ffef8a89a7703f9ab28abf0ae391e957eb6554"} Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.176113 4816 scope.go:117] "RemoveContainer" containerID="e19bf99d556ade6a51b374ef26b342f20bf8f351b160b3dc04c013402591b75a" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.176365 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.216178 4816 scope.go:117] "RemoveContainer" containerID="0d74adf6865fc807216c0f784e71de5b04800932ef37cf1367ed2f124fa6bff6" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.245644 4816 scope.go:117] "RemoveContainer" containerID="967e47ce92a092604089ebcbd14d83152d557d03654e565fcc9256588c7b557c" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.252617 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.271560 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.278821 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:19:53 crc kubenswrapper[4816]: E0311 12:19:53.279291 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe1c6061-c54b-4bd7-bcff-1a0047599189" containerName="sg-core" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.279308 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe1c6061-c54b-4bd7-bcff-1a0047599189" containerName="sg-core" Mar 11 12:19:53 crc kubenswrapper[4816]: E0311 12:19:53.279326 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe1c6061-c54b-4bd7-bcff-1a0047599189" containerName="ceilometer-central-agent" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.279333 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe1c6061-c54b-4bd7-bcff-1a0047599189" containerName="ceilometer-central-agent" Mar 11 12:19:53 crc kubenswrapper[4816]: E0311 12:19:53.279345 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35fe8af0-2f02-4d81-ae03-9d399900494c" containerName="mariadb-account-create-update" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.279351 4816 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="35fe8af0-2f02-4d81-ae03-9d399900494c" containerName="mariadb-account-create-update" Mar 11 12:19:53 crc kubenswrapper[4816]: E0311 12:19:53.279364 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe1c6061-c54b-4bd7-bcff-1a0047599189" containerName="ceilometer-notification-agent" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.279370 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe1c6061-c54b-4bd7-bcff-1a0047599189" containerName="ceilometer-notification-agent" Mar 11 12:19:53 crc kubenswrapper[4816]: E0311 12:19:53.279384 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="403fec7f-c194-4bdd-a620-34aefb5d677c" containerName="mariadb-account-create-update" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.279389 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="403fec7f-c194-4bdd-a620-34aefb5d677c" containerName="mariadb-account-create-update" Mar 11 12:19:53 crc kubenswrapper[4816]: E0311 12:19:53.279398 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0e0ff63-3d12-4174-9341-ceb21109e000" containerName="mariadb-database-create" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.279403 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0e0ff63-3d12-4174-9341-ceb21109e000" containerName="mariadb-database-create" Mar 11 12:19:53 crc kubenswrapper[4816]: E0311 12:19:53.279433 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c9952da-6281-45f2-8b45-30caa27b8d39" containerName="mariadb-database-create" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.279439 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c9952da-6281-45f2-8b45-30caa27b8d39" containerName="mariadb-database-create" Mar 11 12:19:53 crc kubenswrapper[4816]: E0311 12:19:53.279448 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fee1eb20-6fbe-4e59-a434-54c2e8a6165d" containerName="mariadb-account-create-update" Mar 11 
12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.279454 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="fee1eb20-6fbe-4e59-a434-54c2e8a6165d" containerName="mariadb-account-create-update" Mar 11 12:19:53 crc kubenswrapper[4816]: E0311 12:19:53.279466 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ec4faaf-e219-4b01-b3b9-0d6757a38154" containerName="mariadb-database-create" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.279471 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ec4faaf-e219-4b01-b3b9-0d6757a38154" containerName="mariadb-database-create" Mar 11 12:19:53 crc kubenswrapper[4816]: E0311 12:19:53.279483 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe1c6061-c54b-4bd7-bcff-1a0047599189" containerName="proxy-httpd" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.279489 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe1c6061-c54b-4bd7-bcff-1a0047599189" containerName="proxy-httpd" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.279660 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="403fec7f-c194-4bdd-a620-34aefb5d677c" containerName="mariadb-account-create-update" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.279677 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe1c6061-c54b-4bd7-bcff-1a0047599189" containerName="sg-core" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.279686 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ec4faaf-e219-4b01-b3b9-0d6757a38154" containerName="mariadb-database-create" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.279696 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe1c6061-c54b-4bd7-bcff-1a0047599189" containerName="ceilometer-central-agent" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.279705 4816 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a0e0ff63-3d12-4174-9341-ceb21109e000" containerName="mariadb-database-create" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.279716 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c9952da-6281-45f2-8b45-30caa27b8d39" containerName="mariadb-database-create" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.279725 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe1c6061-c54b-4bd7-bcff-1a0047599189" containerName="ceilometer-notification-agent" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.279736 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="35fe8af0-2f02-4d81-ae03-9d399900494c" containerName="mariadb-account-create-update" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.279751 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe1c6061-c54b-4bd7-bcff-1a0047599189" containerName="proxy-httpd" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.279756 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="fee1eb20-6fbe-4e59-a434-54c2e8a6165d" containerName="mariadb-account-create-update" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.281684 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.289688 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.289703 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.343960 4816 scope.go:117] "RemoveContainer" containerID="17d113833184841a8ffdf30f6cf9f881d5ed7cf51f87234b41d0a9c017bb1de8" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.355668 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.390063 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " pod="openstack/ceilometer-0" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.390123 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-run-httpd\") pod \"ceilometer-0\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " pod="openstack/ceilometer-0" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.390160 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-scripts\") pod \"ceilometer-0\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " pod="openstack/ceilometer-0" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.390208 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-log-httpd\") pod \"ceilometer-0\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " pod="openstack/ceilometer-0" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.390282 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc2gb\" (UniqueName: \"kubernetes.io/projected/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-kube-api-access-wc2gb\") pod \"ceilometer-0\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " pod="openstack/ceilometer-0" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.390317 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " pod="openstack/ceilometer-0" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.390344 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-config-data\") pod \"ceilometer-0\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " pod="openstack/ceilometer-0" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.390686 4816 scope.go:117] "RemoveContainer" containerID="e19bf99d556ade6a51b374ef26b342f20bf8f351b160b3dc04c013402591b75a" Mar 11 12:19:53 crc kubenswrapper[4816]: E0311 12:19:53.391468 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e19bf99d556ade6a51b374ef26b342f20bf8f351b160b3dc04c013402591b75a\": container with ID starting with e19bf99d556ade6a51b374ef26b342f20bf8f351b160b3dc04c013402591b75a not found: ID does not exist" containerID="e19bf99d556ade6a51b374ef26b342f20bf8f351b160b3dc04c013402591b75a" Mar 11 12:19:53 crc 
kubenswrapper[4816]: I0311 12:19:53.391559 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e19bf99d556ade6a51b374ef26b342f20bf8f351b160b3dc04c013402591b75a"} err="failed to get container status \"e19bf99d556ade6a51b374ef26b342f20bf8f351b160b3dc04c013402591b75a\": rpc error: code = NotFound desc = could not find container \"e19bf99d556ade6a51b374ef26b342f20bf8f351b160b3dc04c013402591b75a\": container with ID starting with e19bf99d556ade6a51b374ef26b342f20bf8f351b160b3dc04c013402591b75a not found: ID does not exist" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.391604 4816 scope.go:117] "RemoveContainer" containerID="0d74adf6865fc807216c0f784e71de5b04800932ef37cf1367ed2f124fa6bff6" Mar 11 12:19:53 crc kubenswrapper[4816]: E0311 12:19:53.392795 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d74adf6865fc807216c0f784e71de5b04800932ef37cf1367ed2f124fa6bff6\": container with ID starting with 0d74adf6865fc807216c0f784e71de5b04800932ef37cf1367ed2f124fa6bff6 not found: ID does not exist" containerID="0d74adf6865fc807216c0f784e71de5b04800932ef37cf1367ed2f124fa6bff6" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.392824 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d74adf6865fc807216c0f784e71de5b04800932ef37cf1367ed2f124fa6bff6"} err="failed to get container status \"0d74adf6865fc807216c0f784e71de5b04800932ef37cf1367ed2f124fa6bff6\": rpc error: code = NotFound desc = could not find container \"0d74adf6865fc807216c0f784e71de5b04800932ef37cf1367ed2f124fa6bff6\": container with ID starting with 0d74adf6865fc807216c0f784e71de5b04800932ef37cf1367ed2f124fa6bff6 not found: ID does not exist" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.392843 4816 scope.go:117] "RemoveContainer" containerID="967e47ce92a092604089ebcbd14d83152d557d03654e565fcc9256588c7b557c" Mar 11 
12:19:53 crc kubenswrapper[4816]: E0311 12:19:53.393466 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"967e47ce92a092604089ebcbd14d83152d557d03654e565fcc9256588c7b557c\": container with ID starting with 967e47ce92a092604089ebcbd14d83152d557d03654e565fcc9256588c7b557c not found: ID does not exist" containerID="967e47ce92a092604089ebcbd14d83152d557d03654e565fcc9256588c7b557c" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.393503 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"967e47ce92a092604089ebcbd14d83152d557d03654e565fcc9256588c7b557c"} err="failed to get container status \"967e47ce92a092604089ebcbd14d83152d557d03654e565fcc9256588c7b557c\": rpc error: code = NotFound desc = could not find container \"967e47ce92a092604089ebcbd14d83152d557d03654e565fcc9256588c7b557c\": container with ID starting with 967e47ce92a092604089ebcbd14d83152d557d03654e565fcc9256588c7b557c not found: ID does not exist" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.393530 4816 scope.go:117] "RemoveContainer" containerID="17d113833184841a8ffdf30f6cf9f881d5ed7cf51f87234b41d0a9c017bb1de8" Mar 11 12:19:53 crc kubenswrapper[4816]: E0311 12:19:53.398422 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17d113833184841a8ffdf30f6cf9f881d5ed7cf51f87234b41d0a9c017bb1de8\": container with ID starting with 17d113833184841a8ffdf30f6cf9f881d5ed7cf51f87234b41d0a9c017bb1de8 not found: ID does not exist" containerID="17d113833184841a8ffdf30f6cf9f881d5ed7cf51f87234b41d0a9c017bb1de8" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.398526 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17d113833184841a8ffdf30f6cf9f881d5ed7cf51f87234b41d0a9c017bb1de8"} err="failed to get container status 
\"17d113833184841a8ffdf30f6cf9f881d5ed7cf51f87234b41d0a9c017bb1de8\": rpc error: code = NotFound desc = could not find container \"17d113833184841a8ffdf30f6cf9f881d5ed7cf51f87234b41d0a9c017bb1de8\": container with ID starting with 17d113833184841a8ffdf30f6cf9f881d5ed7cf51f87234b41d0a9c017bb1de8 not found: ID does not exist" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.492026 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wc2gb\" (UniqueName: \"kubernetes.io/projected/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-kube-api-access-wc2gb\") pod \"ceilometer-0\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " pod="openstack/ceilometer-0" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.492107 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " pod="openstack/ceilometer-0" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.492149 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-config-data\") pod \"ceilometer-0\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " pod="openstack/ceilometer-0" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.492204 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " pod="openstack/ceilometer-0" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.492230 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-run-httpd\") pod \"ceilometer-0\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " pod="openstack/ceilometer-0" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.492268 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-scripts\") pod \"ceilometer-0\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " pod="openstack/ceilometer-0" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.492313 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-log-httpd\") pod \"ceilometer-0\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " pod="openstack/ceilometer-0" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.493025 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-run-httpd\") pod \"ceilometer-0\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " pod="openstack/ceilometer-0" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.493156 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-log-httpd\") pod \"ceilometer-0\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " pod="openstack/ceilometer-0" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.498579 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " pod="openstack/ceilometer-0" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.500285 4816 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-scripts\") pod \"ceilometer-0\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " pod="openstack/ceilometer-0" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.500993 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " pod="openstack/ceilometer-0" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.516225 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc2gb\" (UniqueName: \"kubernetes.io/projected/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-kube-api-access-wc2gb\") pod \"ceilometer-0\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " pod="openstack/ceilometer-0" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.522283 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-config-data\") pod \"ceilometer-0\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " pod="openstack/ceilometer-0" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.661687 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:19:54 crc kubenswrapper[4816]: I0311 12:19:54.145256 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe1c6061-c54b-4bd7-bcff-1a0047599189" path="/var/lib/kubelet/pods/fe1c6061-c54b-4bd7-bcff-1a0047599189/volumes" Mar 11 12:19:54 crc kubenswrapper[4816]: I0311 12:19:54.195789 4816 generic.go:334] "Generic (PLEG): container finished" podID="68498f16-b5c3-4960-8565-7ae628fc3122" containerID="385c6a6a7483bf3ffb2a31553a973012c1161303ce29917595a5f314788786f7" exitCode=0 Mar 11 12:19:54 crc kubenswrapper[4816]: I0311 12:19:54.195856 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9df8757bb-rzb52" event={"ID":"68498f16-b5c3-4960-8565-7ae628fc3122","Type":"ContainerDied","Data":"385c6a6a7483bf3ffb2a31553a973012c1161303ce29917595a5f314788786f7"} Mar 11 12:19:54 crc kubenswrapper[4816]: I0311 12:19:54.209501 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9df8757bb-rzb52" Mar 11 12:19:54 crc kubenswrapper[4816]: I0311 12:19:54.266714 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:19:54 crc kubenswrapper[4816]: I0311 12:19:54.326154 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68498f16-b5c3-4960-8565-7ae628fc3122-combined-ca-bundle\") pod \"68498f16-b5c3-4960-8565-7ae628fc3122\" (UID: \"68498f16-b5c3-4960-8565-7ae628fc3122\") " Mar 11 12:19:54 crc kubenswrapper[4816]: I0311 12:19:54.326706 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/68498f16-b5c3-4960-8565-7ae628fc3122-config\") pod \"68498f16-b5c3-4960-8565-7ae628fc3122\" (UID: \"68498f16-b5c3-4960-8565-7ae628fc3122\") " Mar 11 12:19:54 crc kubenswrapper[4816]: I0311 12:19:54.327435 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/68498f16-b5c3-4960-8565-7ae628fc3122-ovndb-tls-certs\") pod \"68498f16-b5c3-4960-8565-7ae628fc3122\" (UID: \"68498f16-b5c3-4960-8565-7ae628fc3122\") " Mar 11 12:19:54 crc kubenswrapper[4816]: I0311 12:19:54.327586 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/68498f16-b5c3-4960-8565-7ae628fc3122-httpd-config\") pod \"68498f16-b5c3-4960-8565-7ae628fc3122\" (UID: \"68498f16-b5c3-4960-8565-7ae628fc3122\") " Mar 11 12:19:54 crc kubenswrapper[4816]: I0311 12:19:54.327849 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9clks\" (UniqueName: \"kubernetes.io/projected/68498f16-b5c3-4960-8565-7ae628fc3122-kube-api-access-9clks\") pod \"68498f16-b5c3-4960-8565-7ae628fc3122\" (UID: \"68498f16-b5c3-4960-8565-7ae628fc3122\") " Mar 11 12:19:54 crc kubenswrapper[4816]: I0311 12:19:54.337876 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68498f16-b5c3-4960-8565-7ae628fc3122-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "68498f16-b5c3-4960-8565-7ae628fc3122" (UID: "68498f16-b5c3-4960-8565-7ae628fc3122"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:54 crc kubenswrapper[4816]: I0311 12:19:54.348493 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68498f16-b5c3-4960-8565-7ae628fc3122-kube-api-access-9clks" (OuterVolumeSpecName: "kube-api-access-9clks") pod "68498f16-b5c3-4960-8565-7ae628fc3122" (UID: "68498f16-b5c3-4960-8565-7ae628fc3122"). InnerVolumeSpecName "kube-api-access-9clks". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:19:54 crc kubenswrapper[4816]: I0311 12:19:54.410006 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68498f16-b5c3-4960-8565-7ae628fc3122-config" (OuterVolumeSpecName: "config") pod "68498f16-b5c3-4960-8565-7ae628fc3122" (UID: "68498f16-b5c3-4960-8565-7ae628fc3122"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:54 crc kubenswrapper[4816]: I0311 12:19:54.417416 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68498f16-b5c3-4960-8565-7ae628fc3122-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "68498f16-b5c3-4960-8565-7ae628fc3122" (UID: "68498f16-b5c3-4960-8565-7ae628fc3122"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:54 crc kubenswrapper[4816]: I0311 12:19:54.432288 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68498f16-b5c3-4960-8565-7ae628fc3122-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:54 crc kubenswrapper[4816]: I0311 12:19:54.432341 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/68498f16-b5c3-4960-8565-7ae628fc3122-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:54 crc kubenswrapper[4816]: I0311 12:19:54.432352 4816 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/68498f16-b5c3-4960-8565-7ae628fc3122-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:54 crc kubenswrapper[4816]: I0311 12:19:54.432361 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9clks\" (UniqueName: \"kubernetes.io/projected/68498f16-b5c3-4960-8565-7ae628fc3122-kube-api-access-9clks\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:54 
crc kubenswrapper[4816]: I0311 12:19:54.452364 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68498f16-b5c3-4960-8565-7ae628fc3122-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "68498f16-b5c3-4960-8565-7ae628fc3122" (UID: "68498f16-b5c3-4960-8565-7ae628fc3122"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:54 crc kubenswrapper[4816]: I0311 12:19:54.535157 4816 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/68498f16-b5c3-4960-8565-7ae628fc3122-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.105059 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-r2t5s"] Mar 11 12:19:55 crc kubenswrapper[4816]: E0311 12:19:55.105581 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68498f16-b5c3-4960-8565-7ae628fc3122" containerName="neutron-httpd" Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.105595 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="68498f16-b5c3-4960-8565-7ae628fc3122" containerName="neutron-httpd" Mar 11 12:19:55 crc kubenswrapper[4816]: E0311 12:19:55.105604 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68498f16-b5c3-4960-8565-7ae628fc3122" containerName="neutron-api" Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.105611 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="68498f16-b5c3-4960-8565-7ae628fc3122" containerName="neutron-api" Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.105838 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="68498f16-b5c3-4960-8565-7ae628fc3122" containerName="neutron-httpd" Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.105856 4816 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="68498f16-b5c3-4960-8565-7ae628fc3122" containerName="neutron-api" Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.106618 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-r2t5s" Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.112556 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.113018 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.116321 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-f2q4d" Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.147876 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6268fe92-5c93-43c7-95bc-f30befda5d65-scripts\") pod \"nova-cell0-conductor-db-sync-r2t5s\" (UID: \"6268fe92-5c93-43c7-95bc-f30befda5d65\") " pod="openstack/nova-cell0-conductor-db-sync-r2t5s" Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.148036 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpt2w\" (UniqueName: \"kubernetes.io/projected/6268fe92-5c93-43c7-95bc-f30befda5d65-kube-api-access-zpt2w\") pod \"nova-cell0-conductor-db-sync-r2t5s\" (UID: \"6268fe92-5c93-43c7-95bc-f30befda5d65\") " pod="openstack/nova-cell0-conductor-db-sync-r2t5s" Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.148943 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6268fe92-5c93-43c7-95bc-f30befda5d65-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-r2t5s\" (UID: \"6268fe92-5c93-43c7-95bc-f30befda5d65\") " 
pod="openstack/nova-cell0-conductor-db-sync-r2t5s" Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.149011 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6268fe92-5c93-43c7-95bc-f30befda5d65-config-data\") pod \"nova-cell0-conductor-db-sync-r2t5s\" (UID: \"6268fe92-5c93-43c7-95bc-f30befda5d65\") " pod="openstack/nova-cell0-conductor-db-sync-r2t5s" Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.149436 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-r2t5s"] Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.208588 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9df8757bb-rzb52" Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.208617 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9df8757bb-rzb52" event={"ID":"68498f16-b5c3-4960-8565-7ae628fc3122","Type":"ContainerDied","Data":"ca48397e5444728848156fabb2c1b9060ca19d57a1c1905996ce53cd9a54fc09"} Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.208698 4816 scope.go:117] "RemoveContainer" containerID="3a4b8f5199cb2db96176f7d26ac1288036fcf9dd3deb012c7c6cb2bd6febc6c2" Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.210744 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49","Type":"ContainerStarted","Data":"bab09cd01583eebdccfb229a37532b6f0674000f5d1606f07d42c8adaa348948"} Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.251586 4816 scope.go:117] "RemoveContainer" containerID="385c6a6a7483bf3ffb2a31553a973012c1161303ce29917595a5f314788786f7" Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.252284 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6268fe92-5c93-43c7-95bc-f30befda5d65-config-data\") pod \"nova-cell0-conductor-db-sync-r2t5s\" (UID: \"6268fe92-5c93-43c7-95bc-f30befda5d65\") " pod="openstack/nova-cell0-conductor-db-sync-r2t5s" Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.252504 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6268fe92-5c93-43c7-95bc-f30befda5d65-scripts\") pod \"nova-cell0-conductor-db-sync-r2t5s\" (UID: \"6268fe92-5c93-43c7-95bc-f30befda5d65\") " pod="openstack/nova-cell0-conductor-db-sync-r2t5s" Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.252659 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpt2w\" (UniqueName: \"kubernetes.io/projected/6268fe92-5c93-43c7-95bc-f30befda5d65-kube-api-access-zpt2w\") pod \"nova-cell0-conductor-db-sync-r2t5s\" (UID: \"6268fe92-5c93-43c7-95bc-f30befda5d65\") " pod="openstack/nova-cell0-conductor-db-sync-r2t5s" Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.252912 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6268fe92-5c93-43c7-95bc-f30befda5d65-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-r2t5s\" (UID: \"6268fe92-5c93-43c7-95bc-f30befda5d65\") " pod="openstack/nova-cell0-conductor-db-sync-r2t5s" Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.259637 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-9df8757bb-rzb52"] Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.263038 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6268fe92-5c93-43c7-95bc-f30befda5d65-scripts\") pod \"nova-cell0-conductor-db-sync-r2t5s\" (UID: \"6268fe92-5c93-43c7-95bc-f30befda5d65\") " pod="openstack/nova-cell0-conductor-db-sync-r2t5s" Mar 11 12:19:55 crc kubenswrapper[4816]: 
I0311 12:19:55.263038 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6268fe92-5c93-43c7-95bc-f30befda5d65-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-r2t5s\" (UID: \"6268fe92-5c93-43c7-95bc-f30befda5d65\") " pod="openstack/nova-cell0-conductor-db-sync-r2t5s" Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.265627 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6268fe92-5c93-43c7-95bc-f30befda5d65-config-data\") pod \"nova-cell0-conductor-db-sync-r2t5s\" (UID: \"6268fe92-5c93-43c7-95bc-f30befda5d65\") " pod="openstack/nova-cell0-conductor-db-sync-r2t5s" Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.273602 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpt2w\" (UniqueName: \"kubernetes.io/projected/6268fe92-5c93-43c7-95bc-f30befda5d65-kube-api-access-zpt2w\") pod \"nova-cell0-conductor-db-sync-r2t5s\" (UID: \"6268fe92-5c93-43c7-95bc-f30befda5d65\") " pod="openstack/nova-cell0-conductor-db-sync-r2t5s" Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.293330 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-9df8757bb-rzb52"] Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.435132 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-r2t5s" Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.552101 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 11 12:19:56 crc kubenswrapper[4816]: W0311 12:19:56.055458 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6268fe92_5c93_43c7_95bc_f30befda5d65.slice/crio-1939b4a789439a60faf8174002db0eb6692620810e7f2821d03a1a8fa9509b1e WatchSource:0}: Error finding container 1939b4a789439a60faf8174002db0eb6692620810e7f2821d03a1a8fa9509b1e: Status 404 returned error can't find the container with id 1939b4a789439a60faf8174002db0eb6692620810e7f2821d03a1a8fa9509b1e Mar 11 12:19:56 crc kubenswrapper[4816]: I0311 12:19:56.057872 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-r2t5s"] Mar 11 12:19:56 crc kubenswrapper[4816]: I0311 12:19:56.142512 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68498f16-b5c3-4960-8565-7ae628fc3122" path="/var/lib/kubelet/pods/68498f16-b5c3-4960-8565-7ae628fc3122/volumes" Mar 11 12:19:56 crc kubenswrapper[4816]: I0311 12:19:56.225133 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49","Type":"ContainerStarted","Data":"29fe9c7ac5f65d3d19f417a0d611bc3a79cf763f7ef21444af5a797e56f3f63f"} Mar 11 12:19:56 crc kubenswrapper[4816]: I0311 12:19:56.226931 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-r2t5s" event={"ID":"6268fe92-5c93-43c7-95bc-f30befda5d65","Type":"ContainerStarted","Data":"1939b4a789439a60faf8174002db0eb6692620810e7f2821d03a1a8fa9509b1e"} Mar 11 12:19:58 crc kubenswrapper[4816]: I0311 12:19:58.119373 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:19:58 crc 
kubenswrapper[4816]: I0311 12:19:58.254678 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49","Type":"ContainerStarted","Data":"caf5f5963ed5f620c8712cea969e0fcf607060ab3626bd9f71ed7c1f2fef14cd"} Mar 11 12:19:58 crc kubenswrapper[4816]: I0311 12:19:58.254749 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49","Type":"ContainerStarted","Data":"09ea84d41bea9be23219e7a701b78afcc81f9e1c777303de3089f54128f5a641"} Mar 11 12:20:00 crc kubenswrapper[4816]: I0311 12:20:00.143872 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553860-9kp4n"] Mar 11 12:20:00 crc kubenswrapper[4816]: I0311 12:20:00.146083 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553860-9kp4n" Mar 11 12:20:00 crc kubenswrapper[4816]: I0311 12:20:00.149462 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 12:20:00 crc kubenswrapper[4816]: I0311 12:20:00.149786 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 12:20:00 crc kubenswrapper[4816]: I0311 12:20:00.150018 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 12:20:00 crc kubenswrapper[4816]: I0311 12:20:00.152057 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553860-9kp4n"] Mar 11 12:20:00 crc kubenswrapper[4816]: I0311 12:20:00.295151 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdcqf\" (UniqueName: \"kubernetes.io/projected/4cde56f2-9503-4b1c-84bf-8b49f6a9f5e9-kube-api-access-wdcqf\") pod \"auto-csr-approver-29553860-9kp4n\" (UID: 
\"4cde56f2-9503-4b1c-84bf-8b49f6a9f5e9\") " pod="openshift-infra/auto-csr-approver-29553860-9kp4n" Mar 11 12:20:00 crc kubenswrapper[4816]: I0311 12:20:00.397987 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdcqf\" (UniqueName: \"kubernetes.io/projected/4cde56f2-9503-4b1c-84bf-8b49f6a9f5e9-kube-api-access-wdcqf\") pod \"auto-csr-approver-29553860-9kp4n\" (UID: \"4cde56f2-9503-4b1c-84bf-8b49f6a9f5e9\") " pod="openshift-infra/auto-csr-approver-29553860-9kp4n" Mar 11 12:20:00 crc kubenswrapper[4816]: I0311 12:20:00.430574 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdcqf\" (UniqueName: \"kubernetes.io/projected/4cde56f2-9503-4b1c-84bf-8b49f6a9f5e9-kube-api-access-wdcqf\") pod \"auto-csr-approver-29553860-9kp4n\" (UID: \"4cde56f2-9503-4b1c-84bf-8b49f6a9f5e9\") " pod="openshift-infra/auto-csr-approver-29553860-9kp4n" Mar 11 12:20:00 crc kubenswrapper[4816]: I0311 12:20:00.467220 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553860-9kp4n" Mar 11 12:20:02 crc kubenswrapper[4816]: I0311 12:20:02.629683 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 12:20:02 crc kubenswrapper[4816]: I0311 12:20:02.630571 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="439b686e-927d-425a-a218-807220ae1e95" containerName="glance-log" containerID="cri-o://4b5cec87927ba388b30feb742e4d193b529502bf6a8355ed2d02b5d41c560b67" gracePeriod=30 Mar 11 12:20:02 crc kubenswrapper[4816]: I0311 12:20:02.631281 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="439b686e-927d-425a-a218-807220ae1e95" containerName="glance-httpd" containerID="cri-o://ecaa5276e0e1e71d262bf64a26711871fc4a429158857be7af8e6465f4bd05ea" gracePeriod=30 Mar 11 12:20:03 crc kubenswrapper[4816]: I0311 12:20:03.325998 4816 generic.go:334] "Generic (PLEG): container finished" podID="439b686e-927d-425a-a218-807220ae1e95" containerID="4b5cec87927ba388b30feb742e4d193b529502bf6a8355ed2d02b5d41c560b67" exitCode=143 Mar 11 12:20:03 crc kubenswrapper[4816]: I0311 12:20:03.326073 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"439b686e-927d-425a-a218-807220ae1e95","Type":"ContainerDied","Data":"4b5cec87927ba388b30feb742e4d193b529502bf6a8355ed2d02b5d41c560b67"} Mar 11 12:20:03 crc kubenswrapper[4816]: I0311 12:20:03.557107 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 12:20:03 crc kubenswrapper[4816]: I0311 12:20:03.558590 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a9d3606c-b28d-4028-93fc-535afa127cd6" containerName="glance-httpd" 
containerID="cri-o://44fdfe2aaa7bb00189c2e1708c4de4cb552c7330addf05e8997e655317268e15" gracePeriod=30 Mar 11 12:20:03 crc kubenswrapper[4816]: I0311 12:20:03.559069 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a9d3606c-b28d-4028-93fc-535afa127cd6" containerName="glance-log" containerID="cri-o://d9c605fc0632b2e2c60468a879a920939f818109ba64880a57f9f9b475f1614b" gracePeriod=30 Mar 11 12:20:03 crc kubenswrapper[4816]: E0311 12:20:03.781421 4816 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9d3606c_b28d_4028_93fc_535afa127cd6.slice/crio-conmon-d9c605fc0632b2e2c60468a879a920939f818109ba64880a57f9f9b475f1614b.scope\": RecentStats: unable to find data in memory cache]" Mar 11 12:20:04 crc kubenswrapper[4816]: I0311 12:20:04.352508 4816 generic.go:334] "Generic (PLEG): container finished" podID="a9d3606c-b28d-4028-93fc-535afa127cd6" containerID="d9c605fc0632b2e2c60468a879a920939f818109ba64880a57f9f9b475f1614b" exitCode=143 Mar 11 12:20:04 crc kubenswrapper[4816]: I0311 12:20:04.352651 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a9d3606c-b28d-4028-93fc-535afa127cd6","Type":"ContainerDied","Data":"d9c605fc0632b2e2c60468a879a920939f818109ba64880a57f9f9b475f1614b"} Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.382902 4816 generic.go:334] "Generic (PLEG): container finished" podID="439b686e-927d-425a-a218-807220ae1e95" containerID="ecaa5276e0e1e71d262bf64a26711871fc4a429158857be7af8e6465f4bd05ea" exitCode=0 Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.383053 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"439b686e-927d-425a-a218-807220ae1e95","Type":"ContainerDied","Data":"ecaa5276e0e1e71d262bf64a26711871fc4a429158857be7af8e6465f4bd05ea"} Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.633083 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.725374 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553860-9kp4n"] Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.753181 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/439b686e-927d-425a-a218-807220ae1e95-httpd-run\") pod \"439b686e-927d-425a-a218-807220ae1e95\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.753234 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"439b686e-927d-425a-a218-807220ae1e95\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.753289 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/439b686e-927d-425a-a218-807220ae1e95-config-data\") pod \"439b686e-927d-425a-a218-807220ae1e95\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.753384 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlzfp\" (UniqueName: \"kubernetes.io/projected/439b686e-927d-425a-a218-807220ae1e95-kube-api-access-mlzfp\") pod \"439b686e-927d-425a-a218-807220ae1e95\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.753407 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/439b686e-927d-425a-a218-807220ae1e95-scripts\") pod \"439b686e-927d-425a-a218-807220ae1e95\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.753594 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/439b686e-927d-425a-a218-807220ae1e95-public-tls-certs\") pod \"439b686e-927d-425a-a218-807220ae1e95\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.753622 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/439b686e-927d-425a-a218-807220ae1e95-logs\") pod \"439b686e-927d-425a-a218-807220ae1e95\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.753695 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/439b686e-927d-425a-a218-807220ae1e95-combined-ca-bundle\") pod \"439b686e-927d-425a-a218-807220ae1e95\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.754573 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/439b686e-927d-425a-a218-807220ae1e95-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "439b686e-927d-425a-a218-807220ae1e95" (UID: "439b686e-927d-425a-a218-807220ae1e95"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.756635 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/439b686e-927d-425a-a218-807220ae1e95-logs" (OuterVolumeSpecName: "logs") pod "439b686e-927d-425a-a218-807220ae1e95" (UID: "439b686e-927d-425a-a218-807220ae1e95"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.770129 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/439b686e-927d-425a-a218-807220ae1e95-scripts" (OuterVolumeSpecName: "scripts") pod "439b686e-927d-425a-a218-807220ae1e95" (UID: "439b686e-927d-425a-a218-807220ae1e95"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.772719 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/439b686e-927d-425a-a218-807220ae1e95-kube-api-access-mlzfp" (OuterVolumeSpecName: "kube-api-access-mlzfp") pod "439b686e-927d-425a-a218-807220ae1e95" (UID: "439b686e-927d-425a-a218-807220ae1e95"). InnerVolumeSpecName "kube-api-access-mlzfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.778845 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "439b686e-927d-425a-a218-807220ae1e95" (UID: "439b686e-927d-425a-a218-807220ae1e95"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.798483 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/439b686e-927d-425a-a218-807220ae1e95-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "439b686e-927d-425a-a218-807220ae1e95" (UID: "439b686e-927d-425a-a218-807220ae1e95"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.856972 4816 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/439b686e-927d-425a-a218-807220ae1e95-logs\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.857033 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/439b686e-927d-425a-a218-807220ae1e95-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.857054 4816 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/439b686e-927d-425a-a218-807220ae1e95-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.857100 4816 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.857114 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlzfp\" (UniqueName: \"kubernetes.io/projected/439b686e-927d-425a-a218-807220ae1e95-kube-api-access-mlzfp\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.857134 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/439b686e-927d-425a-a218-807220ae1e95-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.868269 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/439b686e-927d-425a-a218-807220ae1e95-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "439b686e-927d-425a-a218-807220ae1e95" (UID: "439b686e-927d-425a-a218-807220ae1e95"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.891693 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/439b686e-927d-425a-a218-807220ae1e95-config-data" (OuterVolumeSpecName: "config-data") pod "439b686e-927d-425a-a218-807220ae1e95" (UID: "439b686e-927d-425a-a218-807220ae1e95"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.903537 4816 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.960324 4816 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.960370 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/439b686e-927d-425a-a218-807220ae1e95-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.960385 4816 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/439b686e-927d-425a-a218-807220ae1e95-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:07 crc 
kubenswrapper[4816]: I0311 12:20:07.265382 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.368661 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a9d3606c-b28d-4028-93fc-535afa127cd6-httpd-run\") pod \"a9d3606c-b28d-4028-93fc-535afa127cd6\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.368771 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9d3606c-b28d-4028-93fc-535afa127cd6-logs\") pod \"a9d3606c-b28d-4028-93fc-535afa127cd6\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.368850 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9d3606c-b28d-4028-93fc-535afa127cd6-scripts\") pod \"a9d3606c-b28d-4028-93fc-535afa127cd6\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.368880 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9d3606c-b28d-4028-93fc-535afa127cd6-internal-tls-certs\") pod \"a9d3606c-b28d-4028-93fc-535afa127cd6\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.368938 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"a9d3606c-b28d-4028-93fc-535afa127cd6\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.369023 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9d3606c-b28d-4028-93fc-535afa127cd6-config-data\") pod \"a9d3606c-b28d-4028-93fc-535afa127cd6\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.369112 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9d3606c-b28d-4028-93fc-535afa127cd6-combined-ca-bundle\") pod \"a9d3606c-b28d-4028-93fc-535afa127cd6\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.369150 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tthz\" (UniqueName: \"kubernetes.io/projected/a9d3606c-b28d-4028-93fc-535afa127cd6-kube-api-access-2tthz\") pod \"a9d3606c-b28d-4028-93fc-535afa127cd6\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.369375 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9d3606c-b28d-4028-93fc-535afa127cd6-logs" (OuterVolumeSpecName: "logs") pod "a9d3606c-b28d-4028-93fc-535afa127cd6" (UID: "a9d3606c-b28d-4028-93fc-535afa127cd6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.369425 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9d3606c-b28d-4028-93fc-535afa127cd6-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a9d3606c-b28d-4028-93fc-535afa127cd6" (UID: "a9d3606c-b28d-4028-93fc-535afa127cd6"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.370101 4816 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a9d3606c-b28d-4028-93fc-535afa127cd6-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.370128 4816 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9d3606c-b28d-4028-93fc-535afa127cd6-logs\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.378131 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9d3606c-b28d-4028-93fc-535afa127cd6-kube-api-access-2tthz" (OuterVolumeSpecName: "kube-api-access-2tthz") pod "a9d3606c-b28d-4028-93fc-535afa127cd6" (UID: "a9d3606c-b28d-4028-93fc-535afa127cd6"). InnerVolumeSpecName "kube-api-access-2tthz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.383840 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "a9d3606c-b28d-4028-93fc-535afa127cd6" (UID: "a9d3606c-b28d-4028-93fc-535afa127cd6"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.398739 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9d3606c-b28d-4028-93fc-535afa127cd6-scripts" (OuterVolumeSpecName: "scripts") pod "a9d3606c-b28d-4028-93fc-535afa127cd6" (UID: "a9d3606c-b28d-4028-93fc-535afa127cd6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.408226 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49","Type":"ContainerStarted","Data":"2482598d21f5c9ee2cffe3291bcb032276779844be127265c434d4ee3a10dd01"} Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.408454 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" containerName="ceilometer-central-agent" containerID="cri-o://29fe9c7ac5f65d3d19f417a0d611bc3a79cf763f7ef21444af5a797e56f3f63f" gracePeriod=30 Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.408776 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.410392 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" containerName="proxy-httpd" containerID="cri-o://2482598d21f5c9ee2cffe3291bcb032276779844be127265c434d4ee3a10dd01" gracePeriod=30 Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.410427 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" containerName="ceilometer-notification-agent" containerID="cri-o://09ea84d41bea9be23219e7a701b78afcc81f9e1c777303de3089f54128f5a641" gracePeriod=30 Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.410370 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" containerName="sg-core" containerID="cri-o://caf5f5963ed5f620c8712cea969e0fcf607060ab3626bd9f71ed7c1f2fef14cd" gracePeriod=30 Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.427408 4816 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.427744 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"439b686e-927d-425a-a218-807220ae1e95","Type":"ContainerDied","Data":"d6a32b27dcd7e08e03a755df62f2d58811a9c80acc32eb96770a7186a1ec069d"} Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.427804 4816 scope.go:117] "RemoveContainer" containerID="ecaa5276e0e1e71d262bf64a26711871fc4a429158857be7af8e6465f4bd05ea" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.429520 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9d3606c-b28d-4028-93fc-535afa127cd6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a9d3606c-b28d-4028-93fc-535afa127cd6" (UID: "a9d3606c-b28d-4028-93fc-535afa127cd6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.441927 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553860-9kp4n" event={"ID":"4cde56f2-9503-4b1c-84bf-8b49f6a9f5e9","Type":"ContainerStarted","Data":"945268d86ba621c8fa8980dff5e43070ffe204d163fdf1c51439bce4ca2b4338"} Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.448417 4816 generic.go:334] "Generic (PLEG): container finished" podID="a9d3606c-b28d-4028-93fc-535afa127cd6" containerID="44fdfe2aaa7bb00189c2e1708c4de4cb552c7330addf05e8997e655317268e15" exitCode=0 Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.448500 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a9d3606c-b28d-4028-93fc-535afa127cd6","Type":"ContainerDied","Data":"44fdfe2aaa7bb00189c2e1708c4de4cb552c7330addf05e8997e655317268e15"} Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 
12:20:07.448540 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a9d3606c-b28d-4028-93fc-535afa127cd6","Type":"ContainerDied","Data":"ccc820b0417c4ece231f5070aebe453d4f5f6552e1d188623620714789da98ed"} Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.448607 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.455437 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.062832305 podStartE2EDuration="14.455415144s" podCreationTimestamp="2026-03-11 12:19:53 +0000 UTC" firstStartedPulling="2026-03-11 12:19:54.269261435 +0000 UTC m=+1280.860525402" lastFinishedPulling="2026-03-11 12:20:05.661844274 +0000 UTC m=+1292.253108241" observedRunningTime="2026-03-11 12:20:07.45491743 +0000 UTC m=+1294.046181417" watchObservedRunningTime="2026-03-11 12:20:07.455415144 +0000 UTC m=+1294.046679111" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.456481 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9d3606c-b28d-4028-93fc-535afa127cd6-config-data" (OuterVolumeSpecName: "config-data") pod "a9d3606c-b28d-4028-93fc-535afa127cd6" (UID: "a9d3606c-b28d-4028-93fc-535afa127cd6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.461853 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-r2t5s" event={"ID":"6268fe92-5c93-43c7-95bc-f30befda5d65","Type":"ContainerStarted","Data":"f424c56cb69a088a064cc5d2e599b7db758b5e10ecf876c1586a8816a4d96acd"} Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.479533 4816 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.479584 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9d3606c-b28d-4028-93fc-535afa127cd6-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.479598 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9d3606c-b28d-4028-93fc-535afa127cd6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.479613 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tthz\" (UniqueName: \"kubernetes.io/projected/a9d3606c-b28d-4028-93fc-535afa127cd6-kube-api-access-2tthz\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.479625 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9d3606c-b28d-4028-93fc-535afa127cd6-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.498866 4816 scope.go:117] "RemoveContainer" containerID="4b5cec87927ba388b30feb742e4d193b529502bf6a8355ed2d02b5d41c560b67" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.504230 4816 operation_generator.go:917] UnmountDevice succeeded for 
volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.509985 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.522109 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9d3606c-b28d-4028-93fc-535afa127cd6-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a9d3606c-b28d-4028-93fc-535afa127cd6" (UID: "a9d3606c-b28d-4028-93fc-535afa127cd6"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.527675 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.543071 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 12:20:07 crc kubenswrapper[4816]: E0311 12:20:07.543575 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="439b686e-927d-425a-a218-807220ae1e95" containerName="glance-log" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.543594 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="439b686e-927d-425a-a218-807220ae1e95" containerName="glance-log" Mar 11 12:20:07 crc kubenswrapper[4816]: E0311 12:20:07.543611 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="439b686e-927d-425a-a218-807220ae1e95" containerName="glance-httpd" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.543619 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="439b686e-927d-425a-a218-807220ae1e95" containerName="glance-httpd" Mar 11 12:20:07 crc kubenswrapper[4816]: E0311 12:20:07.543643 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9d3606c-b28d-4028-93fc-535afa127cd6" 
containerName="glance-log" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.543650 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9d3606c-b28d-4028-93fc-535afa127cd6" containerName="glance-log" Mar 11 12:20:07 crc kubenswrapper[4816]: E0311 12:20:07.543669 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9d3606c-b28d-4028-93fc-535afa127cd6" containerName="glance-httpd" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.543676 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9d3606c-b28d-4028-93fc-535afa127cd6" containerName="glance-httpd" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.543847 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="439b686e-927d-425a-a218-807220ae1e95" containerName="glance-httpd" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.543865 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="439b686e-927d-425a-a218-807220ae1e95" containerName="glance-log" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.543874 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9d3606c-b28d-4028-93fc-535afa127cd6" containerName="glance-log" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.543881 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9d3606c-b28d-4028-93fc-535afa127cd6" containerName="glance-httpd" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.544907 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.548589 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.574307 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.579485 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-r2t5s" podStartSLOduration=2.38695866 podStartE2EDuration="12.579454028s" podCreationTimestamp="2026-03-11 12:19:55 +0000 UTC" firstStartedPulling="2026-03-11 12:19:56.062334 +0000 UTC m=+1282.653597967" lastFinishedPulling="2026-03-11 12:20:06.254829368 +0000 UTC m=+1292.846093335" observedRunningTime="2026-03-11 12:20:07.513701079 +0000 UTC m=+1294.104965046" watchObservedRunningTime="2026-03-11 12:20:07.579454028 +0000 UTC m=+1294.170717995" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.582215 4816 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9d3606c-b28d-4028-93fc-535afa127cd6-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.582242 4816 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.600339 4816 scope.go:117] "RemoveContainer" containerID="44fdfe2aaa7bb00189c2e1708c4de4cb552c7330addf05e8997e655317268e15" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.600436 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.627874 4816 scope.go:117] 
"RemoveContainer" containerID="d9c605fc0632b2e2c60468a879a920939f818109ba64880a57f9f9b475f1614b" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.647104 4816 scope.go:117] "RemoveContainer" containerID="44fdfe2aaa7bb00189c2e1708c4de4cb552c7330addf05e8997e655317268e15" Mar 11 12:20:07 crc kubenswrapper[4816]: E0311 12:20:07.647742 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44fdfe2aaa7bb00189c2e1708c4de4cb552c7330addf05e8997e655317268e15\": container with ID starting with 44fdfe2aaa7bb00189c2e1708c4de4cb552c7330addf05e8997e655317268e15 not found: ID does not exist" containerID="44fdfe2aaa7bb00189c2e1708c4de4cb552c7330addf05e8997e655317268e15" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.647777 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44fdfe2aaa7bb00189c2e1708c4de4cb552c7330addf05e8997e655317268e15"} err="failed to get container status \"44fdfe2aaa7bb00189c2e1708c4de4cb552c7330addf05e8997e655317268e15\": rpc error: code = NotFound desc = could not find container \"44fdfe2aaa7bb00189c2e1708c4de4cb552c7330addf05e8997e655317268e15\": container with ID starting with 44fdfe2aaa7bb00189c2e1708c4de4cb552c7330addf05e8997e655317268e15 not found: ID does not exist" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.647803 4816 scope.go:117] "RemoveContainer" containerID="d9c605fc0632b2e2c60468a879a920939f818109ba64880a57f9f9b475f1614b" Mar 11 12:20:07 crc kubenswrapper[4816]: E0311 12:20:07.648837 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9c605fc0632b2e2c60468a879a920939f818109ba64880a57f9f9b475f1614b\": container with ID starting with d9c605fc0632b2e2c60468a879a920939f818109ba64880a57f9f9b475f1614b not found: ID does not exist" containerID="d9c605fc0632b2e2c60468a879a920939f818109ba64880a57f9f9b475f1614b" Mar 11 12:20:07 crc 
kubenswrapper[4816]: I0311 12:20:07.648868 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9c605fc0632b2e2c60468a879a920939f818109ba64880a57f9f9b475f1614b"} err="failed to get container status \"d9c605fc0632b2e2c60468a879a920939f818109ba64880a57f9f9b475f1614b\": rpc error: code = NotFound desc = could not find container \"d9c605fc0632b2e2c60468a879a920939f818109ba64880a57f9f9b475f1614b\": container with ID starting with d9c605fc0632b2e2c60468a879a920939f818109ba64880a57f9f9b475f1614b not found: ID does not exist" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.684402 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7457f2db-7979-4d92-bd90-a1464b8a3878-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.684510 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7457f2db-7979-4d92-bd90-a1464b8a3878-config-data\") pod \"glance-default-external-api-0\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.684540 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7457f2db-7979-4d92-bd90-a1464b8a3878-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.684603 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/7457f2db-7979-4d92-bd90-a1464b8a3878-logs\") pod \"glance-default-external-api-0\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.684623 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr9hs\" (UniqueName: \"kubernetes.io/projected/7457f2db-7979-4d92-bd90-a1464b8a3878-kube-api-access-rr9hs\") pod \"glance-default-external-api-0\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.684647 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7457f2db-7979-4d92-bd90-a1464b8a3878-scripts\") pod \"glance-default-external-api-0\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.684664 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7457f2db-7979-4d92-bd90-a1464b8a3878-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.684695 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.790638 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7457f2db-7979-4d92-bd90-a1464b8a3878-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.790760 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7457f2db-7979-4d92-bd90-a1464b8a3878-logs\") pod \"glance-default-external-api-0\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.790791 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr9hs\" (UniqueName: \"kubernetes.io/projected/7457f2db-7979-4d92-bd90-a1464b8a3878-kube-api-access-rr9hs\") pod \"glance-default-external-api-0\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.790826 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7457f2db-7979-4d92-bd90-a1464b8a3878-scripts\") pod \"glance-default-external-api-0\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.790848 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7457f2db-7979-4d92-bd90-a1464b8a3878-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.790881 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod 
\"glance-default-external-api-0\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.790920 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7457f2db-7979-4d92-bd90-a1464b8a3878-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.790971 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7457f2db-7979-4d92-bd90-a1464b8a3878-config-data\") pod \"glance-default-external-api-0\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.793512 4816 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.793819 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7457f2db-7979-4d92-bd90-a1464b8a3878-logs\") pod \"glance-default-external-api-0\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.794016 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7457f2db-7979-4d92-bd90-a1464b8a3878-httpd-run\") pod \"glance-default-external-api-0\" (UID: 
\"7457f2db-7979-4d92-bd90-a1464b8a3878\") " pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.800347 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7457f2db-7979-4d92-bd90-a1464b8a3878-config-data\") pod \"glance-default-external-api-0\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.812587 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7457f2db-7979-4d92-bd90-a1464b8a3878-scripts\") pod \"glance-default-external-api-0\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.813442 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7457f2db-7979-4d92-bd90-a1464b8a3878-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.831082 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7457f2db-7979-4d92-bd90-a1464b8a3878-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.834394 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr9hs\" (UniqueName: \"kubernetes.io/projected/7457f2db-7979-4d92-bd90-a1464b8a3878-kube-api-access-rr9hs\") pod \"glance-default-external-api-0\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " pod="openstack/glance-default-external-api-0" 
Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.840306 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.849851 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.854622 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.864434 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.868994 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.874891 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.875348 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.875768 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.885887 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.997794 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.997871 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e95ddca0-76d0-4dce-9983-4b07655adc25-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.997924 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e95ddca0-76d0-4dce-9983-4b07655adc25-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.997988 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e95ddca0-76d0-4dce-9983-4b07655adc25-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.998010 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8b85\" (UniqueName: \"kubernetes.io/projected/e95ddca0-76d0-4dce-9983-4b07655adc25-kube-api-access-c8b85\") pod 
\"glance-default-internal-api-0\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.998052 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e95ddca0-76d0-4dce-9983-4b07655adc25-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.998340 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e95ddca0-76d0-4dce-9983-4b07655adc25-logs\") pod \"glance-default-internal-api-0\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.998403 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e95ddca0-76d0-4dce-9983-4b07655adc25-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.100660 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8b85\" (UniqueName: \"kubernetes.io/projected/e95ddca0-76d0-4dce-9983-4b07655adc25-kube-api-access-c8b85\") pod \"glance-default-internal-api-0\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.100713 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e95ddca0-76d0-4dce-9983-4b07655adc25-scripts\") pod 
\"glance-default-internal-api-0\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.100755 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e95ddca0-76d0-4dce-9983-4b07655adc25-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.100822 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e95ddca0-76d0-4dce-9983-4b07655adc25-logs\") pod \"glance-default-internal-api-0\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.100851 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e95ddca0-76d0-4dce-9983-4b07655adc25-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.100892 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.100927 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e95ddca0-76d0-4dce-9983-4b07655adc25-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " 
pod="openstack/glance-default-internal-api-0" Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.100957 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e95ddca0-76d0-4dce-9983-4b07655adc25-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.102225 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e95ddca0-76d0-4dce-9983-4b07655adc25-logs\") pod \"glance-default-internal-api-0\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.102913 4816 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0" Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.104590 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e95ddca0-76d0-4dce-9983-4b07655adc25-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.118845 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e95ddca0-76d0-4dce-9983-4b07655adc25-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:20:08 crc 
kubenswrapper[4816]: I0311 12:20:08.119003 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e95ddca0-76d0-4dce-9983-4b07655adc25-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.119947 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e95ddca0-76d0-4dce-9983-4b07655adc25-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.120231 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e95ddca0-76d0-4dce-9983-4b07655adc25-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.128268 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8b85\" (UniqueName: \"kubernetes.io/projected/e95ddca0-76d0-4dce-9983-4b07655adc25-kube-api-access-c8b85\") pod \"glance-default-internal-api-0\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.151842 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="439b686e-927d-425a-a218-807220ae1e95" path="/var/lib/kubelet/pods/439b686e-927d-425a-a218-807220ae1e95/volumes" Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.152559 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.155549 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9d3606c-b28d-4028-93fc-535afa127cd6" path="/var/lib/kubelet/pods/a9d3606c-b28d-4028-93fc-535afa127cd6/volumes" Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.324960 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.546522 4816 generic.go:334] "Generic (PLEG): container finished" podID="ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" containerID="2482598d21f5c9ee2cffe3291bcb032276779844be127265c434d4ee3a10dd01" exitCode=0 Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.546557 4816 generic.go:334] "Generic (PLEG): container finished" podID="ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" containerID="caf5f5963ed5f620c8712cea969e0fcf607060ab3626bd9f71ed7c1f2fef14cd" exitCode=2 Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.546564 4816 generic.go:334] "Generic (PLEG): container finished" podID="ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" containerID="09ea84d41bea9be23219e7a701b78afcc81f9e1c777303de3089f54128f5a641" exitCode=0 Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.546573 4816 generic.go:334] "Generic (PLEG): container finished" podID="ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" containerID="29fe9c7ac5f65d3d19f417a0d611bc3a79cf763f7ef21444af5a797e56f3f63f" exitCode=0 Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.546615 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49","Type":"ContainerDied","Data":"2482598d21f5c9ee2cffe3291bcb032276779844be127265c434d4ee3a10dd01"} Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.546649 4816 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/ceilometer-0" event={"ID":"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49","Type":"ContainerDied","Data":"caf5f5963ed5f620c8712cea969e0fcf607060ab3626bd9f71ed7c1f2fef14cd"} Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.546661 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49","Type":"ContainerDied","Data":"09ea84d41bea9be23219e7a701b78afcc81f9e1c777303de3089f54128f5a641"} Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.546670 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49","Type":"ContainerDied","Data":"29fe9c7ac5f65d3d19f417a0d611bc3a79cf763f7ef21444af5a797e56f3f63f"} Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.736058 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 12:20:08 crc kubenswrapper[4816]: W0311 12:20:08.741415 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7457f2db_7979_4d92_bd90_a1464b8a3878.slice/crio-722d37999c6fc7f3ffe4d8bb991503dcb67968fd32e0b13507be34c65c4fb635 WatchSource:0}: Error finding container 722d37999c6fc7f3ffe4d8bb991503dcb67968fd32e0b13507be34c65c4fb635: Status 404 returned error can't find the container with id 722d37999c6fc7f3ffe4d8bb991503dcb67968fd32e0b13507be34c65c4fb635 Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.902371 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.044739 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-combined-ca-bundle\") pod \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.044839 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wc2gb\" (UniqueName: \"kubernetes.io/projected/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-kube-api-access-wc2gb\") pod \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.044900 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-log-httpd\") pod \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.045035 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-run-httpd\") pod \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.045073 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-config-data\") pod \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.045091 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-sg-core-conf-yaml\") pod \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.045119 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-scripts\") pod \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.045732 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" (UID: "ebe67fba-9b24-4bdf-bcb9-d06e979d1e49"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.046559 4816 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.052641 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" (UID: "ebe67fba-9b24-4bdf-bcb9-d06e979d1e49"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.074917 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-scripts" (OuterVolumeSpecName: "scripts") pod "ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" (UID: "ebe67fba-9b24-4bdf-bcb9-d06e979d1e49"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.098060 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" (UID: "ebe67fba-9b24-4bdf-bcb9-d06e979d1e49"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.137501 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-kube-api-access-wc2gb" (OuterVolumeSpecName: "kube-api-access-wc2gb") pod "ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" (UID: "ebe67fba-9b24-4bdf-bcb9-d06e979d1e49"). InnerVolumeSpecName "kube-api-access-wc2gb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.148975 4816 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.149031 4816 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.149046 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.149057 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wc2gb\" (UniqueName: \"kubernetes.io/projected/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-kube-api-access-wc2gb\") on node 
\"crc\" DevicePath \"\"" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.170914 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.179184 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" (UID: "ebe67fba-9b24-4bdf-bcb9-d06e979d1e49"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:09 crc kubenswrapper[4816]: W0311 12:20:09.183505 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode95ddca0_76d0_4dce_9983_4b07655adc25.slice/crio-f13badcbc5010cfb4035a99958d3aebf412aeabedcc2f776bca112d761fa63de WatchSource:0}: Error finding container f13badcbc5010cfb4035a99958d3aebf412aeabedcc2f776bca112d761fa63de: Status 404 returned error can't find the container with id f13badcbc5010cfb4035a99958d3aebf412aeabedcc2f776bca112d761fa63de Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.240501 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-config-data" (OuterVolumeSpecName: "config-data") pod "ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" (UID: "ebe67fba-9b24-4bdf-bcb9-d06e979d1e49"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.251617 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.251825 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.581465 4816 generic.go:334] "Generic (PLEG): container finished" podID="4cde56f2-9503-4b1c-84bf-8b49f6a9f5e9" containerID="7074db26ba14c2f5793b32a499e15ff64a76fc4764f04041e3b7367e813d1eb6" exitCode=0 Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.581871 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553860-9kp4n" event={"ID":"4cde56f2-9503-4b1c-84bf-8b49f6a9f5e9","Type":"ContainerDied","Data":"7074db26ba14c2f5793b32a499e15ff64a76fc4764f04041e3b7367e813d1eb6"} Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.585574 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7457f2db-7979-4d92-bd90-a1464b8a3878","Type":"ContainerStarted","Data":"c020c8caff09b112c5e61167611361a425a1b4a92367fbbd7dbf97390e021cca"} Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.585604 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7457f2db-7979-4d92-bd90-a1464b8a3878","Type":"ContainerStarted","Data":"722d37999c6fc7f3ffe4d8bb991503dcb67968fd32e0b13507be34c65c4fb635"} Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.593509 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49","Type":"ContainerDied","Data":"bab09cd01583eebdccfb229a37532b6f0674000f5d1606f07d42c8adaa348948"} Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.593559 4816 scope.go:117] "RemoveContainer" containerID="2482598d21f5c9ee2cffe3291bcb032276779844be127265c434d4ee3a10dd01" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.593679 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.610164 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e95ddca0-76d0-4dce-9983-4b07655adc25","Type":"ContainerStarted","Data":"f13badcbc5010cfb4035a99958d3aebf412aeabedcc2f776bca112d761fa63de"} Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.640214 4816 scope.go:117] "RemoveContainer" containerID="caf5f5963ed5f620c8712cea969e0fcf607060ab3626bd9f71ed7c1f2fef14cd" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.641978 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.652986 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.672200 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:20:09 crc kubenswrapper[4816]: E0311 12:20:09.672774 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" containerName="ceilometer-central-agent" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.672798 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" containerName="ceilometer-central-agent" Mar 11 12:20:09 crc kubenswrapper[4816]: E0311 12:20:09.672819 4816 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" containerName="ceilometer-notification-agent" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.672829 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" containerName="ceilometer-notification-agent" Mar 11 12:20:09 crc kubenswrapper[4816]: E0311 12:20:09.672848 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" containerName="proxy-httpd" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.672857 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" containerName="proxy-httpd" Mar 11 12:20:09 crc kubenswrapper[4816]: E0311 12:20:09.672877 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" containerName="sg-core" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.672883 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" containerName="sg-core" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.673058 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" containerName="sg-core" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.673081 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" containerName="ceilometer-notification-agent" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.673092 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" containerName="proxy-httpd" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.673104 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" containerName="ceilometer-central-agent" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.675486 4816 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.680497 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.680520 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.698673 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.766318 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6387790e-663e-4746-9e9f-250ac4a06535-config-data\") pod \"ceilometer-0\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " pod="openstack/ceilometer-0" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.766441 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6387790e-663e-4746-9e9f-250ac4a06535-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " pod="openstack/ceilometer-0" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.766550 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6387790e-663e-4746-9e9f-250ac4a06535-log-httpd\") pod \"ceilometer-0\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " pod="openstack/ceilometer-0" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.766610 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6387790e-663e-4746-9e9f-250ac4a06535-scripts\") pod \"ceilometer-0\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " 
pod="openstack/ceilometer-0" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.766665 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgmms\" (UniqueName: \"kubernetes.io/projected/6387790e-663e-4746-9e9f-250ac4a06535-kube-api-access-tgmms\") pod \"ceilometer-0\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " pod="openstack/ceilometer-0" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.766724 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6387790e-663e-4746-9e9f-250ac4a06535-run-httpd\") pod \"ceilometer-0\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " pod="openstack/ceilometer-0" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.766765 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6387790e-663e-4746-9e9f-250ac4a06535-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " pod="openstack/ceilometer-0" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.766957 4816 scope.go:117] "RemoveContainer" containerID="09ea84d41bea9be23219e7a701b78afcc81f9e1c777303de3089f54128f5a641" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.834552 4816 scope.go:117] "RemoveContainer" containerID="29fe9c7ac5f65d3d19f417a0d611bc3a79cf763f7ef21444af5a797e56f3f63f" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.874070 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6387790e-663e-4746-9e9f-250ac4a06535-config-data\") pod \"ceilometer-0\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " pod="openstack/ceilometer-0" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.874145 4816 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6387790e-663e-4746-9e9f-250ac4a06535-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " pod="openstack/ceilometer-0" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.874188 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6387790e-663e-4746-9e9f-250ac4a06535-log-httpd\") pod \"ceilometer-0\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " pod="openstack/ceilometer-0" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.874213 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6387790e-663e-4746-9e9f-250ac4a06535-scripts\") pod \"ceilometer-0\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " pod="openstack/ceilometer-0" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.874238 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgmms\" (UniqueName: \"kubernetes.io/projected/6387790e-663e-4746-9e9f-250ac4a06535-kube-api-access-tgmms\") pod \"ceilometer-0\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " pod="openstack/ceilometer-0" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.874290 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6387790e-663e-4746-9e9f-250ac4a06535-run-httpd\") pod \"ceilometer-0\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " pod="openstack/ceilometer-0" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.874311 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6387790e-663e-4746-9e9f-250ac4a06535-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " 
pod="openstack/ceilometer-0" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.875745 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6387790e-663e-4746-9e9f-250ac4a06535-log-httpd\") pod \"ceilometer-0\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " pod="openstack/ceilometer-0" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.876024 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6387790e-663e-4746-9e9f-250ac4a06535-run-httpd\") pod \"ceilometer-0\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " pod="openstack/ceilometer-0" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.883478 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6387790e-663e-4746-9e9f-250ac4a06535-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " pod="openstack/ceilometer-0" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.884813 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6387790e-663e-4746-9e9f-250ac4a06535-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " pod="openstack/ceilometer-0" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.887381 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6387790e-663e-4746-9e9f-250ac4a06535-config-data\") pod \"ceilometer-0\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " pod="openstack/ceilometer-0" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.889386 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6387790e-663e-4746-9e9f-250ac4a06535-scripts\") pod 
\"ceilometer-0\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " pod="openstack/ceilometer-0" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.896314 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgmms\" (UniqueName: \"kubernetes.io/projected/6387790e-663e-4746-9e9f-250ac4a06535-kube-api-access-tgmms\") pod \"ceilometer-0\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " pod="openstack/ceilometer-0" Mar 11 12:20:10 crc kubenswrapper[4816]: I0311 12:20:10.042367 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:20:10 crc kubenswrapper[4816]: I0311 12:20:10.141485 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" path="/var/lib/kubelet/pods/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49/volumes" Mar 11 12:20:10 crc kubenswrapper[4816]: I0311 12:20:10.599263 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:20:10 crc kubenswrapper[4816]: W0311 12:20:10.606090 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6387790e_663e_4746_9e9f_250ac4a06535.slice/crio-6118c1f938ca09df069e1d25ab77da2d96c4cf88ac9d852f756ce969121a9a79 WatchSource:0}: Error finding container 6118c1f938ca09df069e1d25ab77da2d96c4cf88ac9d852f756ce969121a9a79: Status 404 returned error can't find the container with id 6118c1f938ca09df069e1d25ab77da2d96c4cf88ac9d852f756ce969121a9a79 Mar 11 12:20:10 crc kubenswrapper[4816]: I0311 12:20:10.661431 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e95ddca0-76d0-4dce-9983-4b07655adc25","Type":"ContainerStarted","Data":"9dfd5d9de37a643541d7d99bf2ad8ffbb190d4d99b4400e1d3e559828813b764"} Mar 11 12:20:10 crc kubenswrapper[4816]: I0311 12:20:10.661486 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"e95ddca0-76d0-4dce-9983-4b07655adc25","Type":"ContainerStarted","Data":"c98a4983c1c555c8104fb916b00cb391571c199b1e9301413191c24c4a358d25"} Mar 11 12:20:10 crc kubenswrapper[4816]: I0311 12:20:10.667652 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6387790e-663e-4746-9e9f-250ac4a06535","Type":"ContainerStarted","Data":"6118c1f938ca09df069e1d25ab77da2d96c4cf88ac9d852f756ce969121a9a79"} Mar 11 12:20:10 crc kubenswrapper[4816]: I0311 12:20:10.671587 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7457f2db-7979-4d92-bd90-a1464b8a3878","Type":"ContainerStarted","Data":"8ba3c9d212f5a9f10887e454eabe42340558258c07c8285eb982b69803aa3749"} Mar 11 12:20:10 crc kubenswrapper[4816]: I0311 12:20:10.708082 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.708055894 podStartE2EDuration="3.708055894s" podCreationTimestamp="2026-03-11 12:20:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:20:10.684265575 +0000 UTC m=+1297.275529542" watchObservedRunningTime="2026-03-11 12:20:10.708055894 +0000 UTC m=+1297.299319861" Mar 11 12:20:10 crc kubenswrapper[4816]: I0311 12:20:10.719698 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.719677126 podStartE2EDuration="3.719677126s" podCreationTimestamp="2026-03-11 12:20:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:20:10.71457042 +0000 UTC m=+1297.305834397" watchObservedRunningTime="2026-03-11 12:20:10.719677126 +0000 UTC m=+1297.310941093" Mar 11 12:20:11 crc kubenswrapper[4816]: 
I0311 12:20:11.069504 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553860-9kp4n" Mar 11 12:20:11 crc kubenswrapper[4816]: I0311 12:20:11.205473 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdcqf\" (UniqueName: \"kubernetes.io/projected/4cde56f2-9503-4b1c-84bf-8b49f6a9f5e9-kube-api-access-wdcqf\") pod \"4cde56f2-9503-4b1c-84bf-8b49f6a9f5e9\" (UID: \"4cde56f2-9503-4b1c-84bf-8b49f6a9f5e9\") " Mar 11 12:20:11 crc kubenswrapper[4816]: I0311 12:20:11.223516 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cde56f2-9503-4b1c-84bf-8b49f6a9f5e9-kube-api-access-wdcqf" (OuterVolumeSpecName: "kube-api-access-wdcqf") pod "4cde56f2-9503-4b1c-84bf-8b49f6a9f5e9" (UID: "4cde56f2-9503-4b1c-84bf-8b49f6a9f5e9"). InnerVolumeSpecName "kube-api-access-wdcqf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:20:11 crc kubenswrapper[4816]: I0311 12:20:11.307712 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdcqf\" (UniqueName: \"kubernetes.io/projected/4cde56f2-9503-4b1c-84bf-8b49f6a9f5e9-kube-api-access-wdcqf\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:11 crc kubenswrapper[4816]: I0311 12:20:11.684827 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6387790e-663e-4746-9e9f-250ac4a06535","Type":"ContainerStarted","Data":"a566453a6ed7cd3c331201e19eeae628b7b7c5cb4a8edc9454509c8a7f44ce45"} Mar 11 12:20:11 crc kubenswrapper[4816]: I0311 12:20:11.688400 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553860-9kp4n" Mar 11 12:20:11 crc kubenswrapper[4816]: I0311 12:20:11.688412 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553860-9kp4n" event={"ID":"4cde56f2-9503-4b1c-84bf-8b49f6a9f5e9","Type":"ContainerDied","Data":"945268d86ba621c8fa8980dff5e43070ffe204d163fdf1c51439bce4ca2b4338"} Mar 11 12:20:11 crc kubenswrapper[4816]: I0311 12:20:11.688520 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="945268d86ba621c8fa8980dff5e43070ffe204d163fdf1c51439bce4ca2b4338" Mar 11 12:20:12 crc kubenswrapper[4816]: I0311 12:20:12.161892 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553854-hbf96"] Mar 11 12:20:12 crc kubenswrapper[4816]: I0311 12:20:12.173815 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553854-hbf96"] Mar 11 12:20:12 crc kubenswrapper[4816]: I0311 12:20:12.539213 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:20:12 crc kubenswrapper[4816]: I0311 12:20:12.700488 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6387790e-663e-4746-9e9f-250ac4a06535","Type":"ContainerStarted","Data":"1e66c2af1e1843e1cce30bde9523ec3123ba04a31ae667393c01f3ef14fbcdb6"} Mar 11 12:20:13 crc kubenswrapper[4816]: I0311 12:20:13.714701 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6387790e-663e-4746-9e9f-250ac4a06535","Type":"ContainerStarted","Data":"9ad1844ff4b6a10e2e7d0ff4017d82a630d3d5caeee3ee1c608a37d0e8df81ef"} Mar 11 12:20:14 crc kubenswrapper[4816]: I0311 12:20:14.147305 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af8a107b-6295-42d4-b64b-7841171f67f3" path="/var/lib/kubelet/pods/af8a107b-6295-42d4-b64b-7841171f67f3/volumes" Mar 11 12:20:15 crc 
kubenswrapper[4816]: I0311 12:20:15.745566 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6387790e-663e-4746-9e9f-250ac4a06535","Type":"ContainerStarted","Data":"50710f72c7fd036d5cf3077365b496702e4d218124f7e7515d8a364a715ce792"} Mar 11 12:20:15 crc kubenswrapper[4816]: I0311 12:20:15.746514 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 11 12:20:15 crc kubenswrapper[4816]: I0311 12:20:15.745909 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6387790e-663e-4746-9e9f-250ac4a06535" containerName="ceilometer-central-agent" containerID="cri-o://a566453a6ed7cd3c331201e19eeae628b7b7c5cb4a8edc9454509c8a7f44ce45" gracePeriod=30 Mar 11 12:20:15 crc kubenswrapper[4816]: I0311 12:20:15.746143 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6387790e-663e-4746-9e9f-250ac4a06535" containerName="proxy-httpd" containerID="cri-o://50710f72c7fd036d5cf3077365b496702e4d218124f7e7515d8a364a715ce792" gracePeriod=30 Mar 11 12:20:15 crc kubenswrapper[4816]: I0311 12:20:15.746119 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6387790e-663e-4746-9e9f-250ac4a06535" containerName="sg-core" containerID="cri-o://9ad1844ff4b6a10e2e7d0ff4017d82a630d3d5caeee3ee1c608a37d0e8df81ef" gracePeriod=30 Mar 11 12:20:15 crc kubenswrapper[4816]: I0311 12:20:15.746158 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6387790e-663e-4746-9e9f-250ac4a06535" containerName="ceilometer-notification-agent" containerID="cri-o://1e66c2af1e1843e1cce30bde9523ec3123ba04a31ae667393c01f3ef14fbcdb6" gracePeriod=30 Mar 11 12:20:15 crc kubenswrapper[4816]: I0311 12:20:15.786083 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ceilometer-0" podStartSLOduration=2.280106551 podStartE2EDuration="6.786057453s" podCreationTimestamp="2026-03-11 12:20:09 +0000 UTC" firstStartedPulling="2026-03-11 12:20:10.608432318 +0000 UTC m=+1297.199696285" lastFinishedPulling="2026-03-11 12:20:15.11438321 +0000 UTC m=+1301.705647187" observedRunningTime="2026-03-11 12:20:15.775418739 +0000 UTC m=+1302.366682706" watchObservedRunningTime="2026-03-11 12:20:15.786057453 +0000 UTC m=+1302.377321420" Mar 11 12:20:16 crc kubenswrapper[4816]: I0311 12:20:16.759919 4816 generic.go:334] "Generic (PLEG): container finished" podID="6387790e-663e-4746-9e9f-250ac4a06535" containerID="50710f72c7fd036d5cf3077365b496702e4d218124f7e7515d8a364a715ce792" exitCode=0 Mar 11 12:20:16 crc kubenswrapper[4816]: I0311 12:20:16.759970 4816 generic.go:334] "Generic (PLEG): container finished" podID="6387790e-663e-4746-9e9f-250ac4a06535" containerID="9ad1844ff4b6a10e2e7d0ff4017d82a630d3d5caeee3ee1c608a37d0e8df81ef" exitCode=2 Mar 11 12:20:16 crc kubenswrapper[4816]: I0311 12:20:16.759981 4816 generic.go:334] "Generic (PLEG): container finished" podID="6387790e-663e-4746-9e9f-250ac4a06535" containerID="1e66c2af1e1843e1cce30bde9523ec3123ba04a31ae667393c01f3ef14fbcdb6" exitCode=0 Mar 11 12:20:16 crc kubenswrapper[4816]: I0311 12:20:16.760008 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6387790e-663e-4746-9e9f-250ac4a06535","Type":"ContainerDied","Data":"50710f72c7fd036d5cf3077365b496702e4d218124f7e7515d8a364a715ce792"} Mar 11 12:20:16 crc kubenswrapper[4816]: I0311 12:20:16.760067 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6387790e-663e-4746-9e9f-250ac4a06535","Type":"ContainerDied","Data":"9ad1844ff4b6a10e2e7d0ff4017d82a630d3d5caeee3ee1c608a37d0e8df81ef"} Mar 11 12:20:16 crc kubenswrapper[4816]: I0311 12:20:16.760084 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"6387790e-663e-4746-9e9f-250ac4a06535","Type":"ContainerDied","Data":"1e66c2af1e1843e1cce30bde9523ec3123ba04a31ae667393c01f3ef14fbcdb6"} Mar 11 12:20:17 crc kubenswrapper[4816]: I0311 12:20:17.887523 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 11 12:20:17 crc kubenswrapper[4816]: I0311 12:20:17.888073 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 11 12:20:17 crc kubenswrapper[4816]: I0311 12:20:17.931957 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 11 12:20:17 crc kubenswrapper[4816]: I0311 12:20:17.939343 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.327178 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.327317 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.362169 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.400777 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.665601 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.785577 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6387790e-663e-4746-9e9f-250ac4a06535-sg-core-conf-yaml\") pod \"6387790e-663e-4746-9e9f-250ac4a06535\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.785760 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6387790e-663e-4746-9e9f-250ac4a06535-scripts\") pod \"6387790e-663e-4746-9e9f-250ac4a06535\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.786963 4816 generic.go:334] "Generic (PLEG): container finished" podID="6268fe92-5c93-43c7-95bc-f30befda5d65" containerID="f424c56cb69a088a064cc5d2e599b7db758b5e10ecf876c1586a8816a4d96acd" exitCode=0 Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.786988 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6387790e-663e-4746-9e9f-250ac4a06535-config-data\") pod \"6387790e-663e-4746-9e9f-250ac4a06535\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.787078 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-r2t5s" event={"ID":"6268fe92-5c93-43c7-95bc-f30befda5d65","Type":"ContainerDied","Data":"f424c56cb69a088a064cc5d2e599b7db758b5e10ecf876c1586a8816a4d96acd"} Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.787126 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgmms\" (UniqueName: \"kubernetes.io/projected/6387790e-663e-4746-9e9f-250ac4a06535-kube-api-access-tgmms\") pod \"6387790e-663e-4746-9e9f-250ac4a06535\" (UID: 
\"6387790e-663e-4746-9e9f-250ac4a06535\") " Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.787223 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6387790e-663e-4746-9e9f-250ac4a06535-log-httpd\") pod \"6387790e-663e-4746-9e9f-250ac4a06535\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.787317 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6387790e-663e-4746-9e9f-250ac4a06535-combined-ca-bundle\") pod \"6387790e-663e-4746-9e9f-250ac4a06535\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.787358 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6387790e-663e-4746-9e9f-250ac4a06535-run-httpd\") pod \"6387790e-663e-4746-9e9f-250ac4a06535\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.787821 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6387790e-663e-4746-9e9f-250ac4a06535-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6387790e-663e-4746-9e9f-250ac4a06535" (UID: "6387790e-663e-4746-9e9f-250ac4a06535"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.787869 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6387790e-663e-4746-9e9f-250ac4a06535-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6387790e-663e-4746-9e9f-250ac4a06535" (UID: "6387790e-663e-4746-9e9f-250ac4a06535"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.788151 4816 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6387790e-663e-4746-9e9f-250ac4a06535-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.788172 4816 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6387790e-663e-4746-9e9f-250ac4a06535-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.794561 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6387790e-663e-4746-9e9f-250ac4a06535-kube-api-access-tgmms" (OuterVolumeSpecName: "kube-api-access-tgmms") pod "6387790e-663e-4746-9e9f-250ac4a06535" (UID: "6387790e-663e-4746-9e9f-250ac4a06535"). InnerVolumeSpecName "kube-api-access-tgmms". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.795365 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6387790e-663e-4746-9e9f-250ac4a06535-scripts" (OuterVolumeSpecName: "scripts") pod "6387790e-663e-4746-9e9f-250ac4a06535" (UID: "6387790e-663e-4746-9e9f-250ac4a06535"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.810422 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.810491 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6387790e-663e-4746-9e9f-250ac4a06535","Type":"ContainerDied","Data":"a566453a6ed7cd3c331201e19eeae628b7b7c5cb4a8edc9454509c8a7f44ce45"} Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.810607 4816 scope.go:117] "RemoveContainer" containerID="50710f72c7fd036d5cf3077365b496702e4d218124f7e7515d8a364a715ce792" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.810207 4816 generic.go:334] "Generic (PLEG): container finished" podID="6387790e-663e-4746-9e9f-250ac4a06535" containerID="a566453a6ed7cd3c331201e19eeae628b7b7c5cb4a8edc9454509c8a7f44ce45" exitCode=0 Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.813446 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6387790e-663e-4746-9e9f-250ac4a06535","Type":"ContainerDied","Data":"6118c1f938ca09df069e1d25ab77da2d96c4cf88ac9d852f756ce969121a9a79"} Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.815628 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.815670 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.815684 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.815693 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.832734 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/6387790e-663e-4746-9e9f-250ac4a06535-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6387790e-663e-4746-9e9f-250ac4a06535" (UID: "6387790e-663e-4746-9e9f-250ac4a06535"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.890810 4816 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6387790e-663e-4746-9e9f-250ac4a06535-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.891025 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6387790e-663e-4746-9e9f-250ac4a06535-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.891043 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgmms\" (UniqueName: \"kubernetes.io/projected/6387790e-663e-4746-9e9f-250ac4a06535-kube-api-access-tgmms\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.903632 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6387790e-663e-4746-9e9f-250ac4a06535-config-data" (OuterVolumeSpecName: "config-data") pod "6387790e-663e-4746-9e9f-250ac4a06535" (UID: "6387790e-663e-4746-9e9f-250ac4a06535"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.910854 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6387790e-663e-4746-9e9f-250ac4a06535-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6387790e-663e-4746-9e9f-250ac4a06535" (UID: "6387790e-663e-4746-9e9f-250ac4a06535"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.939075 4816 scope.go:117] "RemoveContainer" containerID="9ad1844ff4b6a10e2e7d0ff4017d82a630d3d5caeee3ee1c608a37d0e8df81ef" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.976690 4816 scope.go:117] "RemoveContainer" containerID="1e66c2af1e1843e1cce30bde9523ec3123ba04a31ae667393c01f3ef14fbcdb6" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.993758 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6387790e-663e-4746-9e9f-250ac4a06535-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.993794 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6387790e-663e-4746-9e9f-250ac4a06535-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.011430 4816 scope.go:117] "RemoveContainer" containerID="a566453a6ed7cd3c331201e19eeae628b7b7c5cb4a8edc9454509c8a7f44ce45" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.053468 4816 scope.go:117] "RemoveContainer" containerID="50710f72c7fd036d5cf3077365b496702e4d218124f7e7515d8a364a715ce792" Mar 11 12:20:19 crc kubenswrapper[4816]: E0311 12:20:19.054748 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50710f72c7fd036d5cf3077365b496702e4d218124f7e7515d8a364a715ce792\": container with ID starting with 50710f72c7fd036d5cf3077365b496702e4d218124f7e7515d8a364a715ce792 not found: ID does not exist" containerID="50710f72c7fd036d5cf3077365b496702e4d218124f7e7515d8a364a715ce792" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.054811 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50710f72c7fd036d5cf3077365b496702e4d218124f7e7515d8a364a715ce792"} 
err="failed to get container status \"50710f72c7fd036d5cf3077365b496702e4d218124f7e7515d8a364a715ce792\": rpc error: code = NotFound desc = could not find container \"50710f72c7fd036d5cf3077365b496702e4d218124f7e7515d8a364a715ce792\": container with ID starting with 50710f72c7fd036d5cf3077365b496702e4d218124f7e7515d8a364a715ce792 not found: ID does not exist" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.054850 4816 scope.go:117] "RemoveContainer" containerID="9ad1844ff4b6a10e2e7d0ff4017d82a630d3d5caeee3ee1c608a37d0e8df81ef" Mar 11 12:20:19 crc kubenswrapper[4816]: E0311 12:20:19.055625 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ad1844ff4b6a10e2e7d0ff4017d82a630d3d5caeee3ee1c608a37d0e8df81ef\": container with ID starting with 9ad1844ff4b6a10e2e7d0ff4017d82a630d3d5caeee3ee1c608a37d0e8df81ef not found: ID does not exist" containerID="9ad1844ff4b6a10e2e7d0ff4017d82a630d3d5caeee3ee1c608a37d0e8df81ef" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.055652 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ad1844ff4b6a10e2e7d0ff4017d82a630d3d5caeee3ee1c608a37d0e8df81ef"} err="failed to get container status \"9ad1844ff4b6a10e2e7d0ff4017d82a630d3d5caeee3ee1c608a37d0e8df81ef\": rpc error: code = NotFound desc = could not find container \"9ad1844ff4b6a10e2e7d0ff4017d82a630d3d5caeee3ee1c608a37d0e8df81ef\": container with ID starting with 9ad1844ff4b6a10e2e7d0ff4017d82a630d3d5caeee3ee1c608a37d0e8df81ef not found: ID does not exist" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.055668 4816 scope.go:117] "RemoveContainer" containerID="1e66c2af1e1843e1cce30bde9523ec3123ba04a31ae667393c01f3ef14fbcdb6" Mar 11 12:20:19 crc kubenswrapper[4816]: E0311 12:20:19.056445 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1e66c2af1e1843e1cce30bde9523ec3123ba04a31ae667393c01f3ef14fbcdb6\": container with ID starting with 1e66c2af1e1843e1cce30bde9523ec3123ba04a31ae667393c01f3ef14fbcdb6 not found: ID does not exist" containerID="1e66c2af1e1843e1cce30bde9523ec3123ba04a31ae667393c01f3ef14fbcdb6" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.056494 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e66c2af1e1843e1cce30bde9523ec3123ba04a31ae667393c01f3ef14fbcdb6"} err="failed to get container status \"1e66c2af1e1843e1cce30bde9523ec3123ba04a31ae667393c01f3ef14fbcdb6\": rpc error: code = NotFound desc = could not find container \"1e66c2af1e1843e1cce30bde9523ec3123ba04a31ae667393c01f3ef14fbcdb6\": container with ID starting with 1e66c2af1e1843e1cce30bde9523ec3123ba04a31ae667393c01f3ef14fbcdb6 not found: ID does not exist" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.056513 4816 scope.go:117] "RemoveContainer" containerID="a566453a6ed7cd3c331201e19eeae628b7b7c5cb4a8edc9454509c8a7f44ce45" Mar 11 12:20:19 crc kubenswrapper[4816]: E0311 12:20:19.057065 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a566453a6ed7cd3c331201e19eeae628b7b7c5cb4a8edc9454509c8a7f44ce45\": container with ID starting with a566453a6ed7cd3c331201e19eeae628b7b7c5cb4a8edc9454509c8a7f44ce45 not found: ID does not exist" containerID="a566453a6ed7cd3c331201e19eeae628b7b7c5cb4a8edc9454509c8a7f44ce45" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.057134 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a566453a6ed7cd3c331201e19eeae628b7b7c5cb4a8edc9454509c8a7f44ce45"} err="failed to get container status \"a566453a6ed7cd3c331201e19eeae628b7b7c5cb4a8edc9454509c8a7f44ce45\": rpc error: code = NotFound desc = could not find container \"a566453a6ed7cd3c331201e19eeae628b7b7c5cb4a8edc9454509c8a7f44ce45\": container with ID 
starting with a566453a6ed7cd3c331201e19eeae628b7b7c5cb4a8edc9454509c8a7f44ce45 not found: ID does not exist" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.164138 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.175111 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.211551 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:20:19 crc kubenswrapper[4816]: E0311 12:20:19.212046 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6387790e-663e-4746-9e9f-250ac4a06535" containerName="ceilometer-notification-agent" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.212071 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="6387790e-663e-4746-9e9f-250ac4a06535" containerName="ceilometer-notification-agent" Mar 11 12:20:19 crc kubenswrapper[4816]: E0311 12:20:19.212101 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6387790e-663e-4746-9e9f-250ac4a06535" containerName="proxy-httpd" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.212109 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="6387790e-663e-4746-9e9f-250ac4a06535" containerName="proxy-httpd" Mar 11 12:20:19 crc kubenswrapper[4816]: E0311 12:20:19.212123 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cde56f2-9503-4b1c-84bf-8b49f6a9f5e9" containerName="oc" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.212131 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cde56f2-9503-4b1c-84bf-8b49f6a9f5e9" containerName="oc" Mar 11 12:20:19 crc kubenswrapper[4816]: E0311 12:20:19.212142 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6387790e-663e-4746-9e9f-250ac4a06535" containerName="sg-core" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.212149 4816 
state_mem.go:107] "Deleted CPUSet assignment" podUID="6387790e-663e-4746-9e9f-250ac4a06535" containerName="sg-core" Mar 11 12:20:19 crc kubenswrapper[4816]: E0311 12:20:19.212166 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6387790e-663e-4746-9e9f-250ac4a06535" containerName="ceilometer-central-agent" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.212174 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="6387790e-663e-4746-9e9f-250ac4a06535" containerName="ceilometer-central-agent" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.212377 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="6387790e-663e-4746-9e9f-250ac4a06535" containerName="proxy-httpd" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.212392 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cde56f2-9503-4b1c-84bf-8b49f6a9f5e9" containerName="oc" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.212411 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="6387790e-663e-4746-9e9f-250ac4a06535" containerName="ceilometer-central-agent" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.212423 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="6387790e-663e-4746-9e9f-250ac4a06535" containerName="ceilometer-notification-agent" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.212431 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="6387790e-663e-4746-9e9f-250ac4a06535" containerName="sg-core" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.214498 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.218817 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.219068 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.231209 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.404681 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a9b124c-68d8-44e9-9381-fa448155ef23-run-httpd\") pod \"ceilometer-0\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " pod="openstack/ceilometer-0" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.404753 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a9b124c-68d8-44e9-9381-fa448155ef23-config-data\") pod \"ceilometer-0\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " pod="openstack/ceilometer-0" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.404789 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a9b124c-68d8-44e9-9381-fa448155ef23-log-httpd\") pod \"ceilometer-0\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " pod="openstack/ceilometer-0" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.404859 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8dwr\" (UniqueName: \"kubernetes.io/projected/1a9b124c-68d8-44e9-9381-fa448155ef23-kube-api-access-d8dwr\") pod \"ceilometer-0\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " 
pod="openstack/ceilometer-0" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.404879 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a9b124c-68d8-44e9-9381-fa448155ef23-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " pod="openstack/ceilometer-0" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.404922 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a9b124c-68d8-44e9-9381-fa448155ef23-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " pod="openstack/ceilometer-0" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.405313 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a9b124c-68d8-44e9-9381-fa448155ef23-scripts\") pod \"ceilometer-0\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " pod="openstack/ceilometer-0" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.507504 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a9b124c-68d8-44e9-9381-fa448155ef23-config-data\") pod \"ceilometer-0\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " pod="openstack/ceilometer-0" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.507564 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a9b124c-68d8-44e9-9381-fa448155ef23-log-httpd\") pod \"ceilometer-0\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " pod="openstack/ceilometer-0" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.507590 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-d8dwr\" (UniqueName: \"kubernetes.io/projected/1a9b124c-68d8-44e9-9381-fa448155ef23-kube-api-access-d8dwr\") pod \"ceilometer-0\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " pod="openstack/ceilometer-0" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.507615 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a9b124c-68d8-44e9-9381-fa448155ef23-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " pod="openstack/ceilometer-0" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.507645 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a9b124c-68d8-44e9-9381-fa448155ef23-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " pod="openstack/ceilometer-0" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.507695 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a9b124c-68d8-44e9-9381-fa448155ef23-scripts\") pod \"ceilometer-0\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " pod="openstack/ceilometer-0" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.507758 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a9b124c-68d8-44e9-9381-fa448155ef23-run-httpd\") pod \"ceilometer-0\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " pod="openstack/ceilometer-0" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.509003 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a9b124c-68d8-44e9-9381-fa448155ef23-log-httpd\") pod \"ceilometer-0\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " pod="openstack/ceilometer-0" Mar 11 12:20:19 crc 
kubenswrapper[4816]: I0311 12:20:19.509020 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a9b124c-68d8-44e9-9381-fa448155ef23-run-httpd\") pod \"ceilometer-0\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " pod="openstack/ceilometer-0" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.513094 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a9b124c-68d8-44e9-9381-fa448155ef23-scripts\") pod \"ceilometer-0\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " pod="openstack/ceilometer-0" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.512411 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a9b124c-68d8-44e9-9381-fa448155ef23-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " pod="openstack/ceilometer-0" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.513801 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a9b124c-68d8-44e9-9381-fa448155ef23-config-data\") pod \"ceilometer-0\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " pod="openstack/ceilometer-0" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.514568 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a9b124c-68d8-44e9-9381-fa448155ef23-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " pod="openstack/ceilometer-0" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.530526 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8dwr\" (UniqueName: \"kubernetes.io/projected/1a9b124c-68d8-44e9-9381-fa448155ef23-kube-api-access-d8dwr\") pod \"ceilometer-0\" (UID: 
\"1a9b124c-68d8-44e9-9381-fa448155ef23\") " pod="openstack/ceilometer-0" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.554926 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:20:20 crc kubenswrapper[4816]: I0311 12:20:20.108116 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:20:20 crc kubenswrapper[4816]: I0311 12:20:20.145911 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6387790e-663e-4746-9e9f-250ac4a06535" path="/var/lib/kubelet/pods/6387790e-663e-4746-9e9f-250ac4a06535/volumes" Mar 11 12:20:20 crc kubenswrapper[4816]: I0311 12:20:20.175459 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-r2t5s" Mar 11 12:20:20 crc kubenswrapper[4816]: I0311 12:20:20.328720 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6268fe92-5c93-43c7-95bc-f30befda5d65-scripts\") pod \"6268fe92-5c93-43c7-95bc-f30befda5d65\" (UID: \"6268fe92-5c93-43c7-95bc-f30befda5d65\") " Mar 11 12:20:20 crc kubenswrapper[4816]: I0311 12:20:20.328931 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpt2w\" (UniqueName: \"kubernetes.io/projected/6268fe92-5c93-43c7-95bc-f30befda5d65-kube-api-access-zpt2w\") pod \"6268fe92-5c93-43c7-95bc-f30befda5d65\" (UID: \"6268fe92-5c93-43c7-95bc-f30befda5d65\") " Mar 11 12:20:20 crc kubenswrapper[4816]: I0311 12:20:20.329003 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6268fe92-5c93-43c7-95bc-f30befda5d65-combined-ca-bundle\") pod \"6268fe92-5c93-43c7-95bc-f30befda5d65\" (UID: \"6268fe92-5c93-43c7-95bc-f30befda5d65\") " Mar 11 12:20:20 crc kubenswrapper[4816]: I0311 12:20:20.329045 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6268fe92-5c93-43c7-95bc-f30befda5d65-config-data\") pod \"6268fe92-5c93-43c7-95bc-f30befda5d65\" (UID: \"6268fe92-5c93-43c7-95bc-f30befda5d65\") " Mar 11 12:20:20 crc kubenswrapper[4816]: I0311 12:20:20.336186 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6268fe92-5c93-43c7-95bc-f30befda5d65-scripts" (OuterVolumeSpecName: "scripts") pod "6268fe92-5c93-43c7-95bc-f30befda5d65" (UID: "6268fe92-5c93-43c7-95bc-f30befda5d65"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:20 crc kubenswrapper[4816]: I0311 12:20:20.341402 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6268fe92-5c93-43c7-95bc-f30befda5d65-kube-api-access-zpt2w" (OuterVolumeSpecName: "kube-api-access-zpt2w") pod "6268fe92-5c93-43c7-95bc-f30befda5d65" (UID: "6268fe92-5c93-43c7-95bc-f30befda5d65"). InnerVolumeSpecName "kube-api-access-zpt2w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:20:20 crc kubenswrapper[4816]: I0311 12:20:20.361032 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6268fe92-5c93-43c7-95bc-f30befda5d65-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6268fe92-5c93-43c7-95bc-f30befda5d65" (UID: "6268fe92-5c93-43c7-95bc-f30befda5d65"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:20 crc kubenswrapper[4816]: I0311 12:20:20.376813 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6268fe92-5c93-43c7-95bc-f30befda5d65-config-data" (OuterVolumeSpecName: "config-data") pod "6268fe92-5c93-43c7-95bc-f30befda5d65" (UID: "6268fe92-5c93-43c7-95bc-f30befda5d65"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:20 crc kubenswrapper[4816]: I0311 12:20:20.433490 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6268fe92-5c93-43c7-95bc-f30befda5d65-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:20 crc kubenswrapper[4816]: I0311 12:20:20.433535 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpt2w\" (UniqueName: \"kubernetes.io/projected/6268fe92-5c93-43c7-95bc-f30befda5d65-kube-api-access-zpt2w\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:20 crc kubenswrapper[4816]: I0311 12:20:20.433550 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6268fe92-5c93-43c7-95bc-f30befda5d65-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:20 crc kubenswrapper[4816]: I0311 12:20:20.433561 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6268fe92-5c93-43c7-95bc-f30befda5d65-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:20 crc kubenswrapper[4816]: I0311 12:20:20.860042 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a9b124c-68d8-44e9-9381-fa448155ef23","Type":"ContainerStarted","Data":"38b868cc185bf2881b9763f9f27568b608cb3091bc38e885e64b2566d5c8d41e"} Mar 11 12:20:20 crc kubenswrapper[4816]: I0311 12:20:20.867312 4816 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 11 12:20:20 crc kubenswrapper[4816]: I0311 12:20:20.867344 4816 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 11 12:20:20 crc kubenswrapper[4816]: I0311 12:20:20.868530 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-r2t5s" Mar 11 12:20:20 crc kubenswrapper[4816]: I0311 12:20:20.874809 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-r2t5s" event={"ID":"6268fe92-5c93-43c7-95bc-f30befda5d65","Type":"ContainerDied","Data":"1939b4a789439a60faf8174002db0eb6692620810e7f2821d03a1a8fa9509b1e"} Mar 11 12:20:20 crc kubenswrapper[4816]: I0311 12:20:20.874875 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1939b4a789439a60faf8174002db0eb6692620810e7f2821d03a1a8fa9509b1e" Mar 11 12:20:20 crc kubenswrapper[4816]: I0311 12:20:20.946056 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 11 12:20:20 crc kubenswrapper[4816]: I0311 12:20:20.951392 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 11 12:20:20 crc kubenswrapper[4816]: I0311 12:20:20.986063 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 11 12:20:20 crc kubenswrapper[4816]: I0311 12:20:20.986498 4816 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 11 12:20:20 crc kubenswrapper[4816]: I0311 12:20:20.991008 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 11 12:20:21 crc kubenswrapper[4816]: I0311 12:20:21.149409 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 11 12:20:21 crc kubenswrapper[4816]: E0311 12:20:21.150478 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6268fe92-5c93-43c7-95bc-f30befda5d65" containerName="nova-cell0-conductor-db-sync" Mar 11 12:20:21 crc kubenswrapper[4816]: I0311 12:20:21.150496 4816 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6268fe92-5c93-43c7-95bc-f30befda5d65" containerName="nova-cell0-conductor-db-sync" Mar 11 12:20:21 crc kubenswrapper[4816]: I0311 12:20:21.150695 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="6268fe92-5c93-43c7-95bc-f30befda5d65" containerName="nova-cell0-conductor-db-sync" Mar 11 12:20:21 crc kubenswrapper[4816]: I0311 12:20:21.151469 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 11 12:20:21 crc kubenswrapper[4816]: I0311 12:20:21.154097 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 11 12:20:21 crc kubenswrapper[4816]: I0311 12:20:21.158608 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-f2q4d" Mar 11 12:20:21 crc kubenswrapper[4816]: I0311 12:20:21.161376 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 11 12:20:21 crc kubenswrapper[4816]: I0311 12:20:21.254310 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc\") " pod="openstack/nova-cell0-conductor-0" Mar 11 12:20:21 crc kubenswrapper[4816]: I0311 12:20:21.254584 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ml65\" (UniqueName: \"kubernetes.io/projected/f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc-kube-api-access-4ml65\") pod \"nova-cell0-conductor-0\" (UID: \"f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc\") " pod="openstack/nova-cell0-conductor-0" Mar 11 12:20:21 crc kubenswrapper[4816]: I0311 12:20:21.254713 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc\") " pod="openstack/nova-cell0-conductor-0" Mar 11 12:20:21 crc kubenswrapper[4816]: I0311 12:20:21.357157 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc\") " pod="openstack/nova-cell0-conductor-0" Mar 11 12:20:21 crc kubenswrapper[4816]: I0311 12:20:21.357332 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc\") " pod="openstack/nova-cell0-conductor-0" Mar 11 12:20:21 crc kubenswrapper[4816]: I0311 12:20:21.357371 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ml65\" (UniqueName: \"kubernetes.io/projected/f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc-kube-api-access-4ml65\") pod \"nova-cell0-conductor-0\" (UID: \"f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc\") " pod="openstack/nova-cell0-conductor-0" Mar 11 12:20:21 crc kubenswrapper[4816]: I0311 12:20:21.363422 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc\") " pod="openstack/nova-cell0-conductor-0" Mar 11 12:20:21 crc kubenswrapper[4816]: I0311 12:20:21.365857 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: 
\"f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc\") " pod="openstack/nova-cell0-conductor-0" Mar 11 12:20:21 crc kubenswrapper[4816]: I0311 12:20:21.384898 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ml65\" (UniqueName: \"kubernetes.io/projected/f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc-kube-api-access-4ml65\") pod \"nova-cell0-conductor-0\" (UID: \"f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc\") " pod="openstack/nova-cell0-conductor-0" Mar 11 12:20:21 crc kubenswrapper[4816]: I0311 12:20:21.475028 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 11 12:20:21 crc kubenswrapper[4816]: I0311 12:20:21.886944 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a9b124c-68d8-44e9-9381-fa448155ef23","Type":"ContainerStarted","Data":"cf672d6f4e16a27f8ebd02361ddc8dc18ed8b08979ac551c707e61af02836a74"} Mar 11 12:20:21 crc kubenswrapper[4816]: I0311 12:20:21.887516 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a9b124c-68d8-44e9-9381-fa448155ef23","Type":"ContainerStarted","Data":"3b2e12f9db266056fff6910a7cbbd6b8e5b64ba5455f9ba9db2932c38e6dd28b"} Mar 11 12:20:21 crc kubenswrapper[4816]: W0311 12:20:21.986693 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9eb0dee_5bdb_4ca4_a746_d33e8b7d20cc.slice/crio-897a415294b966ad7eb32e075c662fce4ade523bc49b487efdfde948eb76f843 WatchSource:0}: Error finding container 897a415294b966ad7eb32e075c662fce4ade523bc49b487efdfde948eb76f843: Status 404 returned error can't find the container with id 897a415294b966ad7eb32e075c662fce4ade523bc49b487efdfde948eb76f843 Mar 11 12:20:21 crc kubenswrapper[4816]: I0311 12:20:21.991419 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 11 12:20:22 crc kubenswrapper[4816]: I0311 
12:20:22.895327 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc","Type":"ContainerStarted","Data":"4e6b0cc9909a80ea9f6820967069b2707e5bf48017858f5840e01461de16f0c2"} Mar 11 12:20:22 crc kubenswrapper[4816]: I0311 12:20:22.896291 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc","Type":"ContainerStarted","Data":"897a415294b966ad7eb32e075c662fce4ade523bc49b487efdfde948eb76f843"} Mar 11 12:20:22 crc kubenswrapper[4816]: I0311 12:20:22.922789 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=1.922763106 podStartE2EDuration="1.922763106s" podCreationTimestamp="2026-03-11 12:20:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:20:22.918664619 +0000 UTC m=+1309.509928586" watchObservedRunningTime="2026-03-11 12:20:22.922763106 +0000 UTC m=+1309.514027073" Mar 11 12:20:23 crc kubenswrapper[4816]: I0311 12:20:23.910721 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a9b124c-68d8-44e9-9381-fa448155ef23","Type":"ContainerStarted","Data":"91c0f336c13b096a6e37338c7aacd50b6dc2e9bbedfd8f453b0c731b71198818"} Mar 11 12:20:23 crc kubenswrapper[4816]: I0311 12:20:23.911188 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 11 12:20:25 crc kubenswrapper[4816]: I0311 12:20:25.936197 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a9b124c-68d8-44e9-9381-fa448155ef23","Type":"ContainerStarted","Data":"46378e5e41bccbfb3a48252c8c7374cb11b3d1fe0796984239fa59f5ef21eb1c"} Mar 11 12:20:25 crc kubenswrapper[4816]: I0311 12:20:25.939094 4816 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 11 12:20:25 crc kubenswrapper[4816]: I0311 12:20:25.973562 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.962653178 podStartE2EDuration="6.973539039s" podCreationTimestamp="2026-03-11 12:20:19 +0000 UTC" firstStartedPulling="2026-03-11 12:20:20.118634381 +0000 UTC m=+1306.709898348" lastFinishedPulling="2026-03-11 12:20:25.129520242 +0000 UTC m=+1311.720784209" observedRunningTime="2026-03-11 12:20:25.97045063 +0000 UTC m=+1312.561714597" watchObservedRunningTime="2026-03-11 12:20:25.973539039 +0000 UTC m=+1312.564803006" Mar 11 12:20:31 crc kubenswrapper[4816]: I0311 12:20:31.517937 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.086341 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-qt9tz"] Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.088517 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-qt9tz" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.091703 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.091931 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.101000 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-qt9tz"] Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.134660 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fs7hr\" (UniqueName: \"kubernetes.io/projected/3f519dc2-e88b-4e4b-9637-c3e172b81bfa-kube-api-access-fs7hr\") pod \"nova-cell0-cell-mapping-qt9tz\" (UID: \"3f519dc2-e88b-4e4b-9637-c3e172b81bfa\") " pod="openstack/nova-cell0-cell-mapping-qt9tz" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.134752 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f519dc2-e88b-4e4b-9637-c3e172b81bfa-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-qt9tz\" (UID: \"3f519dc2-e88b-4e4b-9637-c3e172b81bfa\") " pod="openstack/nova-cell0-cell-mapping-qt9tz" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.134785 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f519dc2-e88b-4e4b-9637-c3e172b81bfa-config-data\") pod \"nova-cell0-cell-mapping-qt9tz\" (UID: \"3f519dc2-e88b-4e4b-9637-c3e172b81bfa\") " pod="openstack/nova-cell0-cell-mapping-qt9tz" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.135017 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/3f519dc2-e88b-4e4b-9637-c3e172b81bfa-scripts\") pod \"nova-cell0-cell-mapping-qt9tz\" (UID: \"3f519dc2-e88b-4e4b-9637-c3e172b81bfa\") " pod="openstack/nova-cell0-cell-mapping-qt9tz" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.239594 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f519dc2-e88b-4e4b-9637-c3e172b81bfa-scripts\") pod \"nova-cell0-cell-mapping-qt9tz\" (UID: \"3f519dc2-e88b-4e4b-9637-c3e172b81bfa\") " pod="openstack/nova-cell0-cell-mapping-qt9tz" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.239865 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fs7hr\" (UniqueName: \"kubernetes.io/projected/3f519dc2-e88b-4e4b-9637-c3e172b81bfa-kube-api-access-fs7hr\") pod \"nova-cell0-cell-mapping-qt9tz\" (UID: \"3f519dc2-e88b-4e4b-9637-c3e172b81bfa\") " pod="openstack/nova-cell0-cell-mapping-qt9tz" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.239943 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f519dc2-e88b-4e4b-9637-c3e172b81bfa-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-qt9tz\" (UID: \"3f519dc2-e88b-4e4b-9637-c3e172b81bfa\") " pod="openstack/nova-cell0-cell-mapping-qt9tz" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.239973 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f519dc2-e88b-4e4b-9637-c3e172b81bfa-config-data\") pod \"nova-cell0-cell-mapping-qt9tz\" (UID: \"3f519dc2-e88b-4e4b-9637-c3e172b81bfa\") " pod="openstack/nova-cell0-cell-mapping-qt9tz" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.249558 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3f519dc2-e88b-4e4b-9637-c3e172b81bfa-config-data\") pod \"nova-cell0-cell-mapping-qt9tz\" (UID: \"3f519dc2-e88b-4e4b-9637-c3e172b81bfa\") " pod="openstack/nova-cell0-cell-mapping-qt9tz" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.260093 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f519dc2-e88b-4e4b-9637-c3e172b81bfa-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-qt9tz\" (UID: \"3f519dc2-e88b-4e4b-9637-c3e172b81bfa\") " pod="openstack/nova-cell0-cell-mapping-qt9tz" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.297899 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f519dc2-e88b-4e4b-9637-c3e172b81bfa-scripts\") pod \"nova-cell0-cell-mapping-qt9tz\" (UID: \"3f519dc2-e88b-4e4b-9637-c3e172b81bfa\") " pod="openstack/nova-cell0-cell-mapping-qt9tz" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.438619 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.440821 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.467028 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.469768 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.479459 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.479569 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.549177 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33811121-46de-4941-bd74-18ecaa2c2827-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"33811121-46de-4941-bd74-18ecaa2c2827\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.549779 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78b435e0-53bf-4f8c-aef9-49b170fc9519-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"78b435e0-53bf-4f8c-aef9-49b170fc9519\") " pod="openstack/nova-api-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.549811 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33811121-46de-4941-bd74-18ecaa2c2827-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"33811121-46de-4941-bd74-18ecaa2c2827\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.549831 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78b435e0-53bf-4f8c-aef9-49b170fc9519-config-data\") pod \"nova-api-0\" (UID: \"78b435e0-53bf-4f8c-aef9-49b170fc9519\") " pod="openstack/nova-api-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 
12:20:32.549878 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlcbd\" (UniqueName: \"kubernetes.io/projected/78b435e0-53bf-4f8c-aef9-49b170fc9519-kube-api-access-wlcbd\") pod \"nova-api-0\" (UID: \"78b435e0-53bf-4f8c-aef9-49b170fc9519\") " pod="openstack/nova-api-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.549895 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4xh6\" (UniqueName: \"kubernetes.io/projected/33811121-46de-4941-bd74-18ecaa2c2827-kube-api-access-t4xh6\") pod \"nova-cell1-novncproxy-0\" (UID: \"33811121-46de-4941-bd74-18ecaa2c2827\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.549910 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78b435e0-53bf-4f8c-aef9-49b170fc9519-logs\") pod \"nova-api-0\" (UID: \"78b435e0-53bf-4f8c-aef9-49b170fc9519\") " pod="openstack/nova-api-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.611306 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.626184 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.626364 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.631437 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.650606 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78b435e0-53bf-4f8c-aef9-49b170fc9519-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"78b435e0-53bf-4f8c-aef9-49b170fc9519\") " pod="openstack/nova-api-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.650668 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33811121-46de-4941-bd74-18ecaa2c2827-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"33811121-46de-4941-bd74-18ecaa2c2827\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.650692 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78b435e0-53bf-4f8c-aef9-49b170fc9519-config-data\") pod \"nova-api-0\" (UID: \"78b435e0-53bf-4f8c-aef9-49b170fc9519\") " pod="openstack/nova-api-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.650715 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsrdn\" (UniqueName: \"kubernetes.io/projected/1966e6cd-d10e-468d-9e4f-7484f67202b4-kube-api-access-nsrdn\") pod \"nova-metadata-0\" (UID: \"1966e6cd-d10e-468d-9e4f-7484f67202b4\") " pod="openstack/nova-metadata-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.650765 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlcbd\" (UniqueName: \"kubernetes.io/projected/78b435e0-53bf-4f8c-aef9-49b170fc9519-kube-api-access-wlcbd\") pod \"nova-api-0\" (UID: 
\"78b435e0-53bf-4f8c-aef9-49b170fc9519\") " pod="openstack/nova-api-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.650789 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4xh6\" (UniqueName: \"kubernetes.io/projected/33811121-46de-4941-bd74-18ecaa2c2827-kube-api-access-t4xh6\") pod \"nova-cell1-novncproxy-0\" (UID: \"33811121-46de-4941-bd74-18ecaa2c2827\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.650805 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78b435e0-53bf-4f8c-aef9-49b170fc9519-logs\") pod \"nova-api-0\" (UID: \"78b435e0-53bf-4f8c-aef9-49b170fc9519\") " pod="openstack/nova-api-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.650836 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33811121-46de-4941-bd74-18ecaa2c2827-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"33811121-46de-4941-bd74-18ecaa2c2827\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.650869 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1966e6cd-d10e-468d-9e4f-7484f67202b4-logs\") pod \"nova-metadata-0\" (UID: \"1966e6cd-d10e-468d-9e4f-7484f67202b4\") " pod="openstack/nova-metadata-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.650915 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1966e6cd-d10e-468d-9e4f-7484f67202b4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1966e6cd-d10e-468d-9e4f-7484f67202b4\") " pod="openstack/nova-metadata-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.650938 4816 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1966e6cd-d10e-468d-9e4f-7484f67202b4-config-data\") pod \"nova-metadata-0\" (UID: \"1966e6cd-d10e-468d-9e4f-7484f67202b4\") " pod="openstack/nova-metadata-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.655040 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78b435e0-53bf-4f8c-aef9-49b170fc9519-logs\") pod \"nova-api-0\" (UID: \"78b435e0-53bf-4f8c-aef9-49b170fc9519\") " pod="openstack/nova-api-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.668087 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33811121-46de-4941-bd74-18ecaa2c2827-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"33811121-46de-4941-bd74-18ecaa2c2827\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.668130 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78b435e0-53bf-4f8c-aef9-49b170fc9519-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"78b435e0-53bf-4f8c-aef9-49b170fc9519\") " pod="openstack/nova-api-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.668451 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.668565 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78b435e0-53bf-4f8c-aef9-49b170fc9519-config-data\") pod \"nova-api-0\" (UID: \"78b435e0-53bf-4f8c-aef9-49b170fc9519\") " pod="openstack/nova-api-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.687429 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33811121-46de-4941-bd74-18ecaa2c2827-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"33811121-46de-4941-bd74-18ecaa2c2827\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.713193 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.752425 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsrdn\" (UniqueName: \"kubernetes.io/projected/1966e6cd-d10e-468d-9e4f-7484f67202b4-kube-api-access-nsrdn\") pod \"nova-metadata-0\" (UID: \"1966e6cd-d10e-468d-9e4f-7484f67202b4\") " pod="openstack/nova-metadata-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.752544 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1966e6cd-d10e-468d-9e4f-7484f67202b4-logs\") pod \"nova-metadata-0\" (UID: \"1966e6cd-d10e-468d-9e4f-7484f67202b4\") " pod="openstack/nova-metadata-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.752591 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1966e6cd-d10e-468d-9e4f-7484f67202b4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1966e6cd-d10e-468d-9e4f-7484f67202b4\") " pod="openstack/nova-metadata-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.752613 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1966e6cd-d10e-468d-9e4f-7484f67202b4-config-data\") pod \"nova-metadata-0\" (UID: \"1966e6cd-d10e-468d-9e4f-7484f67202b4\") " pod="openstack/nova-metadata-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.754990 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/1966e6cd-d10e-468d-9e4f-7484f67202b4-logs\") pod \"nova-metadata-0\" (UID: \"1966e6cd-d10e-468d-9e4f-7484f67202b4\") " pod="openstack/nova-metadata-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.764509 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1966e6cd-d10e-468d-9e4f-7484f67202b4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1966e6cd-d10e-468d-9e4f-7484f67202b4\") " pod="openstack/nova-metadata-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.773283 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1966e6cd-d10e-468d-9e4f-7484f67202b4-config-data\") pod \"nova-metadata-0\" (UID: \"1966e6cd-d10e-468d-9e4f-7484f67202b4\") " pod="openstack/nova-metadata-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.797142 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlcbd\" (UniqueName: \"kubernetes.io/projected/78b435e0-53bf-4f8c-aef9-49b170fc9519-kube-api-access-wlcbd\") pod \"nova-api-0\" (UID: \"78b435e0-53bf-4f8c-aef9-49b170fc9519\") " pod="openstack/nova-api-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.797895 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4xh6\" (UniqueName: \"kubernetes.io/projected/33811121-46de-4941-bd74-18ecaa2c2827-kube-api-access-t4xh6\") pod \"nova-cell1-novncproxy-0\" (UID: \"33811121-46de-4941-bd74-18ecaa2c2827\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.809888 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.820979 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fs7hr\" (UniqueName: \"kubernetes.io/projected/3f519dc2-e88b-4e4b-9637-c3e172b81bfa-kube-api-access-fs7hr\") pod \"nova-cell0-cell-mapping-qt9tz\" (UID: \"3f519dc2-e88b-4e4b-9637-c3e172b81bfa\") " pod="openstack/nova-cell0-cell-mapping-qt9tz" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.827359 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsrdn\" (UniqueName: \"kubernetes.io/projected/1966e6cd-d10e-468d-9e4f-7484f67202b4-kube-api-access-nsrdn\") pod \"nova-metadata-0\" (UID: \"1966e6cd-d10e-468d-9e4f-7484f67202b4\") " pod="openstack/nova-metadata-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.905316 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.906940 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.910231 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.915887 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.937440 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-69b4446475-bsnbn"] Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.939135 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69b4446475-bsnbn" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.953819 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.980397 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69b4446475-bsnbn"] Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.013021 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-qt9tz" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.065262 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-ovsdbserver-nb\") pod \"dnsmasq-dns-69b4446475-bsnbn\" (UID: \"1370549e-42a3-450d-a28d-47d4a0764f56\") " pod="openstack/dnsmasq-dns-69b4446475-bsnbn" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.065322 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05937283-8ec7-430d-be71-c968e8e97ff1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"05937283-8ec7-430d-be71-c968e8e97ff1\") " pod="openstack/nova-scheduler-0" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.065385 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fdrt\" (UniqueName: \"kubernetes.io/projected/05937283-8ec7-430d-be71-c968e8e97ff1-kube-api-access-7fdrt\") pod \"nova-scheduler-0\" (UID: \"05937283-8ec7-430d-be71-c968e8e97ff1\") " pod="openstack/nova-scheduler-0" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.065467 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgzdl\" (UniqueName: \"kubernetes.io/projected/1370549e-42a3-450d-a28d-47d4a0764f56-kube-api-access-jgzdl\") pod \"dnsmasq-dns-69b4446475-bsnbn\" (UID: \"1370549e-42a3-450d-a28d-47d4a0764f56\") " 
pod="openstack/dnsmasq-dns-69b4446475-bsnbn" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.065694 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-dns-swift-storage-0\") pod \"dnsmasq-dns-69b4446475-bsnbn\" (UID: \"1370549e-42a3-450d-a28d-47d4a0764f56\") " pod="openstack/dnsmasq-dns-69b4446475-bsnbn" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.065835 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-ovsdbserver-sb\") pod \"dnsmasq-dns-69b4446475-bsnbn\" (UID: \"1370549e-42a3-450d-a28d-47d4a0764f56\") " pod="openstack/dnsmasq-dns-69b4446475-bsnbn" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.065922 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-dns-svc\") pod \"dnsmasq-dns-69b4446475-bsnbn\" (UID: \"1370549e-42a3-450d-a28d-47d4a0764f56\") " pod="openstack/dnsmasq-dns-69b4446475-bsnbn" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.066011 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-config\") pod \"dnsmasq-dns-69b4446475-bsnbn\" (UID: \"1370549e-42a3-450d-a28d-47d4a0764f56\") " pod="openstack/dnsmasq-dns-69b4446475-bsnbn" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.066128 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05937283-8ec7-430d-be71-c968e8e97ff1-config-data\") pod \"nova-scheduler-0\" (UID: \"05937283-8ec7-430d-be71-c968e8e97ff1\") " 
pod="openstack/nova-scheduler-0" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.091675 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.168650 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-ovsdbserver-sb\") pod \"dnsmasq-dns-69b4446475-bsnbn\" (UID: \"1370549e-42a3-450d-a28d-47d4a0764f56\") " pod="openstack/dnsmasq-dns-69b4446475-bsnbn" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.168743 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-dns-svc\") pod \"dnsmasq-dns-69b4446475-bsnbn\" (UID: \"1370549e-42a3-450d-a28d-47d4a0764f56\") " pod="openstack/dnsmasq-dns-69b4446475-bsnbn" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.168795 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-config\") pod \"dnsmasq-dns-69b4446475-bsnbn\" (UID: \"1370549e-42a3-450d-a28d-47d4a0764f56\") " pod="openstack/dnsmasq-dns-69b4446475-bsnbn" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.168847 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05937283-8ec7-430d-be71-c968e8e97ff1-config-data\") pod \"nova-scheduler-0\" (UID: \"05937283-8ec7-430d-be71-c968e8e97ff1\") " pod="openstack/nova-scheduler-0" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.168887 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-ovsdbserver-nb\") pod \"dnsmasq-dns-69b4446475-bsnbn\" (UID: 
\"1370549e-42a3-450d-a28d-47d4a0764f56\") " pod="openstack/dnsmasq-dns-69b4446475-bsnbn" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.168915 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05937283-8ec7-430d-be71-c968e8e97ff1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"05937283-8ec7-430d-be71-c968e8e97ff1\") " pod="openstack/nova-scheduler-0" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.168973 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fdrt\" (UniqueName: \"kubernetes.io/projected/05937283-8ec7-430d-be71-c968e8e97ff1-kube-api-access-7fdrt\") pod \"nova-scheduler-0\" (UID: \"05937283-8ec7-430d-be71-c968e8e97ff1\") " pod="openstack/nova-scheduler-0" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.169061 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgzdl\" (UniqueName: \"kubernetes.io/projected/1370549e-42a3-450d-a28d-47d4a0764f56-kube-api-access-jgzdl\") pod \"dnsmasq-dns-69b4446475-bsnbn\" (UID: \"1370549e-42a3-450d-a28d-47d4a0764f56\") " pod="openstack/dnsmasq-dns-69b4446475-bsnbn" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.169108 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-dns-swift-storage-0\") pod \"dnsmasq-dns-69b4446475-bsnbn\" (UID: \"1370549e-42a3-450d-a28d-47d4a0764f56\") " pod="openstack/dnsmasq-dns-69b4446475-bsnbn" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.170589 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-dns-svc\") pod \"dnsmasq-dns-69b4446475-bsnbn\" (UID: \"1370549e-42a3-450d-a28d-47d4a0764f56\") " 
pod="openstack/dnsmasq-dns-69b4446475-bsnbn" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.170662 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-dns-swift-storage-0\") pod \"dnsmasq-dns-69b4446475-bsnbn\" (UID: \"1370549e-42a3-450d-a28d-47d4a0764f56\") " pod="openstack/dnsmasq-dns-69b4446475-bsnbn" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.171105 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-ovsdbserver-nb\") pod \"dnsmasq-dns-69b4446475-bsnbn\" (UID: \"1370549e-42a3-450d-a28d-47d4a0764f56\") " pod="openstack/dnsmasq-dns-69b4446475-bsnbn" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.171660 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-config\") pod \"dnsmasq-dns-69b4446475-bsnbn\" (UID: \"1370549e-42a3-450d-a28d-47d4a0764f56\") " pod="openstack/dnsmasq-dns-69b4446475-bsnbn" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.172028 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-ovsdbserver-sb\") pod \"dnsmasq-dns-69b4446475-bsnbn\" (UID: \"1370549e-42a3-450d-a28d-47d4a0764f56\") " pod="openstack/dnsmasq-dns-69b4446475-bsnbn" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.174417 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05937283-8ec7-430d-be71-c968e8e97ff1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"05937283-8ec7-430d-be71-c968e8e97ff1\") " pod="openstack/nova-scheduler-0" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.174471 4816 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05937283-8ec7-430d-be71-c968e8e97ff1-config-data\") pod \"nova-scheduler-0\" (UID: \"05937283-8ec7-430d-be71-c968e8e97ff1\") " pod="openstack/nova-scheduler-0" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.189835 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fdrt\" (UniqueName: \"kubernetes.io/projected/05937283-8ec7-430d-be71-c968e8e97ff1-kube-api-access-7fdrt\") pod \"nova-scheduler-0\" (UID: \"05937283-8ec7-430d-be71-c968e8e97ff1\") " pod="openstack/nova-scheduler-0" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.190670 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgzdl\" (UniqueName: \"kubernetes.io/projected/1370549e-42a3-450d-a28d-47d4a0764f56-kube-api-access-jgzdl\") pod \"dnsmasq-dns-69b4446475-bsnbn\" (UID: \"1370549e-42a3-450d-a28d-47d4a0764f56\") " pod="openstack/dnsmasq-dns-69b4446475-bsnbn" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.257861 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.276656 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69b4446475-bsnbn" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.562651 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wdblc"] Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.567388 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wdblc" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.576131 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.576414 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.606108 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wdblc"] Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.687113 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66txc\" (UniqueName: \"kubernetes.io/projected/fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac-kube-api-access-66txc\") pod \"nova-cell1-conductor-db-sync-wdblc\" (UID: \"fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac\") " pod="openstack/nova-cell1-conductor-db-sync-wdblc" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.687931 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac-scripts\") pod \"nova-cell1-conductor-db-sync-wdblc\" (UID: \"fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac\") " pod="openstack/nova-cell1-conductor-db-sync-wdblc" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.687987 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wdblc\" (UID: \"fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac\") " pod="openstack/nova-cell1-conductor-db-sync-wdblc" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.688281 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac-config-data\") pod \"nova-cell1-conductor-db-sync-wdblc\" (UID: \"fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac\") " pod="openstack/nova-cell1-conductor-db-sync-wdblc" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.790804 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac-scripts\") pod \"nova-cell1-conductor-db-sync-wdblc\" (UID: \"fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac\") " pod="openstack/nova-cell1-conductor-db-sync-wdblc" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.792234 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wdblc\" (UID: \"fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac\") " pod="openstack/nova-cell1-conductor-db-sync-wdblc" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.792311 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac-config-data\") pod \"nova-cell1-conductor-db-sync-wdblc\" (UID: \"fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac\") " pod="openstack/nova-cell1-conductor-db-sync-wdblc" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.792629 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66txc\" (UniqueName: \"kubernetes.io/projected/fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac-kube-api-access-66txc\") pod \"nova-cell1-conductor-db-sync-wdblc\" (UID: \"fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac\") " pod="openstack/nova-cell1-conductor-db-sync-wdblc" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.805488 4816 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac-scripts\") pod \"nova-cell1-conductor-db-sync-wdblc\" (UID: \"fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac\") " pod="openstack/nova-cell1-conductor-db-sync-wdblc" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.805939 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac-config-data\") pod \"nova-cell1-conductor-db-sync-wdblc\" (UID: \"fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac\") " pod="openstack/nova-cell1-conductor-db-sync-wdblc" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.809325 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wdblc\" (UID: \"fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac\") " pod="openstack/nova-cell1-conductor-db-sync-wdblc" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.831769 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66txc\" (UniqueName: \"kubernetes.io/projected/fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac-kube-api-access-66txc\") pod \"nova-cell1-conductor-db-sync-wdblc\" (UID: \"fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac\") " pod="openstack/nova-cell1-conductor-db-sync-wdblc" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.984913 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wdblc" Mar 11 12:20:34 crc kubenswrapper[4816]: I0311 12:20:34.026176 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 11 12:20:34 crc kubenswrapper[4816]: I0311 12:20:34.126214 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 12:20:34 crc kubenswrapper[4816]: W0311 12:20:34.276576 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05937283_8ec7_430d_be71_c968e8e97ff1.slice/crio-20c9b17fece99c2795c0705a485bb232956e3243ab02757479fd7900fae5c7a8 WatchSource:0}: Error finding container 20c9b17fece99c2795c0705a485bb232956e3243ab02757479fd7900fae5c7a8: Status 404 returned error can't find the container with id 20c9b17fece99c2795c0705a485bb232956e3243ab02757479fd7900fae5c7a8 Mar 11 12:20:34 crc kubenswrapper[4816]: I0311 12:20:34.293227 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 11 12:20:34 crc kubenswrapper[4816]: I0311 12:20:34.349160 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-qt9tz"] Mar 11 12:20:34 crc kubenswrapper[4816]: I0311 12:20:34.380806 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 11 12:20:34 crc kubenswrapper[4816]: I0311 12:20:34.392278 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69b4446475-bsnbn"] Mar 11 12:20:34 crc kubenswrapper[4816]: W0311 12:20:34.422152 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1370549e_42a3_450d_a28d_47d4a0764f56.slice/crio-31af9c3f8588e04ef95c2018485a2e29382e231f1e73609996971ecefe64ea0d WatchSource:0}: Error finding container 31af9c3f8588e04ef95c2018485a2e29382e231f1e73609996971ecefe64ea0d: Status 404 returned error 
can't find the container with id 31af9c3f8588e04ef95c2018485a2e29382e231f1e73609996971ecefe64ea0d Mar 11 12:20:34 crc kubenswrapper[4816]: I0311 12:20:34.647363 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wdblc"] Mar 11 12:20:35 crc kubenswrapper[4816]: I0311 12:20:35.054823 4816 generic.go:334] "Generic (PLEG): container finished" podID="1370549e-42a3-450d-a28d-47d4a0764f56" containerID="73c8dd7cd36356a8399521ca85923fa5bea70d1a67253cbf2ad9c716aae771dd" exitCode=0 Mar 11 12:20:35 crc kubenswrapper[4816]: I0311 12:20:35.054955 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69b4446475-bsnbn" event={"ID":"1370549e-42a3-450d-a28d-47d4a0764f56","Type":"ContainerDied","Data":"73c8dd7cd36356a8399521ca85923fa5bea70d1a67253cbf2ad9c716aae771dd"} Mar 11 12:20:35 crc kubenswrapper[4816]: I0311 12:20:35.055239 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69b4446475-bsnbn" event={"ID":"1370549e-42a3-450d-a28d-47d4a0764f56","Type":"ContainerStarted","Data":"31af9c3f8588e04ef95c2018485a2e29382e231f1e73609996971ecefe64ea0d"} Mar 11 12:20:35 crc kubenswrapper[4816]: I0311 12:20:35.061641 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wdblc" event={"ID":"fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac","Type":"ContainerStarted","Data":"393812c2bcecafdfa2f0e8dd848197ee6392877cb2e73b5a2b5b12d642a2ed5c"} Mar 11 12:20:35 crc kubenswrapper[4816]: I0311 12:20:35.061693 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wdblc" event={"ID":"fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac","Type":"ContainerStarted","Data":"f816942d05a050c01aa9ba2c41f5b875b2c28a8b75fac93c9319285524d0649e"} Mar 11 12:20:35 crc kubenswrapper[4816]: I0311 12:20:35.066150 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"1966e6cd-d10e-468d-9e4f-7484f67202b4","Type":"ContainerStarted","Data":"8b2de9f5a79740ef68c68b7f57f77e1e4a7c7e1b2cf9e2d47a43242ce1a5d655"} Mar 11 12:20:35 crc kubenswrapper[4816]: I0311 12:20:35.071806 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-qt9tz" event={"ID":"3f519dc2-e88b-4e4b-9637-c3e172b81bfa","Type":"ContainerStarted","Data":"7499f2a9acd657b210b2f77e2cefe97fa749ba96e868296d76960eaa9ed38ee8"} Mar 11 12:20:35 crc kubenswrapper[4816]: I0311 12:20:35.071846 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-qt9tz" event={"ID":"3f519dc2-e88b-4e4b-9637-c3e172b81bfa","Type":"ContainerStarted","Data":"f5b3dcc0252b5364b53dcccbdc300f4832e72947bf82b5958b6e65ae5b0eac60"} Mar 11 12:20:35 crc kubenswrapper[4816]: I0311 12:20:35.086021 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"78b435e0-53bf-4f8c-aef9-49b170fc9519","Type":"ContainerStarted","Data":"0ff4ead2a33e6228002ecd5e8665db969fa8aef2732966d20f7187976e9cf4b6"} Mar 11 12:20:35 crc kubenswrapper[4816]: I0311 12:20:35.088600 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"05937283-8ec7-430d-be71-c968e8e97ff1","Type":"ContainerStarted","Data":"20c9b17fece99c2795c0705a485bb232956e3243ab02757479fd7900fae5c7a8"} Mar 11 12:20:35 crc kubenswrapper[4816]: I0311 12:20:35.092469 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"33811121-46de-4941-bd74-18ecaa2c2827","Type":"ContainerStarted","Data":"cc7188d0b18641404663ed171ec3812667a2d4778de79e666b89e8d42f9ec1e9"} Mar 11 12:20:35 crc kubenswrapper[4816]: I0311 12:20:35.124216 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-qt9tz" podStartSLOduration=3.124183157 podStartE2EDuration="3.124183157s" podCreationTimestamp="2026-03-11 12:20:32 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:20:35.103752473 +0000 UTC m=+1321.695016450" watchObservedRunningTime="2026-03-11 12:20:35.124183157 +0000 UTC m=+1321.715447124" Mar 11 12:20:35 crc kubenswrapper[4816]: I0311 12:20:35.141433 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-wdblc" podStartSLOduration=2.141407599 podStartE2EDuration="2.141407599s" podCreationTimestamp="2026-03-11 12:20:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:20:35.132132094 +0000 UTC m=+1321.723396071" watchObservedRunningTime="2026-03-11 12:20:35.141407599 +0000 UTC m=+1321.732671566" Mar 11 12:20:36 crc kubenswrapper[4816]: I0311 12:20:36.984467 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 12:20:36 crc kubenswrapper[4816]: I0311 12:20:36.995500 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 11 12:20:38 crc kubenswrapper[4816]: I0311 12:20:38.181826 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1966e6cd-d10e-468d-9e4f-7484f67202b4","Type":"ContainerStarted","Data":"fadab9a3a0d55f6711af30c592b5f8e3f3642fcbae90f79d83b3428ffab019af"} Mar 11 12:20:38 crc kubenswrapper[4816]: I0311 12:20:38.195288 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"78b435e0-53bf-4f8c-aef9-49b170fc9519","Type":"ContainerStarted","Data":"03715e3af42eb6ce517cc3fbf23a4e260b58184bdafa92056416df870db5e907"} Mar 11 12:20:38 crc kubenswrapper[4816]: I0311 12:20:38.200345 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"05937283-8ec7-430d-be71-c968e8e97ff1","Type":"ContainerStarted","Data":"e9ad6464f8fd694a00b9feffc15da0986d481cd62391df808bf2e196c5e1ad42"} Mar 11 12:20:38 crc kubenswrapper[4816]: I0311 12:20:38.205460 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="33811121-46de-4941-bd74-18ecaa2c2827" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://63a3d6bc49f6f2dab318870b71436674479a2cc4b62d79ccb00a4a4a963f01a0" gracePeriod=30 Mar 11 12:20:38 crc kubenswrapper[4816]: I0311 12:20:38.205543 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"33811121-46de-4941-bd74-18ecaa2c2827","Type":"ContainerStarted","Data":"63a3d6bc49f6f2dab318870b71436674479a2cc4b62d79ccb00a4a4a963f01a0"} Mar 11 12:20:38 crc kubenswrapper[4816]: I0311 12:20:38.210294 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69b4446475-bsnbn" event={"ID":"1370549e-42a3-450d-a28d-47d4a0764f56","Type":"ContainerStarted","Data":"c5bd6af011971fbc8f1597775b66953c97da9697d9d038a8ec30a1201d7a28f6"} Mar 11 12:20:38 crc kubenswrapper[4816]: I0311 12:20:38.210565 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-69b4446475-bsnbn" Mar 11 12:20:38 crc kubenswrapper[4816]: I0311 12:20:38.228988 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.896405128 podStartE2EDuration="6.228968563s" podCreationTimestamp="2026-03-11 12:20:32 +0000 UTC" firstStartedPulling="2026-03-11 12:20:34.280304074 +0000 UTC m=+1320.871568041" lastFinishedPulling="2026-03-11 12:20:37.612867499 +0000 UTC m=+1324.204131476" observedRunningTime="2026-03-11 12:20:38.22431514 +0000 UTC m=+1324.815579107" watchObservedRunningTime="2026-03-11 12:20:38.228968563 +0000 UTC m=+1324.820232530" Mar 11 12:20:38 crc kubenswrapper[4816]: I0311 
12:20:38.258192 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 11 12:20:38 crc kubenswrapper[4816]: I0311 12:20:38.260446 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-69b4446475-bsnbn" podStartSLOduration=6.260422541 podStartE2EDuration="6.260422541s" podCreationTimestamp="2026-03-11 12:20:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:20:38.249951982 +0000 UTC m=+1324.841215949" watchObservedRunningTime="2026-03-11 12:20:38.260422541 +0000 UTC m=+1324.851686508" Mar 11 12:20:38 crc kubenswrapper[4816]: I0311 12:20:38.282481 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.710712743 podStartE2EDuration="6.282451311s" podCreationTimestamp="2026-03-11 12:20:32 +0000 UTC" firstStartedPulling="2026-03-11 12:20:34.040711499 +0000 UTC m=+1320.631975466" lastFinishedPulling="2026-03-11 12:20:37.612450057 +0000 UTC m=+1324.203714034" observedRunningTime="2026-03-11 12:20:38.274524464 +0000 UTC m=+1324.865788431" watchObservedRunningTime="2026-03-11 12:20:38.282451311 +0000 UTC m=+1324.873715278" Mar 11 12:20:39 crc kubenswrapper[4816]: I0311 12:20:39.224185 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1966e6cd-d10e-468d-9e4f-7484f67202b4","Type":"ContainerStarted","Data":"23478b821003ec3bd794fa4b8451ec55b780ab8eff2e47fa7b900d5f00b65e7e"} Mar 11 12:20:39 crc kubenswrapper[4816]: I0311 12:20:39.224350 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1966e6cd-d10e-468d-9e4f-7484f67202b4" containerName="nova-metadata-log" containerID="cri-o://fadab9a3a0d55f6711af30c592b5f8e3f3642fcbae90f79d83b3428ffab019af" gracePeriod=30 Mar 11 12:20:39 crc 
kubenswrapper[4816]: I0311 12:20:39.224455 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1966e6cd-d10e-468d-9e4f-7484f67202b4" containerName="nova-metadata-metadata" containerID="cri-o://23478b821003ec3bd794fa4b8451ec55b780ab8eff2e47fa7b900d5f00b65e7e" gracePeriod=30 Mar 11 12:20:39 crc kubenswrapper[4816]: I0311 12:20:39.227798 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"78b435e0-53bf-4f8c-aef9-49b170fc9519","Type":"ContainerStarted","Data":"40e1356e9811fbd25787534e33a830de5168253a4c4ac07e43758ab056b21a5a"} Mar 11 12:20:39 crc kubenswrapper[4816]: I0311 12:20:39.266543 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.819882126 podStartE2EDuration="7.26651646s" podCreationTimestamp="2026-03-11 12:20:32 +0000 UTC" firstStartedPulling="2026-03-11 12:20:34.165100813 +0000 UTC m=+1320.756364780" lastFinishedPulling="2026-03-11 12:20:37.611735157 +0000 UTC m=+1324.202999114" observedRunningTime="2026-03-11 12:20:39.255949728 +0000 UTC m=+1325.847213695" watchObservedRunningTime="2026-03-11 12:20:39.26651646 +0000 UTC m=+1325.857780427" Mar 11 12:20:39 crc kubenswrapper[4816]: I0311 12:20:39.308116 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=4.101696878 podStartE2EDuration="7.308091478s" podCreationTimestamp="2026-03-11 12:20:32 +0000 UTC" firstStartedPulling="2026-03-11 12:20:34.405351087 +0000 UTC m=+1320.996615054" lastFinishedPulling="2026-03-11 12:20:37.611745687 +0000 UTC m=+1324.203009654" observedRunningTime="2026-03-11 12:20:39.300432139 +0000 UTC m=+1325.891696106" watchObservedRunningTime="2026-03-11 12:20:39.308091478 +0000 UTC m=+1325.899355445" Mar 11 12:20:40 crc kubenswrapper[4816]: I0311 12:20:40.239489 4816 generic.go:334] "Generic (PLEG): container finished" 
podID="1966e6cd-d10e-468d-9e4f-7484f67202b4" containerID="fadab9a3a0d55f6711af30c592b5f8e3f3642fcbae90f79d83b3428ffab019af" exitCode=143 Mar 11 12:20:40 crc kubenswrapper[4816]: I0311 12:20:40.239569 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1966e6cd-d10e-468d-9e4f-7484f67202b4","Type":"ContainerDied","Data":"fadab9a3a0d55f6711af30c592b5f8e3f3642fcbae90f79d83b3428ffab019af"} Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.177721 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.253017 4816 generic.go:334] "Generic (PLEG): container finished" podID="1966e6cd-d10e-468d-9e4f-7484f67202b4" containerID="23478b821003ec3bd794fa4b8451ec55b780ab8eff2e47fa7b900d5f00b65e7e" exitCode=0 Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.253076 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1966e6cd-d10e-468d-9e4f-7484f67202b4","Type":"ContainerDied","Data":"23478b821003ec3bd794fa4b8451ec55b780ab8eff2e47fa7b900d5f00b65e7e"} Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.253112 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1966e6cd-d10e-468d-9e4f-7484f67202b4","Type":"ContainerDied","Data":"8b2de9f5a79740ef68c68b7f57f77e1e4a7c7e1b2cf9e2d47a43242ce1a5d655"} Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.253133 4816 scope.go:117] "RemoveContainer" containerID="23478b821003ec3bd794fa4b8451ec55b780ab8eff2e47fa7b900d5f00b65e7e" Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.254193 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.291404 4816 scope.go:117] "RemoveContainer" containerID="fadab9a3a0d55f6711af30c592b5f8e3f3642fcbae90f79d83b3428ffab019af" Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.317104 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1966e6cd-d10e-468d-9e4f-7484f67202b4-logs\") pod \"1966e6cd-d10e-468d-9e4f-7484f67202b4\" (UID: \"1966e6cd-d10e-468d-9e4f-7484f67202b4\") " Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.317344 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsrdn\" (UniqueName: \"kubernetes.io/projected/1966e6cd-d10e-468d-9e4f-7484f67202b4-kube-api-access-nsrdn\") pod \"1966e6cd-d10e-468d-9e4f-7484f67202b4\" (UID: \"1966e6cd-d10e-468d-9e4f-7484f67202b4\") " Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.317453 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1966e6cd-d10e-468d-9e4f-7484f67202b4-config-data\") pod \"1966e6cd-d10e-468d-9e4f-7484f67202b4\" (UID: \"1966e6cd-d10e-468d-9e4f-7484f67202b4\") " Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.317510 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1966e6cd-d10e-468d-9e4f-7484f67202b4-combined-ca-bundle\") pod \"1966e6cd-d10e-468d-9e4f-7484f67202b4\" (UID: \"1966e6cd-d10e-468d-9e4f-7484f67202b4\") " Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.318187 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1966e6cd-d10e-468d-9e4f-7484f67202b4-logs" (OuterVolumeSpecName: "logs") pod "1966e6cd-d10e-468d-9e4f-7484f67202b4" (UID: "1966e6cd-d10e-468d-9e4f-7484f67202b4"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.318413 4816 scope.go:117] "RemoveContainer" containerID="23478b821003ec3bd794fa4b8451ec55b780ab8eff2e47fa7b900d5f00b65e7e" Mar 11 12:20:41 crc kubenswrapper[4816]: E0311 12:20:41.319182 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23478b821003ec3bd794fa4b8451ec55b780ab8eff2e47fa7b900d5f00b65e7e\": container with ID starting with 23478b821003ec3bd794fa4b8451ec55b780ab8eff2e47fa7b900d5f00b65e7e not found: ID does not exist" containerID="23478b821003ec3bd794fa4b8451ec55b780ab8eff2e47fa7b900d5f00b65e7e" Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.319223 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23478b821003ec3bd794fa4b8451ec55b780ab8eff2e47fa7b900d5f00b65e7e"} err="failed to get container status \"23478b821003ec3bd794fa4b8451ec55b780ab8eff2e47fa7b900d5f00b65e7e\": rpc error: code = NotFound desc = could not find container \"23478b821003ec3bd794fa4b8451ec55b780ab8eff2e47fa7b900d5f00b65e7e\": container with ID starting with 23478b821003ec3bd794fa4b8451ec55b780ab8eff2e47fa7b900d5f00b65e7e not found: ID does not exist" Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.319262 4816 scope.go:117] "RemoveContainer" containerID="fadab9a3a0d55f6711af30c592b5f8e3f3642fcbae90f79d83b3428ffab019af" Mar 11 12:20:41 crc kubenswrapper[4816]: E0311 12:20:41.319721 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fadab9a3a0d55f6711af30c592b5f8e3f3642fcbae90f79d83b3428ffab019af\": container with ID starting with fadab9a3a0d55f6711af30c592b5f8e3f3642fcbae90f79d83b3428ffab019af not found: ID does not exist" containerID="fadab9a3a0d55f6711af30c592b5f8e3f3642fcbae90f79d83b3428ffab019af" Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.319788 
4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fadab9a3a0d55f6711af30c592b5f8e3f3642fcbae90f79d83b3428ffab019af"} err="failed to get container status \"fadab9a3a0d55f6711af30c592b5f8e3f3642fcbae90f79d83b3428ffab019af\": rpc error: code = NotFound desc = could not find container \"fadab9a3a0d55f6711af30c592b5f8e3f3642fcbae90f79d83b3428ffab019af\": container with ID starting with fadab9a3a0d55f6711af30c592b5f8e3f3642fcbae90f79d83b3428ffab019af not found: ID does not exist" Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.333401 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1966e6cd-d10e-468d-9e4f-7484f67202b4-kube-api-access-nsrdn" (OuterVolumeSpecName: "kube-api-access-nsrdn") pod "1966e6cd-d10e-468d-9e4f-7484f67202b4" (UID: "1966e6cd-d10e-468d-9e4f-7484f67202b4"). InnerVolumeSpecName "kube-api-access-nsrdn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.355974 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1966e6cd-d10e-468d-9e4f-7484f67202b4-config-data" (OuterVolumeSpecName: "config-data") pod "1966e6cd-d10e-468d-9e4f-7484f67202b4" (UID: "1966e6cd-d10e-468d-9e4f-7484f67202b4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.365972 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1966e6cd-d10e-468d-9e4f-7484f67202b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1966e6cd-d10e-468d-9e4f-7484f67202b4" (UID: "1966e6cd-d10e-468d-9e4f-7484f67202b4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.421228 4816 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1966e6cd-d10e-468d-9e4f-7484f67202b4-logs\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.421294 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsrdn\" (UniqueName: \"kubernetes.io/projected/1966e6cd-d10e-468d-9e4f-7484f67202b4-kube-api-access-nsrdn\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.421308 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1966e6cd-d10e-468d-9e4f-7484f67202b4-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.421321 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1966e6cd-d10e-468d-9e4f-7484f67202b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.605873 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.624482 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.644917 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 11 12:20:41 crc kubenswrapper[4816]: E0311 12:20:41.645620 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1966e6cd-d10e-468d-9e4f-7484f67202b4" containerName="nova-metadata-metadata" Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.645656 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1966e6cd-d10e-468d-9e4f-7484f67202b4" containerName="nova-metadata-metadata" Mar 11 12:20:41 crc 
kubenswrapper[4816]: E0311 12:20:41.646182 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1966e6cd-d10e-468d-9e4f-7484f67202b4" containerName="nova-metadata-log" Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.646205 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1966e6cd-d10e-468d-9e4f-7484f67202b4" containerName="nova-metadata-log" Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.646529 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="1966e6cd-d10e-468d-9e4f-7484f67202b4" containerName="nova-metadata-log" Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.646564 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="1966e6cd-d10e-468d-9e4f-7484f67202b4" containerName="nova-metadata-metadata" Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.648427 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.652828 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.653185 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.674663 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.727926 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29wd6\" (UniqueName: \"kubernetes.io/projected/421dad23-2283-4534-b064-250972bc1863-kube-api-access-29wd6\") pod \"nova-metadata-0\" (UID: \"421dad23-2283-4534-b064-250972bc1863\") " pod="openstack/nova-metadata-0" Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.727988 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/421dad23-2283-4534-b064-250972bc1863-logs\") pod \"nova-metadata-0\" (UID: \"421dad23-2283-4534-b064-250972bc1863\") " pod="openstack/nova-metadata-0" Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.728153 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/421dad23-2283-4534-b064-250972bc1863-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"421dad23-2283-4534-b064-250972bc1863\") " pod="openstack/nova-metadata-0" Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.728197 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/421dad23-2283-4534-b064-250972bc1863-config-data\") pod \"nova-metadata-0\" (UID: \"421dad23-2283-4534-b064-250972bc1863\") " pod="openstack/nova-metadata-0" Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.728221 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421dad23-2283-4534-b064-250972bc1863-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"421dad23-2283-4534-b064-250972bc1863\") " pod="openstack/nova-metadata-0" Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.830717 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/421dad23-2283-4534-b064-250972bc1863-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"421dad23-2283-4534-b064-250972bc1863\") " pod="openstack/nova-metadata-0" Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.830798 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/421dad23-2283-4534-b064-250972bc1863-config-data\") pod \"nova-metadata-0\" (UID: \"421dad23-2283-4534-b064-250972bc1863\") " pod="openstack/nova-metadata-0" Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.830822 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421dad23-2283-4534-b064-250972bc1863-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"421dad23-2283-4534-b064-250972bc1863\") " pod="openstack/nova-metadata-0" Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.830890 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29wd6\" (UniqueName: \"kubernetes.io/projected/421dad23-2283-4534-b064-250972bc1863-kube-api-access-29wd6\") pod \"nova-metadata-0\" (UID: \"421dad23-2283-4534-b064-250972bc1863\") " pod="openstack/nova-metadata-0" Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.830910 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/421dad23-2283-4534-b064-250972bc1863-logs\") pod \"nova-metadata-0\" (UID: \"421dad23-2283-4534-b064-250972bc1863\") " pod="openstack/nova-metadata-0" Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.831669 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/421dad23-2283-4534-b064-250972bc1863-logs\") pod \"nova-metadata-0\" (UID: \"421dad23-2283-4534-b064-250972bc1863\") " pod="openstack/nova-metadata-0" Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.837538 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/421dad23-2283-4534-b064-250972bc1863-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"421dad23-2283-4534-b064-250972bc1863\") " pod="openstack/nova-metadata-0" Mar 11 12:20:41 crc 
kubenswrapper[4816]: I0311 12:20:41.842635 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/421dad23-2283-4534-b064-250972bc1863-config-data\") pod \"nova-metadata-0\" (UID: \"421dad23-2283-4534-b064-250972bc1863\") " pod="openstack/nova-metadata-0" Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.849229 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421dad23-2283-4534-b064-250972bc1863-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"421dad23-2283-4534-b064-250972bc1863\") " pod="openstack/nova-metadata-0" Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.852201 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29wd6\" (UniqueName: \"kubernetes.io/projected/421dad23-2283-4534-b064-250972bc1863-kube-api-access-29wd6\") pod \"nova-metadata-0\" (UID: \"421dad23-2283-4534-b064-250972bc1863\") " pod="openstack/nova-metadata-0" Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.983134 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 11 12:20:42 crc kubenswrapper[4816]: I0311 12:20:42.156404 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1966e6cd-d10e-468d-9e4f-7484f67202b4" path="/var/lib/kubelet/pods/1966e6cd-d10e-468d-9e4f-7484f67202b4/volumes" Mar 11 12:20:42 crc kubenswrapper[4816]: I0311 12:20:42.534099 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 12:20:42 crc kubenswrapper[4816]: I0311 12:20:42.811061 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:20:43 crc kubenswrapper[4816]: I0311 12:20:43.091962 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 11 12:20:43 crc kubenswrapper[4816]: I0311 12:20:43.092065 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 11 12:20:43 crc kubenswrapper[4816]: I0311 12:20:43.258841 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 11 12:20:43 crc kubenswrapper[4816]: I0311 12:20:43.278491 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-69b4446475-bsnbn" Mar 11 12:20:43 crc kubenswrapper[4816]: I0311 12:20:43.279680 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"421dad23-2283-4534-b064-250972bc1863","Type":"ContainerStarted","Data":"9a07b967847536eab2a5c61594718aad5b432c59b70b5d223ca12c9d44afd618"} Mar 11 12:20:43 crc kubenswrapper[4816]: I0311 12:20:43.279757 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"421dad23-2283-4534-b064-250972bc1863","Type":"ContainerStarted","Data":"7df3059467c64aeccef2f0dfb8f6972acb1a50b5456b5dc1259a2eb0aaddb718"} Mar 11 12:20:43 crc kubenswrapper[4816]: I0311 12:20:43.279777 4816 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"421dad23-2283-4534-b064-250972bc1863","Type":"ContainerStarted","Data":"9fb01dffc484a29c979e5dd44b3227a5cfb654c600c8c00be0f28ed629855af7"} Mar 11 12:20:43 crc kubenswrapper[4816]: I0311 12:20:43.284298 4816 generic.go:334] "Generic (PLEG): container finished" podID="fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac" containerID="393812c2bcecafdfa2f0e8dd848197ee6392877cb2e73b5a2b5b12d642a2ed5c" exitCode=0 Mar 11 12:20:43 crc kubenswrapper[4816]: I0311 12:20:43.284355 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wdblc" event={"ID":"fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac","Type":"ContainerDied","Data":"393812c2bcecafdfa2f0e8dd848197ee6392877cb2e73b5a2b5b12d642a2ed5c"} Mar 11 12:20:43 crc kubenswrapper[4816]: I0311 12:20:43.293450 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 11 12:20:43 crc kubenswrapper[4816]: I0311 12:20:43.346912 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 11 12:20:43 crc kubenswrapper[4816]: I0311 12:20:43.367959 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.367933523 podStartE2EDuration="2.367933523s" podCreationTimestamp="2026-03-11 12:20:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:20:43.325302775 +0000 UTC m=+1329.916566742" watchObservedRunningTime="2026-03-11 12:20:43.367933523 +0000 UTC m=+1329.959197490" Mar 11 12:20:43 crc kubenswrapper[4816]: I0311 12:20:43.391411 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58b85ccffc-7gcck"] Mar 11 12:20:43 crc kubenswrapper[4816]: I0311 12:20:43.391701 4816 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" podUID="1f7f295b-c30d-49a7-b5fa-b1ae8f705589" containerName="dnsmasq-dns" containerID="cri-o://2e2f32cf352c18f7e4bac10b432260956461e7a4bef8cc47289dc42ec91bc8c2" gracePeriod=10 Mar 11 12:20:43 crc kubenswrapper[4816]: I0311 12:20:43.847119 4816 scope.go:117] "RemoveContainer" containerID="37568547e2b255f52263c2130857ff28c18773cdb28a0d8fb13178ff2dc5ab7f" Mar 11 12:20:43 crc kubenswrapper[4816]: I0311 12:20:43.984263 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.098833 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-dns-svc\") pod \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\" (UID: \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\") " Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.098901 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-ovsdbserver-nb\") pod \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\" (UID: \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\") " Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.099110 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-config\") pod \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\" (UID: \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\") " Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.099190 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-ovsdbserver-sb\") pod \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\" (UID: \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\") " Mar 11 12:20:44 
crc kubenswrapper[4816]: I0311 12:20:44.099225 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-dns-swift-storage-0\") pod \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\" (UID: \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\") " Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.099341 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwh8q\" (UniqueName: \"kubernetes.io/projected/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-kube-api-access-kwh8q\") pod \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\" (UID: \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\") " Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.110540 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-kube-api-access-kwh8q" (OuterVolumeSpecName: "kube-api-access-kwh8q") pod "1f7f295b-c30d-49a7-b5fa-b1ae8f705589" (UID: "1f7f295b-c30d-49a7-b5fa-b1ae8f705589"). InnerVolumeSpecName "kube-api-access-kwh8q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.139819 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="78b435e0-53bf-4f8c-aef9-49b170fc9519" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.189:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.142141 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="78b435e0-53bf-4f8c-aef9-49b170fc9519" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.189:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.184667 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1f7f295b-c30d-49a7-b5fa-b1ae8f705589" (UID: "1f7f295b-c30d-49a7-b5fa-b1ae8f705589"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.204776 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.204815 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwh8q\" (UniqueName: \"kubernetes.io/projected/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-kube-api-access-kwh8q\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.209996 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1f7f295b-c30d-49a7-b5fa-b1ae8f705589" (UID: "1f7f295b-c30d-49a7-b5fa-b1ae8f705589"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.210266 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1f7f295b-c30d-49a7-b5fa-b1ae8f705589" (UID: "1f7f295b-c30d-49a7-b5fa-b1ae8f705589"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.213665 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-config" (OuterVolumeSpecName: "config") pod "1f7f295b-c30d-49a7-b5fa-b1ae8f705589" (UID: "1f7f295b-c30d-49a7-b5fa-b1ae8f705589"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.241007 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1f7f295b-c30d-49a7-b5fa-b1ae8f705589" (UID: "1f7f295b-c30d-49a7-b5fa-b1ae8f705589"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.307430 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.307477 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.307490 4816 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.307501 4816 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.309401 4816 generic.go:334] "Generic (PLEG): container finished" podID="1f7f295b-c30d-49a7-b5fa-b1ae8f705589" containerID="2e2f32cf352c18f7e4bac10b432260956461e7a4bef8cc47289dc42ec91bc8c2" exitCode=0 Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.309472 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.309540 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" event={"ID":"1f7f295b-c30d-49a7-b5fa-b1ae8f705589","Type":"ContainerDied","Data":"2e2f32cf352c18f7e4bac10b432260956461e7a4bef8cc47289dc42ec91bc8c2"} Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.309574 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" event={"ID":"1f7f295b-c30d-49a7-b5fa-b1ae8f705589","Type":"ContainerDied","Data":"22b1daa75682bd6ac40d3753e3d1220fc2183f782012b1a65bc963f4cb8ba7ec"} Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.309598 4816 scope.go:117] "RemoveContainer" containerID="2e2f32cf352c18f7e4bac10b432260956461e7a4bef8cc47289dc42ec91bc8c2" Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.355049 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58b85ccffc-7gcck"] Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.357269 4816 scope.go:117] "RemoveContainer" containerID="93f7e9a29f416a66c2cb4ed0a6e544aeef7946ff21347d285940dc6a7bb96603" Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.364496 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58b85ccffc-7gcck"] Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.399412 4816 scope.go:117] "RemoveContainer" containerID="2e2f32cf352c18f7e4bac10b432260956461e7a4bef8cc47289dc42ec91bc8c2" Mar 11 12:20:44 crc kubenswrapper[4816]: E0311 12:20:44.400098 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e2f32cf352c18f7e4bac10b432260956461e7a4bef8cc47289dc42ec91bc8c2\": container with ID starting with 2e2f32cf352c18f7e4bac10b432260956461e7a4bef8cc47289dc42ec91bc8c2 not found: ID does not exist" 
containerID="2e2f32cf352c18f7e4bac10b432260956461e7a4bef8cc47289dc42ec91bc8c2" Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.400150 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e2f32cf352c18f7e4bac10b432260956461e7a4bef8cc47289dc42ec91bc8c2"} err="failed to get container status \"2e2f32cf352c18f7e4bac10b432260956461e7a4bef8cc47289dc42ec91bc8c2\": rpc error: code = NotFound desc = could not find container \"2e2f32cf352c18f7e4bac10b432260956461e7a4bef8cc47289dc42ec91bc8c2\": container with ID starting with 2e2f32cf352c18f7e4bac10b432260956461e7a4bef8cc47289dc42ec91bc8c2 not found: ID does not exist" Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.400187 4816 scope.go:117] "RemoveContainer" containerID="93f7e9a29f416a66c2cb4ed0a6e544aeef7946ff21347d285940dc6a7bb96603" Mar 11 12:20:44 crc kubenswrapper[4816]: E0311 12:20:44.400614 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93f7e9a29f416a66c2cb4ed0a6e544aeef7946ff21347d285940dc6a7bb96603\": container with ID starting with 93f7e9a29f416a66c2cb4ed0a6e544aeef7946ff21347d285940dc6a7bb96603 not found: ID does not exist" containerID="93f7e9a29f416a66c2cb4ed0a6e544aeef7946ff21347d285940dc6a7bb96603" Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.400649 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93f7e9a29f416a66c2cb4ed0a6e544aeef7946ff21347d285940dc6a7bb96603"} err="failed to get container status \"93f7e9a29f416a66c2cb4ed0a6e544aeef7946ff21347d285940dc6a7bb96603\": rpc error: code = NotFound desc = could not find container \"93f7e9a29f416a66c2cb4ed0a6e544aeef7946ff21347d285940dc6a7bb96603\": container with ID starting with 93f7e9a29f416a66c2cb4ed0a6e544aeef7946ff21347d285940dc6a7bb96603 not found: ID does not exist" Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.644342 4816 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wdblc" Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.816909 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac-config-data\") pod \"fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac\" (UID: \"fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac\") " Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.817537 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac-combined-ca-bundle\") pod \"fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac\" (UID: \"fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac\") " Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.817619 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66txc\" (UniqueName: \"kubernetes.io/projected/fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac-kube-api-access-66txc\") pod \"fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac\" (UID: \"fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac\") " Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.818445 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac-scripts\") pod \"fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac\" (UID: \"fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac\") " Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.824620 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac-kube-api-access-66txc" (OuterVolumeSpecName: "kube-api-access-66txc") pod "fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac" (UID: "fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac"). InnerVolumeSpecName "kube-api-access-66txc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.825518 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac-scripts" (OuterVolumeSpecName: "scripts") pod "fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac" (UID: "fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.852217 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac-config-data" (OuterVolumeSpecName: "config-data") pod "fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac" (UID: "fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.856931 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac" (UID: "fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.920549 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.920583 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.920597 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66txc\" (UniqueName: \"kubernetes.io/projected/fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac-kube-api-access-66txc\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.920606 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:45 crc kubenswrapper[4816]: I0311 12:20:45.322113 4816 generic.go:334] "Generic (PLEG): container finished" podID="3f519dc2-e88b-4e4b-9637-c3e172b81bfa" containerID="7499f2a9acd657b210b2f77e2cefe97fa749ba96e868296d76960eaa9ed38ee8" exitCode=0 Mar 11 12:20:45 crc kubenswrapper[4816]: I0311 12:20:45.322213 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-qt9tz" event={"ID":"3f519dc2-e88b-4e4b-9637-c3e172b81bfa","Type":"ContainerDied","Data":"7499f2a9acd657b210b2f77e2cefe97fa749ba96e868296d76960eaa9ed38ee8"} Mar 11 12:20:45 crc kubenswrapper[4816]: I0311 12:20:45.325688 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wdblc" 
event={"ID":"fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac","Type":"ContainerDied","Data":"f816942d05a050c01aa9ba2c41f5b875b2c28a8b75fac93c9319285524d0649e"} Mar 11 12:20:45 crc kubenswrapper[4816]: I0311 12:20:45.325748 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f816942d05a050c01aa9ba2c41f5b875b2c28a8b75fac93c9319285524d0649e" Mar 11 12:20:45 crc kubenswrapper[4816]: I0311 12:20:45.325790 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wdblc" Mar 11 12:20:45 crc kubenswrapper[4816]: I0311 12:20:45.465490 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 11 12:20:45 crc kubenswrapper[4816]: E0311 12:20:45.466010 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f7f295b-c30d-49a7-b5fa-b1ae8f705589" containerName="dnsmasq-dns" Mar 11 12:20:45 crc kubenswrapper[4816]: I0311 12:20:45.466026 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f7f295b-c30d-49a7-b5fa-b1ae8f705589" containerName="dnsmasq-dns" Mar 11 12:20:45 crc kubenswrapper[4816]: E0311 12:20:45.466046 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f7f295b-c30d-49a7-b5fa-b1ae8f705589" containerName="init" Mar 11 12:20:45 crc kubenswrapper[4816]: I0311 12:20:45.466052 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f7f295b-c30d-49a7-b5fa-b1ae8f705589" containerName="init" Mar 11 12:20:45 crc kubenswrapper[4816]: E0311 12:20:45.466059 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac" containerName="nova-cell1-conductor-db-sync" Mar 11 12:20:45 crc kubenswrapper[4816]: I0311 12:20:45.466066 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac" containerName="nova-cell1-conductor-db-sync" Mar 11 12:20:45 crc kubenswrapper[4816]: I0311 12:20:45.466312 4816 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="1f7f295b-c30d-49a7-b5fa-b1ae8f705589" containerName="dnsmasq-dns" Mar 11 12:20:45 crc kubenswrapper[4816]: I0311 12:20:45.466343 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac" containerName="nova-cell1-conductor-db-sync" Mar 11 12:20:45 crc kubenswrapper[4816]: I0311 12:20:45.467472 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 11 12:20:45 crc kubenswrapper[4816]: I0311 12:20:45.470671 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 11 12:20:45 crc kubenswrapper[4816]: I0311 12:20:45.480489 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 11 12:20:45 crc kubenswrapper[4816]: I0311 12:20:45.636973 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63567eba-cc2a-4168-9e81-51c1daed5482-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"63567eba-cc2a-4168-9e81-51c1daed5482\") " pod="openstack/nova-cell1-conductor-0" Mar 11 12:20:45 crc kubenswrapper[4816]: I0311 12:20:45.637087 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63567eba-cc2a-4168-9e81-51c1daed5482-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"63567eba-cc2a-4168-9e81-51c1daed5482\") " pod="openstack/nova-cell1-conductor-0" Mar 11 12:20:45 crc kubenswrapper[4816]: I0311 12:20:45.637177 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcvt9\" (UniqueName: \"kubernetes.io/projected/63567eba-cc2a-4168-9e81-51c1daed5482-kube-api-access-jcvt9\") pod \"nova-cell1-conductor-0\" (UID: \"63567eba-cc2a-4168-9e81-51c1daed5482\") " 
pod="openstack/nova-cell1-conductor-0" Mar 11 12:20:45 crc kubenswrapper[4816]: I0311 12:20:45.740041 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63567eba-cc2a-4168-9e81-51c1daed5482-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"63567eba-cc2a-4168-9e81-51c1daed5482\") " pod="openstack/nova-cell1-conductor-0" Mar 11 12:20:45 crc kubenswrapper[4816]: I0311 12:20:45.740106 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63567eba-cc2a-4168-9e81-51c1daed5482-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"63567eba-cc2a-4168-9e81-51c1daed5482\") " pod="openstack/nova-cell1-conductor-0" Mar 11 12:20:45 crc kubenswrapper[4816]: I0311 12:20:45.740143 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcvt9\" (UniqueName: \"kubernetes.io/projected/63567eba-cc2a-4168-9e81-51c1daed5482-kube-api-access-jcvt9\") pod \"nova-cell1-conductor-0\" (UID: \"63567eba-cc2a-4168-9e81-51c1daed5482\") " pod="openstack/nova-cell1-conductor-0" Mar 11 12:20:45 crc kubenswrapper[4816]: I0311 12:20:45.744688 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63567eba-cc2a-4168-9e81-51c1daed5482-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"63567eba-cc2a-4168-9e81-51c1daed5482\") " pod="openstack/nova-cell1-conductor-0" Mar 11 12:20:45 crc kubenswrapper[4816]: I0311 12:20:45.758440 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63567eba-cc2a-4168-9e81-51c1daed5482-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"63567eba-cc2a-4168-9e81-51c1daed5482\") " pod="openstack/nova-cell1-conductor-0" Mar 11 12:20:45 crc kubenswrapper[4816]: I0311 12:20:45.759918 4816 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcvt9\" (UniqueName: \"kubernetes.io/projected/63567eba-cc2a-4168-9e81-51c1daed5482-kube-api-access-jcvt9\") pod \"nova-cell1-conductor-0\" (UID: \"63567eba-cc2a-4168-9e81-51c1daed5482\") " pod="openstack/nova-cell1-conductor-0" Mar 11 12:20:45 crc kubenswrapper[4816]: I0311 12:20:45.801998 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 11 12:20:46 crc kubenswrapper[4816]: I0311 12:20:46.146352 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f7f295b-c30d-49a7-b5fa-b1ae8f705589" path="/var/lib/kubelet/pods/1f7f295b-c30d-49a7-b5fa-b1ae8f705589/volumes" Mar 11 12:20:46 crc kubenswrapper[4816]: I0311 12:20:46.294756 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 11 12:20:46 crc kubenswrapper[4816]: W0311 12:20:46.298569 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63567eba_cc2a_4168_9e81_51c1daed5482.slice/crio-188caecd38c19e3561f318ca76a8032bcaad31be23f5090529c90fb8dfd7f7e7 WatchSource:0}: Error finding container 188caecd38c19e3561f318ca76a8032bcaad31be23f5090529c90fb8dfd7f7e7: Status 404 returned error can't find the container with id 188caecd38c19e3561f318ca76a8032bcaad31be23f5090529c90fb8dfd7f7e7 Mar 11 12:20:46 crc kubenswrapper[4816]: I0311 12:20:46.336729 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"63567eba-cc2a-4168-9e81-51c1daed5482","Type":"ContainerStarted","Data":"188caecd38c19e3561f318ca76a8032bcaad31be23f5090529c90fb8dfd7f7e7"} Mar 11 12:20:46 crc kubenswrapper[4816]: I0311 12:20:46.746279 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-qt9tz" Mar 11 12:20:46 crc kubenswrapper[4816]: I0311 12:20:46.869694 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f519dc2-e88b-4e4b-9637-c3e172b81bfa-combined-ca-bundle\") pod \"3f519dc2-e88b-4e4b-9637-c3e172b81bfa\" (UID: \"3f519dc2-e88b-4e4b-9637-c3e172b81bfa\") " Mar 11 12:20:46 crc kubenswrapper[4816]: I0311 12:20:46.869761 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fs7hr\" (UniqueName: \"kubernetes.io/projected/3f519dc2-e88b-4e4b-9637-c3e172b81bfa-kube-api-access-fs7hr\") pod \"3f519dc2-e88b-4e4b-9637-c3e172b81bfa\" (UID: \"3f519dc2-e88b-4e4b-9637-c3e172b81bfa\") " Mar 11 12:20:46 crc kubenswrapper[4816]: I0311 12:20:46.869815 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f519dc2-e88b-4e4b-9637-c3e172b81bfa-config-data\") pod \"3f519dc2-e88b-4e4b-9637-c3e172b81bfa\" (UID: \"3f519dc2-e88b-4e4b-9637-c3e172b81bfa\") " Mar 11 12:20:46 crc kubenswrapper[4816]: I0311 12:20:46.869991 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f519dc2-e88b-4e4b-9637-c3e172b81bfa-scripts\") pod \"3f519dc2-e88b-4e4b-9637-c3e172b81bfa\" (UID: \"3f519dc2-e88b-4e4b-9637-c3e172b81bfa\") " Mar 11 12:20:46 crc kubenswrapper[4816]: I0311 12:20:46.875589 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f519dc2-e88b-4e4b-9637-c3e172b81bfa-kube-api-access-fs7hr" (OuterVolumeSpecName: "kube-api-access-fs7hr") pod "3f519dc2-e88b-4e4b-9637-c3e172b81bfa" (UID: "3f519dc2-e88b-4e4b-9637-c3e172b81bfa"). InnerVolumeSpecName "kube-api-access-fs7hr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:20:46 crc kubenswrapper[4816]: I0311 12:20:46.875881 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f519dc2-e88b-4e4b-9637-c3e172b81bfa-scripts" (OuterVolumeSpecName: "scripts") pod "3f519dc2-e88b-4e4b-9637-c3e172b81bfa" (UID: "3f519dc2-e88b-4e4b-9637-c3e172b81bfa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:46 crc kubenswrapper[4816]: I0311 12:20:46.903949 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f519dc2-e88b-4e4b-9637-c3e172b81bfa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f519dc2-e88b-4e4b-9637-c3e172b81bfa" (UID: "3f519dc2-e88b-4e4b-9637-c3e172b81bfa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:46 crc kubenswrapper[4816]: I0311 12:20:46.914110 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f519dc2-e88b-4e4b-9637-c3e172b81bfa-config-data" (OuterVolumeSpecName: "config-data") pod "3f519dc2-e88b-4e4b-9637-c3e172b81bfa" (UID: "3f519dc2-e88b-4e4b-9637-c3e172b81bfa"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:46 crc kubenswrapper[4816]: I0311 12:20:46.971958 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f519dc2-e88b-4e4b-9637-c3e172b81bfa-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:46 crc kubenswrapper[4816]: I0311 12:20:46.972001 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f519dc2-e88b-4e4b-9637-c3e172b81bfa-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:46 crc kubenswrapper[4816]: I0311 12:20:46.972009 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f519dc2-e88b-4e4b-9637-c3e172b81bfa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:46 crc kubenswrapper[4816]: I0311 12:20:46.972022 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fs7hr\" (UniqueName: \"kubernetes.io/projected/3f519dc2-e88b-4e4b-9637-c3e172b81bfa-kube-api-access-fs7hr\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:46 crc kubenswrapper[4816]: I0311 12:20:46.983862 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 11 12:20:46 crc kubenswrapper[4816]: I0311 12:20:46.983934 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 11 12:20:47 crc kubenswrapper[4816]: I0311 12:20:47.348060 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"63567eba-cc2a-4168-9e81-51c1daed5482","Type":"ContainerStarted","Data":"adc85e912176222f128333dea158980c88ef84553f1cf56cb52f64a7b64c83d6"} Mar 11 12:20:47 crc kubenswrapper[4816]: I0311 12:20:47.349860 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 11 12:20:47 crc kubenswrapper[4816]: I0311 12:20:47.352394 4816 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-qt9tz" event={"ID":"3f519dc2-e88b-4e4b-9637-c3e172b81bfa","Type":"ContainerDied","Data":"f5b3dcc0252b5364b53dcccbdc300f4832e72947bf82b5958b6e65ae5b0eac60"} Mar 11 12:20:47 crc kubenswrapper[4816]: I0311 12:20:47.352443 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5b3dcc0252b5364b53dcccbdc300f4832e72947bf82b5958b6e65ae5b0eac60" Mar 11 12:20:47 crc kubenswrapper[4816]: I0311 12:20:47.352510 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-qt9tz" Mar 11 12:20:47 crc kubenswrapper[4816]: I0311 12:20:47.385524 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.385492961 podStartE2EDuration="2.385492961s" podCreationTimestamp="2026-03-11 12:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:20:47.372804218 +0000 UTC m=+1333.964068185" watchObservedRunningTime="2026-03-11 12:20:47.385492961 +0000 UTC m=+1333.976756928" Mar 11 12:20:47 crc kubenswrapper[4816]: I0311 12:20:47.528108 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 11 12:20:47 crc kubenswrapper[4816]: I0311 12:20:47.528480 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="78b435e0-53bf-4f8c-aef9-49b170fc9519" containerName="nova-api-log" containerID="cri-o://03715e3af42eb6ce517cc3fbf23a4e260b58184bdafa92056416df870db5e907" gracePeriod=30 Mar 11 12:20:47 crc kubenswrapper[4816]: I0311 12:20:47.528627 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="78b435e0-53bf-4f8c-aef9-49b170fc9519" containerName="nova-api-api" 
containerID="cri-o://40e1356e9811fbd25787534e33a830de5168253a4c4ac07e43758ab056b21a5a" gracePeriod=30 Mar 11 12:20:47 crc kubenswrapper[4816]: I0311 12:20:47.538172 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 11 12:20:47 crc kubenswrapper[4816]: I0311 12:20:47.538486 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="05937283-8ec7-430d-be71-c968e8e97ff1" containerName="nova-scheduler-scheduler" containerID="cri-o://e9ad6464f8fd694a00b9feffc15da0986d481cd62391df808bf2e196c5e1ad42" gracePeriod=30 Mar 11 12:20:47 crc kubenswrapper[4816]: I0311 12:20:47.559102 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 12:20:47 crc kubenswrapper[4816]: I0311 12:20:47.559612 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="421dad23-2283-4534-b064-250972bc1863" containerName="nova-metadata-log" containerID="cri-o://7df3059467c64aeccef2f0dfb8f6972acb1a50b5456b5dc1259a2eb0aaddb718" gracePeriod=30 Mar 11 12:20:47 crc kubenswrapper[4816]: I0311 12:20:47.559728 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="421dad23-2283-4534-b064-250972bc1863" containerName="nova-metadata-metadata" containerID="cri-o://9a07b967847536eab2a5c61594718aad5b432c59b70b5d223ca12c9d44afd618" gracePeriod=30 Mar 11 12:20:48 crc kubenswrapper[4816]: E0311 12:20:48.261643 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e9ad6464f8fd694a00b9feffc15da0986d481cd62391df808bf2e196c5e1ad42" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 11 12:20:48 crc kubenswrapper[4816]: E0311 12:20:48.267163 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc 
error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e9ad6464f8fd694a00b9feffc15da0986d481cd62391df808bf2e196c5e1ad42" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 11 12:20:48 crc kubenswrapper[4816]: E0311 12:20:48.268904 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e9ad6464f8fd694a00b9feffc15da0986d481cd62391df808bf2e196c5e1ad42" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 11 12:20:48 crc kubenswrapper[4816]: E0311 12:20:48.268948 4816 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="05937283-8ec7-430d-be71-c968e8e97ff1" containerName="nova-scheduler-scheduler" Mar 11 12:20:48 crc kubenswrapper[4816]: I0311 12:20:48.380880 4816 generic.go:334] "Generic (PLEG): container finished" podID="421dad23-2283-4534-b064-250972bc1863" containerID="9a07b967847536eab2a5c61594718aad5b432c59b70b5d223ca12c9d44afd618" exitCode=0 Mar 11 12:20:48 crc kubenswrapper[4816]: I0311 12:20:48.380935 4816 generic.go:334] "Generic (PLEG): container finished" podID="421dad23-2283-4534-b064-250972bc1863" containerID="7df3059467c64aeccef2f0dfb8f6972acb1a50b5456b5dc1259a2eb0aaddb718" exitCode=143 Mar 11 12:20:48 crc kubenswrapper[4816]: I0311 12:20:48.381007 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"421dad23-2283-4534-b064-250972bc1863","Type":"ContainerDied","Data":"9a07b967847536eab2a5c61594718aad5b432c59b70b5d223ca12c9d44afd618"} Mar 11 12:20:48 crc kubenswrapper[4816]: I0311 12:20:48.381097 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"421dad23-2283-4534-b064-250972bc1863","Type":"ContainerDied","Data":"7df3059467c64aeccef2f0dfb8f6972acb1a50b5456b5dc1259a2eb0aaddb718"} Mar 11 12:20:48 crc kubenswrapper[4816]: I0311 12:20:48.384125 4816 generic.go:334] "Generic (PLEG): container finished" podID="78b435e0-53bf-4f8c-aef9-49b170fc9519" containerID="03715e3af42eb6ce517cc3fbf23a4e260b58184bdafa92056416df870db5e907" exitCode=143 Mar 11 12:20:48 crc kubenswrapper[4816]: I0311 12:20:48.384214 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"78b435e0-53bf-4f8c-aef9-49b170fc9519","Type":"ContainerDied","Data":"03715e3af42eb6ce517cc3fbf23a4e260b58184bdafa92056416df870db5e907"} Mar 11 12:20:48 crc kubenswrapper[4816]: I0311 12:20:48.514414 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 11 12:20:48 crc kubenswrapper[4816]: I0311 12:20:48.607735 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/421dad23-2283-4534-b064-250972bc1863-config-data\") pod \"421dad23-2283-4534-b064-250972bc1863\" (UID: \"421dad23-2283-4534-b064-250972bc1863\") " Mar 11 12:20:48 crc kubenswrapper[4816]: I0311 12:20:48.608380 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/421dad23-2283-4534-b064-250972bc1863-nova-metadata-tls-certs\") pod \"421dad23-2283-4534-b064-250972bc1863\" (UID: \"421dad23-2283-4534-b064-250972bc1863\") " Mar 11 12:20:48 crc kubenswrapper[4816]: I0311 12:20:48.608613 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/421dad23-2283-4534-b064-250972bc1863-logs\") pod \"421dad23-2283-4534-b064-250972bc1863\" (UID: \"421dad23-2283-4534-b064-250972bc1863\") " Mar 11 12:20:48 crc kubenswrapper[4816]: I0311 12:20:48.608830 4816 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29wd6\" (UniqueName: \"kubernetes.io/projected/421dad23-2283-4534-b064-250972bc1863-kube-api-access-29wd6\") pod \"421dad23-2283-4534-b064-250972bc1863\" (UID: \"421dad23-2283-4534-b064-250972bc1863\") " Mar 11 12:20:48 crc kubenswrapper[4816]: I0311 12:20:48.609003 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421dad23-2283-4534-b064-250972bc1863-combined-ca-bundle\") pod \"421dad23-2283-4534-b064-250972bc1863\" (UID: \"421dad23-2283-4534-b064-250972bc1863\") " Mar 11 12:20:48 crc kubenswrapper[4816]: I0311 12:20:48.609048 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/421dad23-2283-4534-b064-250972bc1863-logs" (OuterVolumeSpecName: "logs") pod "421dad23-2283-4534-b064-250972bc1863" (UID: "421dad23-2283-4534-b064-250972bc1863"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:20:48 crc kubenswrapper[4816]: I0311 12:20:48.609858 4816 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/421dad23-2283-4534-b064-250972bc1863-logs\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:48 crc kubenswrapper[4816]: I0311 12:20:48.617019 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/421dad23-2283-4534-b064-250972bc1863-kube-api-access-29wd6" (OuterVolumeSpecName: "kube-api-access-29wd6") pod "421dad23-2283-4534-b064-250972bc1863" (UID: "421dad23-2283-4534-b064-250972bc1863"). InnerVolumeSpecName "kube-api-access-29wd6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:20:48 crc kubenswrapper[4816]: I0311 12:20:48.637523 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/421dad23-2283-4534-b064-250972bc1863-config-data" (OuterVolumeSpecName: "config-data") pod "421dad23-2283-4534-b064-250972bc1863" (UID: "421dad23-2283-4534-b064-250972bc1863"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:48 crc kubenswrapper[4816]: I0311 12:20:48.639059 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/421dad23-2283-4534-b064-250972bc1863-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "421dad23-2283-4534-b064-250972bc1863" (UID: "421dad23-2283-4534-b064-250972bc1863"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:48 crc kubenswrapper[4816]: I0311 12:20:48.676368 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/421dad23-2283-4534-b064-250972bc1863-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "421dad23-2283-4534-b064-250972bc1863" (UID: "421dad23-2283-4534-b064-250972bc1863"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:48 crc kubenswrapper[4816]: I0311 12:20:48.711594 4816 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/421dad23-2283-4534-b064-250972bc1863-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:48 crc kubenswrapper[4816]: I0311 12:20:48.711696 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29wd6\" (UniqueName: \"kubernetes.io/projected/421dad23-2283-4534-b064-250972bc1863-kube-api-access-29wd6\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:48 crc kubenswrapper[4816]: I0311 12:20:48.711710 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421dad23-2283-4534-b064-250972bc1863-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:48 crc kubenswrapper[4816]: I0311 12:20:48.711721 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/421dad23-2283-4534-b064-250972bc1863-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.395837 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"421dad23-2283-4534-b064-250972bc1863","Type":"ContainerDied","Data":"9fb01dffc484a29c979e5dd44b3227a5cfb654c600c8c00be0f28ed629855af7"} Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.395905 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.395931 4816 scope.go:117] "RemoveContainer" containerID="9a07b967847536eab2a5c61594718aad5b432c59b70b5d223ca12c9d44afd618" Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.431380 4816 scope.go:117] "RemoveContainer" containerID="7df3059467c64aeccef2f0dfb8f6972acb1a50b5456b5dc1259a2eb0aaddb718" Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.443908 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.455705 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.465597 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 11 12:20:49 crc kubenswrapper[4816]: E0311 12:20:49.466667 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="421dad23-2283-4534-b064-250972bc1863" containerName="nova-metadata-log" Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.466687 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="421dad23-2283-4534-b064-250972bc1863" containerName="nova-metadata-log" Mar 11 12:20:49 crc kubenswrapper[4816]: E0311 12:20:49.466702 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f519dc2-e88b-4e4b-9637-c3e172b81bfa" containerName="nova-manage" Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.466710 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f519dc2-e88b-4e4b-9637-c3e172b81bfa" containerName="nova-manage" Mar 11 12:20:49 crc kubenswrapper[4816]: E0311 12:20:49.466728 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="421dad23-2283-4534-b064-250972bc1863" containerName="nova-metadata-metadata" Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.466736 4816 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="421dad23-2283-4534-b064-250972bc1863" containerName="nova-metadata-metadata" Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.466951 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="421dad23-2283-4534-b064-250972bc1863" containerName="nova-metadata-metadata" Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.466972 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f519dc2-e88b-4e4b-9637-c3e172b81bfa" containerName="nova-manage" Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.466983 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="421dad23-2283-4534-b064-250972bc1863" containerName="nova-metadata-log" Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.468335 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.471601 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.471878 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.485834 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.560728 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.634133 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b8751c6-ef60-400a-b4e3-0042d63c2d83-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3b8751c6-ef60-400a-b4e3-0042d63c2d83\") " pod="openstack/nova-metadata-0" Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.634363 4816 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b8751c6-ef60-400a-b4e3-0042d63c2d83-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3b8751c6-ef60-400a-b4e3-0042d63c2d83\") " pod="openstack/nova-metadata-0" Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.634408 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b8751c6-ef60-400a-b4e3-0042d63c2d83-logs\") pod \"nova-metadata-0\" (UID: \"3b8751c6-ef60-400a-b4e3-0042d63c2d83\") " pod="openstack/nova-metadata-0" Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.634433 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzfsx\" (UniqueName: \"kubernetes.io/projected/3b8751c6-ef60-400a-b4e3-0042d63c2d83-kube-api-access-bzfsx\") pod \"nova-metadata-0\" (UID: \"3b8751c6-ef60-400a-b4e3-0042d63c2d83\") " pod="openstack/nova-metadata-0" Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.634632 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b8751c6-ef60-400a-b4e3-0042d63c2d83-config-data\") pod \"nova-metadata-0\" (UID: \"3b8751c6-ef60-400a-b4e3-0042d63c2d83\") " pod="openstack/nova-metadata-0" Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.737481 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzfsx\" (UniqueName: \"kubernetes.io/projected/3b8751c6-ef60-400a-b4e3-0042d63c2d83-kube-api-access-bzfsx\") pod \"nova-metadata-0\" (UID: \"3b8751c6-ef60-400a-b4e3-0042d63c2d83\") " pod="openstack/nova-metadata-0" Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.737607 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3b8751c6-ef60-400a-b4e3-0042d63c2d83-config-data\") pod \"nova-metadata-0\" (UID: \"3b8751c6-ef60-400a-b4e3-0042d63c2d83\") " pod="openstack/nova-metadata-0" Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.737755 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b8751c6-ef60-400a-b4e3-0042d63c2d83-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3b8751c6-ef60-400a-b4e3-0042d63c2d83\") " pod="openstack/nova-metadata-0" Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.737863 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b8751c6-ef60-400a-b4e3-0042d63c2d83-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3b8751c6-ef60-400a-b4e3-0042d63c2d83\") " pod="openstack/nova-metadata-0" Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.737915 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b8751c6-ef60-400a-b4e3-0042d63c2d83-logs\") pod \"nova-metadata-0\" (UID: \"3b8751c6-ef60-400a-b4e3-0042d63c2d83\") " pod="openstack/nova-metadata-0" Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.738513 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b8751c6-ef60-400a-b4e3-0042d63c2d83-logs\") pod \"nova-metadata-0\" (UID: \"3b8751c6-ef60-400a-b4e3-0042d63c2d83\") " pod="openstack/nova-metadata-0" Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.746642 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b8751c6-ef60-400a-b4e3-0042d63c2d83-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3b8751c6-ef60-400a-b4e3-0042d63c2d83\") " pod="openstack/nova-metadata-0" Mar 11 12:20:49 crc 
kubenswrapper[4816]: I0311 12:20:49.747450 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b8751c6-ef60-400a-b4e3-0042d63c2d83-config-data\") pod \"nova-metadata-0\" (UID: \"3b8751c6-ef60-400a-b4e3-0042d63c2d83\") " pod="openstack/nova-metadata-0" Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.748185 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b8751c6-ef60-400a-b4e3-0042d63c2d83-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3b8751c6-ef60-400a-b4e3-0042d63c2d83\") " pod="openstack/nova-metadata-0" Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.762424 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzfsx\" (UniqueName: \"kubernetes.io/projected/3b8751c6-ef60-400a-b4e3-0042d63c2d83-kube-api-access-bzfsx\") pod \"nova-metadata-0\" (UID: \"3b8751c6-ef60-400a-b4e3-0042d63c2d83\") " pod="openstack/nova-metadata-0" Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.798152 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 11 12:20:50 crc kubenswrapper[4816]: I0311 12:20:50.155376 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="421dad23-2283-4534-b064-250972bc1863" path="/var/lib/kubelet/pods/421dad23-2283-4534-b064-250972bc1863/volumes" Mar 11 12:20:50 crc kubenswrapper[4816]: I0311 12:20:50.379750 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 12:20:50 crc kubenswrapper[4816]: I0311 12:20:50.441323 4816 generic.go:334] "Generic (PLEG): container finished" podID="05937283-8ec7-430d-be71-c968e8e97ff1" containerID="e9ad6464f8fd694a00b9feffc15da0986d481cd62391df808bf2e196c5e1ad42" exitCode=0 Mar 11 12:20:50 crc kubenswrapper[4816]: I0311 12:20:50.442403 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"05937283-8ec7-430d-be71-c968e8e97ff1","Type":"ContainerDied","Data":"e9ad6464f8fd694a00b9feffc15da0986d481cd62391df808bf2e196c5e1ad42"} Mar 11 12:20:50 crc kubenswrapper[4816]: I0311 12:20:50.895284 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.070369 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05937283-8ec7-430d-be71-c968e8e97ff1-combined-ca-bundle\") pod \"05937283-8ec7-430d-be71-c968e8e97ff1\" (UID: \"05937283-8ec7-430d-be71-c968e8e97ff1\") " Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.070464 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fdrt\" (UniqueName: \"kubernetes.io/projected/05937283-8ec7-430d-be71-c968e8e97ff1-kube-api-access-7fdrt\") pod \"05937283-8ec7-430d-be71-c968e8e97ff1\" (UID: \"05937283-8ec7-430d-be71-c968e8e97ff1\") " Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.070512 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05937283-8ec7-430d-be71-c968e8e97ff1-config-data\") pod \"05937283-8ec7-430d-be71-c968e8e97ff1\" (UID: \"05937283-8ec7-430d-be71-c968e8e97ff1\") " Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.078131 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05937283-8ec7-430d-be71-c968e8e97ff1-kube-api-access-7fdrt" (OuterVolumeSpecName: "kube-api-access-7fdrt") pod "05937283-8ec7-430d-be71-c968e8e97ff1" (UID: "05937283-8ec7-430d-be71-c968e8e97ff1"). InnerVolumeSpecName "kube-api-access-7fdrt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.118186 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05937283-8ec7-430d-be71-c968e8e97ff1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "05937283-8ec7-430d-be71-c968e8e97ff1" (UID: "05937283-8ec7-430d-be71-c968e8e97ff1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.125155 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05937283-8ec7-430d-be71-c968e8e97ff1-config-data" (OuterVolumeSpecName: "config-data") pod "05937283-8ec7-430d-be71-c968e8e97ff1" (UID: "05937283-8ec7-430d-be71-c968e8e97ff1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.179374 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fdrt\" (UniqueName: \"kubernetes.io/projected/05937283-8ec7-430d-be71-c968e8e97ff1-kube-api-access-7fdrt\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.179431 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05937283-8ec7-430d-be71-c968e8e97ff1-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.179446 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05937283-8ec7-430d-be71-c968e8e97ff1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.458393 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3b8751c6-ef60-400a-b4e3-0042d63c2d83","Type":"ContainerStarted","Data":"65a6f4699baa07bb2587be04129499415c2d2f177b58bb5a282876ee282e965a"} Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.458473 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3b8751c6-ef60-400a-b4e3-0042d63c2d83","Type":"ContainerStarted","Data":"8e786968694e08cb0fddd905eefb0efe274d795cc622741f10ed840a98693097"} Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.458495 4816 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3b8751c6-ef60-400a-b4e3-0042d63c2d83","Type":"ContainerStarted","Data":"3d6f4a92fab1ae4820eecc5176239bbe544418957d5b8b49929c39dc6ee8800c"} Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.464925 4816 generic.go:334] "Generic (PLEG): container finished" podID="78b435e0-53bf-4f8c-aef9-49b170fc9519" containerID="40e1356e9811fbd25787534e33a830de5168253a4c4ac07e43758ab056b21a5a" exitCode=0 Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.465024 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"78b435e0-53bf-4f8c-aef9-49b170fc9519","Type":"ContainerDied","Data":"40e1356e9811fbd25787534e33a830de5168253a4c4ac07e43758ab056b21a5a"} Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.466499 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"05937283-8ec7-430d-be71-c968e8e97ff1","Type":"ContainerDied","Data":"20c9b17fece99c2795c0705a485bb232956e3243ab02757479fd7900fae5c7a8"} Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.466546 4816 scope.go:117] "RemoveContainer" containerID="e9ad6464f8fd694a00b9feffc15da0986d481cd62391df808bf2e196c5e1ad42" Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.466763 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.497297 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.509341 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.509308814 podStartE2EDuration="2.509308814s" podCreationTimestamp="2026-03-11 12:20:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:20:51.490405234 +0000 UTC m=+1338.081669201" watchObservedRunningTime="2026-03-11 12:20:51.509308814 +0000 UTC m=+1338.100572781" Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.543126 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.552886 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.564272 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 11 12:20:51 crc kubenswrapper[4816]: E0311 12:20:51.564871 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05937283-8ec7-430d-be71-c968e8e97ff1" containerName="nova-scheduler-scheduler" Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.564888 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="05937283-8ec7-430d-be71-c968e8e97ff1" containerName="nova-scheduler-scheduler" Mar 11 12:20:51 crc kubenswrapper[4816]: E0311 12:20:51.564935 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78b435e0-53bf-4f8c-aef9-49b170fc9519" containerName="nova-api-api" Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.564941 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="78b435e0-53bf-4f8c-aef9-49b170fc9519" containerName="nova-api-api" Mar 11 12:20:51 crc kubenswrapper[4816]: E0311 12:20:51.564953 4816 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="78b435e0-53bf-4f8c-aef9-49b170fc9519" containerName="nova-api-log" Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.564959 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="78b435e0-53bf-4f8c-aef9-49b170fc9519" containerName="nova-api-log" Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.565164 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="78b435e0-53bf-4f8c-aef9-49b170fc9519" containerName="nova-api-log" Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.565188 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="05937283-8ec7-430d-be71-c968e8e97ff1" containerName="nova-scheduler-scheduler" Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.565207 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="78b435e0-53bf-4f8c-aef9-49b170fc9519" containerName="nova-api-api" Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.566035 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.568420 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.585059 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.593401 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlcbd\" (UniqueName: \"kubernetes.io/projected/78b435e0-53bf-4f8c-aef9-49b170fc9519-kube-api-access-wlcbd\") pod \"78b435e0-53bf-4f8c-aef9-49b170fc9519\" (UID: \"78b435e0-53bf-4f8c-aef9-49b170fc9519\") " Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.593486 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/78b435e0-53bf-4f8c-aef9-49b170fc9519-combined-ca-bundle\") pod \"78b435e0-53bf-4f8c-aef9-49b170fc9519\" (UID: \"78b435e0-53bf-4f8c-aef9-49b170fc9519\") " Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.593565 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78b435e0-53bf-4f8c-aef9-49b170fc9519-config-data\") pod \"78b435e0-53bf-4f8c-aef9-49b170fc9519\" (UID: \"78b435e0-53bf-4f8c-aef9-49b170fc9519\") " Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.593661 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78b435e0-53bf-4f8c-aef9-49b170fc9519-logs\") pod \"78b435e0-53bf-4f8c-aef9-49b170fc9519\" (UID: \"78b435e0-53bf-4f8c-aef9-49b170fc9519\") " Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.598749 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78b435e0-53bf-4f8c-aef9-49b170fc9519-logs" (OuterVolumeSpecName: "logs") pod "78b435e0-53bf-4f8c-aef9-49b170fc9519" (UID: "78b435e0-53bf-4f8c-aef9-49b170fc9519"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.599601 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78b435e0-53bf-4f8c-aef9-49b170fc9519-kube-api-access-wlcbd" (OuterVolumeSpecName: "kube-api-access-wlcbd") pod "78b435e0-53bf-4f8c-aef9-49b170fc9519" (UID: "78b435e0-53bf-4f8c-aef9-49b170fc9519"). InnerVolumeSpecName "kube-api-access-wlcbd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.623813 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78b435e0-53bf-4f8c-aef9-49b170fc9519-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "78b435e0-53bf-4f8c-aef9-49b170fc9519" (UID: "78b435e0-53bf-4f8c-aef9-49b170fc9519"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.623871 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78b435e0-53bf-4f8c-aef9-49b170fc9519-config-data" (OuterVolumeSpecName: "config-data") pod "78b435e0-53bf-4f8c-aef9-49b170fc9519" (UID: "78b435e0-53bf-4f8c-aef9-49b170fc9519"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.699700 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49c3f447-334e-4147-b877-22a0ce6e3345-config-data\") pod \"nova-scheduler-0\" (UID: \"49c3f447-334e-4147-b877-22a0ce6e3345\") " pod="openstack/nova-scheduler-0" Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.700267 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84md9\" (UniqueName: \"kubernetes.io/projected/49c3f447-334e-4147-b877-22a0ce6e3345-kube-api-access-84md9\") pod \"nova-scheduler-0\" (UID: \"49c3f447-334e-4147-b877-22a0ce6e3345\") " pod="openstack/nova-scheduler-0" Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.700379 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49c3f447-334e-4147-b877-22a0ce6e3345-combined-ca-bundle\") pod 
\"nova-scheduler-0\" (UID: \"49c3f447-334e-4147-b877-22a0ce6e3345\") " pod="openstack/nova-scheduler-0" Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.700592 4816 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78b435e0-53bf-4f8c-aef9-49b170fc9519-logs\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.700618 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlcbd\" (UniqueName: \"kubernetes.io/projected/78b435e0-53bf-4f8c-aef9-49b170fc9519-kube-api-access-wlcbd\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.700634 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78b435e0-53bf-4f8c-aef9-49b170fc9519-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.700648 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78b435e0-53bf-4f8c-aef9-49b170fc9519-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.802552 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49c3f447-334e-4147-b877-22a0ce6e3345-config-data\") pod \"nova-scheduler-0\" (UID: \"49c3f447-334e-4147-b877-22a0ce6e3345\") " pod="openstack/nova-scheduler-0" Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.803188 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84md9\" (UniqueName: \"kubernetes.io/projected/49c3f447-334e-4147-b877-22a0ce6e3345-kube-api-access-84md9\") pod \"nova-scheduler-0\" (UID: \"49c3f447-334e-4147-b877-22a0ce6e3345\") " pod="openstack/nova-scheduler-0" Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.803221 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49c3f447-334e-4147-b877-22a0ce6e3345-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"49c3f447-334e-4147-b877-22a0ce6e3345\") " pod="openstack/nova-scheduler-0" Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.808606 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49c3f447-334e-4147-b877-22a0ce6e3345-config-data\") pod \"nova-scheduler-0\" (UID: \"49c3f447-334e-4147-b877-22a0ce6e3345\") " pod="openstack/nova-scheduler-0" Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.809097 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49c3f447-334e-4147-b877-22a0ce6e3345-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"49c3f447-334e-4147-b877-22a0ce6e3345\") " pod="openstack/nova-scheduler-0" Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.830817 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84md9\" (UniqueName: \"kubernetes.io/projected/49c3f447-334e-4147-b877-22a0ce6e3345-kube-api-access-84md9\") pod \"nova-scheduler-0\" (UID: \"49c3f447-334e-4147-b877-22a0ce6e3345\") " pod="openstack/nova-scheduler-0" Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.886540 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 11 12:20:52 crc kubenswrapper[4816]: I0311 12:20:52.145687 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05937283-8ec7-430d-be71-c968e8e97ff1" path="/var/lib/kubelet/pods/05937283-8ec7-430d-be71-c968e8e97ff1/volumes" Mar 11 12:20:52 crc kubenswrapper[4816]: I0311 12:20:52.364144 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 11 12:20:52 crc kubenswrapper[4816]: I0311 12:20:52.488475 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"78b435e0-53bf-4f8c-aef9-49b170fc9519","Type":"ContainerDied","Data":"0ff4ead2a33e6228002ecd5e8665db969fa8aef2732966d20f7187976e9cf4b6"} Mar 11 12:20:52 crc kubenswrapper[4816]: I0311 12:20:52.488555 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 11 12:20:52 crc kubenswrapper[4816]: I0311 12:20:52.488596 4816 scope.go:117] "RemoveContainer" containerID="40e1356e9811fbd25787534e33a830de5168253a4c4ac07e43758ab056b21a5a" Mar 11 12:20:52 crc kubenswrapper[4816]: I0311 12:20:52.498134 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"49c3f447-334e-4147-b877-22a0ce6e3345","Type":"ContainerStarted","Data":"80c23e1f7785724059b85c847854192b3471a718a42ed80849445c1edfb1f7c4"} Mar 11 12:20:52 crc kubenswrapper[4816]: I0311 12:20:52.532100 4816 scope.go:117] "RemoveContainer" containerID="03715e3af42eb6ce517cc3fbf23a4e260b58184bdafa92056416df870db5e907" Mar 11 12:20:52 crc kubenswrapper[4816]: I0311 12:20:52.532854 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 11 12:20:52 crc kubenswrapper[4816]: I0311 12:20:52.550009 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 11 12:20:52 crc kubenswrapper[4816]: I0311 12:20:52.555964 4816 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-api-0"] Mar 11 12:20:52 crc kubenswrapper[4816]: I0311 12:20:52.557845 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 11 12:20:52 crc kubenswrapper[4816]: I0311 12:20:52.561821 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 11 12:20:52 crc kubenswrapper[4816]: I0311 12:20:52.585993 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 11 12:20:52 crc kubenswrapper[4816]: I0311 12:20:52.729555 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b44498c-88b3-42e4-b8cd-322579c29a3e-config-data\") pod \"nova-api-0\" (UID: \"8b44498c-88b3-42e4-b8cd-322579c29a3e\") " pod="openstack/nova-api-0" Mar 11 12:20:52 crc kubenswrapper[4816]: I0311 12:20:52.729649 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42kkf\" (UniqueName: \"kubernetes.io/projected/8b44498c-88b3-42e4-b8cd-322579c29a3e-kube-api-access-42kkf\") pod \"nova-api-0\" (UID: \"8b44498c-88b3-42e4-b8cd-322579c29a3e\") " pod="openstack/nova-api-0" Mar 11 12:20:52 crc kubenswrapper[4816]: I0311 12:20:52.729674 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b44498c-88b3-42e4-b8cd-322579c29a3e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8b44498c-88b3-42e4-b8cd-322579c29a3e\") " pod="openstack/nova-api-0" Mar 11 12:20:52 crc kubenswrapper[4816]: I0311 12:20:52.729833 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b44498c-88b3-42e4-b8cd-322579c29a3e-logs\") pod \"nova-api-0\" (UID: \"8b44498c-88b3-42e4-b8cd-322579c29a3e\") " pod="openstack/nova-api-0" Mar 11 
12:20:52 crc kubenswrapper[4816]: I0311 12:20:52.832031 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b44498c-88b3-42e4-b8cd-322579c29a3e-logs\") pod \"nova-api-0\" (UID: \"8b44498c-88b3-42e4-b8cd-322579c29a3e\") " pod="openstack/nova-api-0" Mar 11 12:20:52 crc kubenswrapper[4816]: I0311 12:20:52.832459 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b44498c-88b3-42e4-b8cd-322579c29a3e-config-data\") pod \"nova-api-0\" (UID: \"8b44498c-88b3-42e4-b8cd-322579c29a3e\") " pod="openstack/nova-api-0" Mar 11 12:20:52 crc kubenswrapper[4816]: I0311 12:20:52.832803 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b44498c-88b3-42e4-b8cd-322579c29a3e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8b44498c-88b3-42e4-b8cd-322579c29a3e\") " pod="openstack/nova-api-0" Mar 11 12:20:52 crc kubenswrapper[4816]: I0311 12:20:52.832883 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42kkf\" (UniqueName: \"kubernetes.io/projected/8b44498c-88b3-42e4-b8cd-322579c29a3e-kube-api-access-42kkf\") pod \"nova-api-0\" (UID: \"8b44498c-88b3-42e4-b8cd-322579c29a3e\") " pod="openstack/nova-api-0" Mar 11 12:20:52 crc kubenswrapper[4816]: I0311 12:20:52.833067 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b44498c-88b3-42e4-b8cd-322579c29a3e-logs\") pod \"nova-api-0\" (UID: \"8b44498c-88b3-42e4-b8cd-322579c29a3e\") " pod="openstack/nova-api-0" Mar 11 12:20:52 crc kubenswrapper[4816]: I0311 12:20:52.841138 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b44498c-88b3-42e4-b8cd-322579c29a3e-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"8b44498c-88b3-42e4-b8cd-322579c29a3e\") " pod="openstack/nova-api-0" Mar 11 12:20:52 crc kubenswrapper[4816]: I0311 12:20:52.844962 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b44498c-88b3-42e4-b8cd-322579c29a3e-config-data\") pod \"nova-api-0\" (UID: \"8b44498c-88b3-42e4-b8cd-322579c29a3e\") " pod="openstack/nova-api-0" Mar 11 12:20:52 crc kubenswrapper[4816]: I0311 12:20:52.855683 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42kkf\" (UniqueName: \"kubernetes.io/projected/8b44498c-88b3-42e4-b8cd-322579c29a3e-kube-api-access-42kkf\") pod \"nova-api-0\" (UID: \"8b44498c-88b3-42e4-b8cd-322579c29a3e\") " pod="openstack/nova-api-0" Mar 11 12:20:52 crc kubenswrapper[4816]: I0311 12:20:52.918445 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 11 12:20:53 crc kubenswrapper[4816]: I0311 12:20:53.420927 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 11 12:20:53 crc kubenswrapper[4816]: I0311 12:20:53.513108 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"49c3f447-334e-4147-b877-22a0ce6e3345","Type":"ContainerStarted","Data":"4ce5b26a7642dbb3c4b2d4c21f23040b0afe51a33212a23837a87602f659ac7d"} Mar 11 12:20:53 crc kubenswrapper[4816]: I0311 12:20:53.521095 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8b44498c-88b3-42e4-b8cd-322579c29a3e","Type":"ContainerStarted","Data":"3161af8bf1555f75f7e3fe8c5b6c7028f30e608b5088c5375d09c6a61566d4c9"} Mar 11 12:20:53 crc kubenswrapper[4816]: I0311 12:20:53.542720 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.542692005 podStartE2EDuration="2.542692005s" podCreationTimestamp="2026-03-11 12:20:51 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:20:53.538684181 +0000 UTC m=+1340.129948148" watchObservedRunningTime="2026-03-11 12:20:53.542692005 +0000 UTC m=+1340.133955972" Mar 11 12:20:53 crc kubenswrapper[4816]: I0311 12:20:53.835223 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 11 12:20:53 crc kubenswrapper[4816]: I0311 12:20:53.835993 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="8e9e4e8b-b60c-4c37-974a-8bdc1b243135" containerName="kube-state-metrics" containerID="cri-o://679abb8cec4559fafe708b16a4cb668342ecfb7db87736a8f89c0b2ddfbcfb1b" gracePeriod=30 Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.146566 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78b435e0-53bf-4f8c-aef9-49b170fc9519" path="/var/lib/kubelet/pods/78b435e0-53bf-4f8c-aef9-49b170fc9519/volumes" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.345608 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.481133 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbflg\" (UniqueName: \"kubernetes.io/projected/8e9e4e8b-b60c-4c37-974a-8bdc1b243135-kube-api-access-cbflg\") pod \"8e9e4e8b-b60c-4c37-974a-8bdc1b243135\" (UID: \"8e9e4e8b-b60c-4c37-974a-8bdc1b243135\") " Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.488697 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e9e4e8b-b60c-4c37-974a-8bdc1b243135-kube-api-access-cbflg" (OuterVolumeSpecName: "kube-api-access-cbflg") pod "8e9e4e8b-b60c-4c37-974a-8bdc1b243135" (UID: "8e9e4e8b-b60c-4c37-974a-8bdc1b243135"). InnerVolumeSpecName "kube-api-access-cbflg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.586209 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbflg\" (UniqueName: \"kubernetes.io/projected/8e9e4e8b-b60c-4c37-974a-8bdc1b243135-kube-api-access-cbflg\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.588995 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8b44498c-88b3-42e4-b8cd-322579c29a3e","Type":"ContainerStarted","Data":"e810fb5b31af6c475fbc24ea8c5aeb1d4e563c63735517ab9297f7db17e0ca5a"} Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.589051 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8b44498c-88b3-42e4-b8cd-322579c29a3e","Type":"ContainerStarted","Data":"ebc1320569afb95200b6afc79a82f2129b48fb5a36a78a5a136bc96434e738f9"} Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.596986 4816 generic.go:334] "Generic (PLEG): container finished" podID="8e9e4e8b-b60c-4c37-974a-8bdc1b243135" containerID="679abb8cec4559fafe708b16a4cb668342ecfb7db87736a8f89c0b2ddfbcfb1b" exitCode=2 Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.597103 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.597154 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8e9e4e8b-b60c-4c37-974a-8bdc1b243135","Type":"ContainerDied","Data":"679abb8cec4559fafe708b16a4cb668342ecfb7db87736a8f89c0b2ddfbcfb1b"} Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.597187 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8e9e4e8b-b60c-4c37-974a-8bdc1b243135","Type":"ContainerDied","Data":"e914685ae7eb058c653bc79edb98cb710a39f5ce6911740300b8ce8933b04af8"} Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.597208 4816 scope.go:117] "RemoveContainer" containerID="679abb8cec4559fafe708b16a4cb668342ecfb7db87736a8f89c0b2ddfbcfb1b" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.616170 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.616149469 podStartE2EDuration="2.616149469s" podCreationTimestamp="2026-03-11 12:20:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:20:54.608922742 +0000 UTC m=+1341.200186709" watchObservedRunningTime="2026-03-11 12:20:54.616149469 +0000 UTC m=+1341.207413436" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.642682 4816 scope.go:117] "RemoveContainer" containerID="679abb8cec4559fafe708b16a4cb668342ecfb7db87736a8f89c0b2ddfbcfb1b" Mar 11 12:20:54 crc kubenswrapper[4816]: E0311 12:20:54.643177 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"679abb8cec4559fafe708b16a4cb668342ecfb7db87736a8f89c0b2ddfbcfb1b\": container with ID starting with 679abb8cec4559fafe708b16a4cb668342ecfb7db87736a8f89c0b2ddfbcfb1b not found: ID does not exist" 
containerID="679abb8cec4559fafe708b16a4cb668342ecfb7db87736a8f89c0b2ddfbcfb1b" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.643221 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"679abb8cec4559fafe708b16a4cb668342ecfb7db87736a8f89c0b2ddfbcfb1b"} err="failed to get container status \"679abb8cec4559fafe708b16a4cb668342ecfb7db87736a8f89c0b2ddfbcfb1b\": rpc error: code = NotFound desc = could not find container \"679abb8cec4559fafe708b16a4cb668342ecfb7db87736a8f89c0b2ddfbcfb1b\": container with ID starting with 679abb8cec4559fafe708b16a4cb668342ecfb7db87736a8f89c0b2ddfbcfb1b not found: ID does not exist" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.651890 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.663456 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.678845 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 11 12:20:54 crc kubenswrapper[4816]: E0311 12:20:54.679593 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e9e4e8b-b60c-4c37-974a-8bdc1b243135" containerName="kube-state-metrics" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.679623 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e9e4e8b-b60c-4c37-974a-8bdc1b243135" containerName="kube-state-metrics" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.680005 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e9e4e8b-b60c-4c37-974a-8bdc1b243135" containerName="kube-state-metrics" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.681097 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.684141 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.684232 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.692357 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dlr4\" (UniqueName: \"kubernetes.io/projected/32dcc96b-186a-444d-bef3-4c5f117ee652-kube-api-access-6dlr4\") pod \"kube-state-metrics-0\" (UID: \"32dcc96b-186a-444d-bef3-4c5f117ee652\") " pod="openstack/kube-state-metrics-0" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.692514 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/32dcc96b-186a-444d-bef3-4c5f117ee652-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"32dcc96b-186a-444d-bef3-4c5f117ee652\") " pod="openstack/kube-state-metrics-0" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.692685 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/32dcc96b-186a-444d-bef3-4c5f117ee652-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"32dcc96b-186a-444d-bef3-4c5f117ee652\") " pod="openstack/kube-state-metrics-0" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.692920 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32dcc96b-186a-444d-bef3-4c5f117ee652-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: 
\"32dcc96b-186a-444d-bef3-4c5f117ee652\") " pod="openstack/kube-state-metrics-0" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.705663 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.795678 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dlr4\" (UniqueName: \"kubernetes.io/projected/32dcc96b-186a-444d-bef3-4c5f117ee652-kube-api-access-6dlr4\") pod \"kube-state-metrics-0\" (UID: \"32dcc96b-186a-444d-bef3-4c5f117ee652\") " pod="openstack/kube-state-metrics-0" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.795776 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/32dcc96b-186a-444d-bef3-4c5f117ee652-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"32dcc96b-186a-444d-bef3-4c5f117ee652\") " pod="openstack/kube-state-metrics-0" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.795845 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/32dcc96b-186a-444d-bef3-4c5f117ee652-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"32dcc96b-186a-444d-bef3-4c5f117ee652\") " pod="openstack/kube-state-metrics-0" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.795951 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32dcc96b-186a-444d-bef3-4c5f117ee652-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"32dcc96b-186a-444d-bef3-4c5f117ee652\") " pod="openstack/kube-state-metrics-0" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.799014 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 11 12:20:54 crc 
kubenswrapper[4816]: I0311 12:20:54.799416 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.801861 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/32dcc96b-186a-444d-bef3-4c5f117ee652-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"32dcc96b-186a-444d-bef3-4c5f117ee652\") " pod="openstack/kube-state-metrics-0" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.802402 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/32dcc96b-186a-444d-bef3-4c5f117ee652-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"32dcc96b-186a-444d-bef3-4c5f117ee652\") " pod="openstack/kube-state-metrics-0" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.814086 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32dcc96b-186a-444d-bef3-4c5f117ee652-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"32dcc96b-186a-444d-bef3-4c5f117ee652\") " pod="openstack/kube-state-metrics-0" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.821689 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dlr4\" (UniqueName: \"kubernetes.io/projected/32dcc96b-186a-444d-bef3-4c5f117ee652-kube-api-access-6dlr4\") pod \"kube-state-metrics-0\" (UID: \"32dcc96b-186a-444d-bef3-4c5f117ee652\") " pod="openstack/kube-state-metrics-0" Mar 11 12:20:55 crc kubenswrapper[4816]: I0311 12:20:55.008568 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 11 12:20:55 crc kubenswrapper[4816]: I0311 12:20:55.530749 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 11 12:20:55 crc kubenswrapper[4816]: I0311 12:20:55.607361 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"32dcc96b-186a-444d-bef3-4c5f117ee652","Type":"ContainerStarted","Data":"1e343e65b4d8cc4645e88fc1c1a55d93ec648ea21d55e4018feab7481fc909e7"} Mar 11 12:20:55 crc kubenswrapper[4816]: I0311 12:20:55.843121 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 11 12:20:56 crc kubenswrapper[4816]: I0311 12:20:56.046618 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:20:56 crc kubenswrapper[4816]: I0311 12:20:56.046962 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1a9b124c-68d8-44e9-9381-fa448155ef23" containerName="ceilometer-central-agent" containerID="cri-o://3b2e12f9db266056fff6910a7cbbd6b8e5b64ba5455f9ba9db2932c38e6dd28b" gracePeriod=30 Mar 11 12:20:56 crc kubenswrapper[4816]: I0311 12:20:56.048744 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1a9b124c-68d8-44e9-9381-fa448155ef23" containerName="proxy-httpd" containerID="cri-o://46378e5e41bccbfb3a48252c8c7374cb11b3d1fe0796984239fa59f5ef21eb1c" gracePeriod=30 Mar 11 12:20:56 crc kubenswrapper[4816]: I0311 12:20:56.051171 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1a9b124c-68d8-44e9-9381-fa448155ef23" containerName="ceilometer-notification-agent" containerID="cri-o://cf672d6f4e16a27f8ebd02361ddc8dc18ed8b08979ac551c707e61af02836a74" gracePeriod=30 Mar 11 12:20:56 crc kubenswrapper[4816]: I0311 12:20:56.051969 4816 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1a9b124c-68d8-44e9-9381-fa448155ef23" containerName="sg-core" containerID="cri-o://91c0f336c13b096a6e37338c7aacd50b6dc2e9bbedfd8f453b0c731b71198818" gracePeriod=30 Mar 11 12:20:56 crc kubenswrapper[4816]: I0311 12:20:56.143565 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e9e4e8b-b60c-4c37-974a-8bdc1b243135" path="/var/lib/kubelet/pods/8e9e4e8b-b60c-4c37-974a-8bdc1b243135/volumes" Mar 11 12:20:56 crc kubenswrapper[4816]: I0311 12:20:56.624394 4816 generic.go:334] "Generic (PLEG): container finished" podID="1a9b124c-68d8-44e9-9381-fa448155ef23" containerID="46378e5e41bccbfb3a48252c8c7374cb11b3d1fe0796984239fa59f5ef21eb1c" exitCode=0 Mar 11 12:20:56 crc kubenswrapper[4816]: I0311 12:20:56.624434 4816 generic.go:334] "Generic (PLEG): container finished" podID="1a9b124c-68d8-44e9-9381-fa448155ef23" containerID="91c0f336c13b096a6e37338c7aacd50b6dc2e9bbedfd8f453b0c731b71198818" exitCode=2 Mar 11 12:20:56 crc kubenswrapper[4816]: I0311 12:20:56.624444 4816 generic.go:334] "Generic (PLEG): container finished" podID="1a9b124c-68d8-44e9-9381-fa448155ef23" containerID="3b2e12f9db266056fff6910a7cbbd6b8e5b64ba5455f9ba9db2932c38e6dd28b" exitCode=0 Mar 11 12:20:56 crc kubenswrapper[4816]: I0311 12:20:56.624492 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a9b124c-68d8-44e9-9381-fa448155ef23","Type":"ContainerDied","Data":"46378e5e41bccbfb3a48252c8c7374cb11b3d1fe0796984239fa59f5ef21eb1c"} Mar 11 12:20:56 crc kubenswrapper[4816]: I0311 12:20:56.624522 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a9b124c-68d8-44e9-9381-fa448155ef23","Type":"ContainerDied","Data":"91c0f336c13b096a6e37338c7aacd50b6dc2e9bbedfd8f453b0c731b71198818"} Mar 11 12:20:56 crc kubenswrapper[4816]: I0311 12:20:56.624532 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"1a9b124c-68d8-44e9-9381-fa448155ef23","Type":"ContainerDied","Data":"3b2e12f9db266056fff6910a7cbbd6b8e5b64ba5455f9ba9db2932c38e6dd28b"} Mar 11 12:20:56 crc kubenswrapper[4816]: I0311 12:20:56.626637 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"32dcc96b-186a-444d-bef3-4c5f117ee652","Type":"ContainerStarted","Data":"87aa2b1e91cb5a822ed7cf28348c0737eb6cfc59a0a44ee9905ee11d4719f35c"} Mar 11 12:20:56 crc kubenswrapper[4816]: I0311 12:20:56.627504 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 11 12:20:56 crc kubenswrapper[4816]: I0311 12:20:56.664163 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.24142379 podStartE2EDuration="2.664132947s" podCreationTimestamp="2026-03-11 12:20:54 +0000 UTC" firstStartedPulling="2026-03-11 12:20:55.538625988 +0000 UTC m=+1342.129889945" lastFinishedPulling="2026-03-11 12:20:55.961335135 +0000 UTC m=+1342.552599102" observedRunningTime="2026-03-11 12:20:56.653500903 +0000 UTC m=+1343.244764880" watchObservedRunningTime="2026-03-11 12:20:56.664132947 +0000 UTC m=+1343.255396914" Mar 11 12:20:56 crc kubenswrapper[4816]: I0311 12:20:56.886735 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.595411 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.657563 4816 generic.go:334] "Generic (PLEG): container finished" podID="1a9b124c-68d8-44e9-9381-fa448155ef23" containerID="cf672d6f4e16a27f8ebd02361ddc8dc18ed8b08979ac551c707e61af02836a74" exitCode=0 Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.657635 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a9b124c-68d8-44e9-9381-fa448155ef23","Type":"ContainerDied","Data":"cf672d6f4e16a27f8ebd02361ddc8dc18ed8b08979ac551c707e61af02836a74"} Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.657679 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.657699 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a9b124c-68d8-44e9-9381-fa448155ef23","Type":"ContainerDied","Data":"38b868cc185bf2881b9763f9f27568b608cb3091bc38e885e64b2566d5c8d41e"} Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.657729 4816 scope.go:117] "RemoveContainer" containerID="46378e5e41bccbfb3a48252c8c7374cb11b3d1fe0796984239fa59f5ef21eb1c" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.682593 4816 scope.go:117] "RemoveContainer" containerID="91c0f336c13b096a6e37338c7aacd50b6dc2e9bbedfd8f453b0c731b71198818" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.694802 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a9b124c-68d8-44e9-9381-fa448155ef23-combined-ca-bundle\") pod \"1a9b124c-68d8-44e9-9381-fa448155ef23\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.694843 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8dwr\" (UniqueName: 
\"kubernetes.io/projected/1a9b124c-68d8-44e9-9381-fa448155ef23-kube-api-access-d8dwr\") pod \"1a9b124c-68d8-44e9-9381-fa448155ef23\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.694905 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a9b124c-68d8-44e9-9381-fa448155ef23-sg-core-conf-yaml\") pod \"1a9b124c-68d8-44e9-9381-fa448155ef23\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.695915 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a9b124c-68d8-44e9-9381-fa448155ef23-scripts\") pod \"1a9b124c-68d8-44e9-9381-fa448155ef23\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.696098 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a9b124c-68d8-44e9-9381-fa448155ef23-log-httpd\") pod \"1a9b124c-68d8-44e9-9381-fa448155ef23\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.696754 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a9b124c-68d8-44e9-9381-fa448155ef23-run-httpd\") pod \"1a9b124c-68d8-44e9-9381-fa448155ef23\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.696598 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a9b124c-68d8-44e9-9381-fa448155ef23-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1a9b124c-68d8-44e9-9381-fa448155ef23" (UID: "1a9b124c-68d8-44e9-9381-fa448155ef23"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.696837 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a9b124c-68d8-44e9-9381-fa448155ef23-config-data\") pod \"1a9b124c-68d8-44e9-9381-fa448155ef23\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.697188 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a9b124c-68d8-44e9-9381-fa448155ef23-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1a9b124c-68d8-44e9-9381-fa448155ef23" (UID: "1a9b124c-68d8-44e9-9381-fa448155ef23"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.698046 4816 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a9b124c-68d8-44e9-9381-fa448155ef23-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.698066 4816 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a9b124c-68d8-44e9-9381-fa448155ef23-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.715575 4816 scope.go:117] "RemoveContainer" containerID="cf672d6f4e16a27f8ebd02361ddc8dc18ed8b08979ac551c707e61af02836a74" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.715594 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a9b124c-68d8-44e9-9381-fa448155ef23-scripts" (OuterVolumeSpecName: "scripts") pod "1a9b124c-68d8-44e9-9381-fa448155ef23" (UID: "1a9b124c-68d8-44e9-9381-fa448155ef23"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.715714 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a9b124c-68d8-44e9-9381-fa448155ef23-kube-api-access-d8dwr" (OuterVolumeSpecName: "kube-api-access-d8dwr") pod "1a9b124c-68d8-44e9-9381-fa448155ef23" (UID: "1a9b124c-68d8-44e9-9381-fa448155ef23"). InnerVolumeSpecName "kube-api-access-d8dwr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.750739 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a9b124c-68d8-44e9-9381-fa448155ef23-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1a9b124c-68d8-44e9-9381-fa448155ef23" (UID: "1a9b124c-68d8-44e9-9381-fa448155ef23"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.800821 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8dwr\" (UniqueName: \"kubernetes.io/projected/1a9b124c-68d8-44e9-9381-fa448155ef23-kube-api-access-d8dwr\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.800882 4816 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a9b124c-68d8-44e9-9381-fa448155ef23-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.800902 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a9b124c-68d8-44e9-9381-fa448155ef23-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.806651 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a9b124c-68d8-44e9-9381-fa448155ef23-combined-ca-bundle" 
(OuterVolumeSpecName: "combined-ca-bundle") pod "1a9b124c-68d8-44e9-9381-fa448155ef23" (UID: "1a9b124c-68d8-44e9-9381-fa448155ef23"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.835348 4816 scope.go:117] "RemoveContainer" containerID="3b2e12f9db266056fff6910a7cbbd6b8e5b64ba5455f9ba9db2932c38e6dd28b" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.841527 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a9b124c-68d8-44e9-9381-fa448155ef23-config-data" (OuterVolumeSpecName: "config-data") pod "1a9b124c-68d8-44e9-9381-fa448155ef23" (UID: "1a9b124c-68d8-44e9-9381-fa448155ef23"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.867979 4816 scope.go:117] "RemoveContainer" containerID="46378e5e41bccbfb3a48252c8c7374cb11b3d1fe0796984239fa59f5ef21eb1c" Mar 11 12:20:58 crc kubenswrapper[4816]: E0311 12:20:58.868821 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46378e5e41bccbfb3a48252c8c7374cb11b3d1fe0796984239fa59f5ef21eb1c\": container with ID starting with 46378e5e41bccbfb3a48252c8c7374cb11b3d1fe0796984239fa59f5ef21eb1c not found: ID does not exist" containerID="46378e5e41bccbfb3a48252c8c7374cb11b3d1fe0796984239fa59f5ef21eb1c" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.868855 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46378e5e41bccbfb3a48252c8c7374cb11b3d1fe0796984239fa59f5ef21eb1c"} err="failed to get container status \"46378e5e41bccbfb3a48252c8c7374cb11b3d1fe0796984239fa59f5ef21eb1c\": rpc error: code = NotFound desc = could not find container \"46378e5e41bccbfb3a48252c8c7374cb11b3d1fe0796984239fa59f5ef21eb1c\": container with ID starting with 
46378e5e41bccbfb3a48252c8c7374cb11b3d1fe0796984239fa59f5ef21eb1c not found: ID does not exist" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.868878 4816 scope.go:117] "RemoveContainer" containerID="91c0f336c13b096a6e37338c7aacd50b6dc2e9bbedfd8f453b0c731b71198818" Mar 11 12:20:58 crc kubenswrapper[4816]: E0311 12:20:58.869368 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91c0f336c13b096a6e37338c7aacd50b6dc2e9bbedfd8f453b0c731b71198818\": container with ID starting with 91c0f336c13b096a6e37338c7aacd50b6dc2e9bbedfd8f453b0c731b71198818 not found: ID does not exist" containerID="91c0f336c13b096a6e37338c7aacd50b6dc2e9bbedfd8f453b0c731b71198818" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.869424 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91c0f336c13b096a6e37338c7aacd50b6dc2e9bbedfd8f453b0c731b71198818"} err="failed to get container status \"91c0f336c13b096a6e37338c7aacd50b6dc2e9bbedfd8f453b0c731b71198818\": rpc error: code = NotFound desc = could not find container \"91c0f336c13b096a6e37338c7aacd50b6dc2e9bbedfd8f453b0c731b71198818\": container with ID starting with 91c0f336c13b096a6e37338c7aacd50b6dc2e9bbedfd8f453b0c731b71198818 not found: ID does not exist" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.869470 4816 scope.go:117] "RemoveContainer" containerID="cf672d6f4e16a27f8ebd02361ddc8dc18ed8b08979ac551c707e61af02836a74" Mar 11 12:20:58 crc kubenswrapper[4816]: E0311 12:20:58.869831 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf672d6f4e16a27f8ebd02361ddc8dc18ed8b08979ac551c707e61af02836a74\": container with ID starting with cf672d6f4e16a27f8ebd02361ddc8dc18ed8b08979ac551c707e61af02836a74 not found: ID does not exist" containerID="cf672d6f4e16a27f8ebd02361ddc8dc18ed8b08979ac551c707e61af02836a74" Mar 11 12:20:58 crc 
kubenswrapper[4816]: I0311 12:20:58.869877 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf672d6f4e16a27f8ebd02361ddc8dc18ed8b08979ac551c707e61af02836a74"} err="failed to get container status \"cf672d6f4e16a27f8ebd02361ddc8dc18ed8b08979ac551c707e61af02836a74\": rpc error: code = NotFound desc = could not find container \"cf672d6f4e16a27f8ebd02361ddc8dc18ed8b08979ac551c707e61af02836a74\": container with ID starting with cf672d6f4e16a27f8ebd02361ddc8dc18ed8b08979ac551c707e61af02836a74 not found: ID does not exist" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.869897 4816 scope.go:117] "RemoveContainer" containerID="3b2e12f9db266056fff6910a7cbbd6b8e5b64ba5455f9ba9db2932c38e6dd28b" Mar 11 12:20:58 crc kubenswrapper[4816]: E0311 12:20:58.870145 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b2e12f9db266056fff6910a7cbbd6b8e5b64ba5455f9ba9db2932c38e6dd28b\": container with ID starting with 3b2e12f9db266056fff6910a7cbbd6b8e5b64ba5455f9ba9db2932c38e6dd28b not found: ID does not exist" containerID="3b2e12f9db266056fff6910a7cbbd6b8e5b64ba5455f9ba9db2932c38e6dd28b" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.870176 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b2e12f9db266056fff6910a7cbbd6b8e5b64ba5455f9ba9db2932c38e6dd28b"} err="failed to get container status \"3b2e12f9db266056fff6910a7cbbd6b8e5b64ba5455f9ba9db2932c38e6dd28b\": rpc error: code = NotFound desc = could not find container \"3b2e12f9db266056fff6910a7cbbd6b8e5b64ba5455f9ba9db2932c38e6dd28b\": container with ID starting with 3b2e12f9db266056fff6910a7cbbd6b8e5b64ba5455f9ba9db2932c38e6dd28b not found: ID does not exist" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.902986 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1a9b124c-68d8-44e9-9381-fa448155ef23-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.903025 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a9b124c-68d8-44e9-9381-fa448155ef23-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.999785 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.019715 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.047894 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:20:59 crc kubenswrapper[4816]: E0311 12:20:59.048849 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a9b124c-68d8-44e9-9381-fa448155ef23" containerName="ceilometer-central-agent" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.048870 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a9b124c-68d8-44e9-9381-fa448155ef23" containerName="ceilometer-central-agent" Mar 11 12:20:59 crc kubenswrapper[4816]: E0311 12:20:59.048892 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a9b124c-68d8-44e9-9381-fa448155ef23" containerName="ceilometer-notification-agent" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.048901 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a9b124c-68d8-44e9-9381-fa448155ef23" containerName="ceilometer-notification-agent" Mar 11 12:20:59 crc kubenswrapper[4816]: E0311 12:20:59.048915 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a9b124c-68d8-44e9-9381-fa448155ef23" containerName="sg-core" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.048922 4816 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1a9b124c-68d8-44e9-9381-fa448155ef23" containerName="sg-core" Mar 11 12:20:59 crc kubenswrapper[4816]: E0311 12:20:59.048977 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a9b124c-68d8-44e9-9381-fa448155ef23" containerName="proxy-httpd" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.048986 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a9b124c-68d8-44e9-9381-fa448155ef23" containerName="proxy-httpd" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.049175 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a9b124c-68d8-44e9-9381-fa448155ef23" containerName="ceilometer-notification-agent" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.049197 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a9b124c-68d8-44e9-9381-fa448155ef23" containerName="proxy-httpd" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.049257 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a9b124c-68d8-44e9-9381-fa448155ef23" containerName="sg-core" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.049272 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a9b124c-68d8-44e9-9381-fa448155ef23" containerName="ceilometer-central-agent" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.054530 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.060101 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.060399 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.060533 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.075086 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.106982 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " pod="openstack/ceilometer-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.107326 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/940c2849-ce30-473c-9a55-b4fc35309bb7-log-httpd\") pod \"ceilometer-0\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " pod="openstack/ceilometer-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.107547 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/940c2849-ce30-473c-9a55-b4fc35309bb7-run-httpd\") pod \"ceilometer-0\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " pod="openstack/ceilometer-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.107642 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-8x2sh\" (UniqueName: \"kubernetes.io/projected/940c2849-ce30-473c-9a55-b4fc35309bb7-kube-api-access-8x2sh\") pod \"ceilometer-0\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " pod="openstack/ceilometer-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.107714 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-config-data\") pod \"ceilometer-0\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " pod="openstack/ceilometer-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.107822 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " pod="openstack/ceilometer-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.107915 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " pod="openstack/ceilometer-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.108012 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-scripts\") pod \"ceilometer-0\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " pod="openstack/ceilometer-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.209281 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-ceilometer-tls-certs\") pod \"ceilometer-0\" 
(UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " pod="openstack/ceilometer-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.209660 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/940c2849-ce30-473c-9a55-b4fc35309bb7-log-httpd\") pod \"ceilometer-0\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " pod="openstack/ceilometer-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.209788 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/940c2849-ce30-473c-9a55-b4fc35309bb7-run-httpd\") pod \"ceilometer-0\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " pod="openstack/ceilometer-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.209906 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x2sh\" (UniqueName: \"kubernetes.io/projected/940c2849-ce30-473c-9a55-b4fc35309bb7-kube-api-access-8x2sh\") pod \"ceilometer-0\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " pod="openstack/ceilometer-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.210023 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-config-data\") pod \"ceilometer-0\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " pod="openstack/ceilometer-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.210172 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " pod="openstack/ceilometer-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.210413 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/940c2849-ce30-473c-9a55-b4fc35309bb7-run-httpd\") pod \"ceilometer-0\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " pod="openstack/ceilometer-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.210722 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/940c2849-ce30-473c-9a55-b4fc35309bb7-log-httpd\") pod \"ceilometer-0\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " pod="openstack/ceilometer-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.210990 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " pod="openstack/ceilometer-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.211189 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-scripts\") pod \"ceilometer-0\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " pod="openstack/ceilometer-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.215042 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " pod="openstack/ceilometer-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.215579 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " pod="openstack/ceilometer-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 
12:20:59.216043 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " pod="openstack/ceilometer-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.216465 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-config-data\") pod \"ceilometer-0\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " pod="openstack/ceilometer-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.221161 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-scripts\") pod \"ceilometer-0\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " pod="openstack/ceilometer-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.239135 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x2sh\" (UniqueName: \"kubernetes.io/projected/940c2849-ce30-473c-9a55-b4fc35309bb7-kube-api-access-8x2sh\") pod \"ceilometer-0\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " pod="openstack/ceilometer-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.378829 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.799662 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.800118 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.859324 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:21:00 crc kubenswrapper[4816]: I0311 12:21:00.145845 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a9b124c-68d8-44e9-9381-fa448155ef23" path="/var/lib/kubelet/pods/1a9b124c-68d8-44e9-9381-fa448155ef23/volumes" Mar 11 12:21:00 crc kubenswrapper[4816]: I0311 12:21:00.695879 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"940c2849-ce30-473c-9a55-b4fc35309bb7","Type":"ContainerStarted","Data":"f34f311dd5e09eaf0d55b74ce05a1c2288151feba906d4592da42d922bbba2ff"} Mar 11 12:21:00 crc kubenswrapper[4816]: I0311 12:21:00.696377 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"940c2849-ce30-473c-9a55-b4fc35309bb7","Type":"ContainerStarted","Data":"27271a41ca8a8c294fc6aef8e91d6f6174e9af95c25d74da01a3a36774d13fba"} Mar 11 12:21:00 crc kubenswrapper[4816]: I0311 12:21:00.812479 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3b8751c6-ef60-400a-b4e3-0042d63c2d83" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 11 12:21:00 crc kubenswrapper[4816]: I0311 12:21:00.812550 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3b8751c6-ef60-400a-b4e3-0042d63c2d83" containerName="nova-metadata-log" 
probeResult="failure" output="Get \"https://10.217.0.197:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 11 12:21:01 crc kubenswrapper[4816]: I0311 12:21:01.713726 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"940c2849-ce30-473c-9a55-b4fc35309bb7","Type":"ContainerStarted","Data":"5e35557f09b19997180c50735b550f53beb23be40ddd5bfc3e12122521653c8b"} Mar 11 12:21:01 crc kubenswrapper[4816]: I0311 12:21:01.890641 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 11 12:21:01 crc kubenswrapper[4816]: I0311 12:21:01.925565 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 11 12:21:02 crc kubenswrapper[4816]: I0311 12:21:02.728343 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"940c2849-ce30-473c-9a55-b4fc35309bb7","Type":"ContainerStarted","Data":"f66037d7df88f58d61e86637f837290145123e6de95ebca3b16d8bb882106aec"} Mar 11 12:21:02 crc kubenswrapper[4816]: I0311 12:21:02.774228 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 11 12:21:02 crc kubenswrapper[4816]: I0311 12:21:02.918779 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 11 12:21:02 crc kubenswrapper[4816]: I0311 12:21:02.918843 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 11 12:21:04 crc kubenswrapper[4816]: I0311 12:21:04.001581 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8b44498c-88b3-42e4-b8cd-322579c29a3e" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.199:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 12:21:04 crc kubenswrapper[4816]: I0311 
12:21:04.001601 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8b44498c-88b3-42e4-b8cd-322579c29a3e" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.199:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 12:21:04 crc kubenswrapper[4816]: I0311 12:21:04.751228 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"940c2849-ce30-473c-9a55-b4fc35309bb7","Type":"ContainerStarted","Data":"692a59aab8ed0fb0929a6f8bf9b1c8bc207d5d1a602944a57acd9df588a34763"} Mar 11 12:21:04 crc kubenswrapper[4816]: I0311 12:21:04.753213 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 11 12:21:04 crc kubenswrapper[4816]: I0311 12:21:04.780241 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.752034614 podStartE2EDuration="5.780213425s" podCreationTimestamp="2026-03-11 12:20:59 +0000 UTC" firstStartedPulling="2026-03-11 12:20:59.858585535 +0000 UTC m=+1346.449849502" lastFinishedPulling="2026-03-11 12:21:03.886764346 +0000 UTC m=+1350.478028313" observedRunningTime="2026-03-11 12:21:04.774267395 +0000 UTC m=+1351.365531362" watchObservedRunningTime="2026-03-11 12:21:04.780213425 +0000 UTC m=+1351.371477392" Mar 11 12:21:05 crc kubenswrapper[4816]: I0311 12:21:05.021849 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 11 12:21:08 crc kubenswrapper[4816]: I0311 12:21:08.689472 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:21:08 crc kubenswrapper[4816]: I0311 12:21:08.762934 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33811121-46de-4941-bd74-18ecaa2c2827-combined-ca-bundle\") pod \"33811121-46de-4941-bd74-18ecaa2c2827\" (UID: \"33811121-46de-4941-bd74-18ecaa2c2827\") " Mar 11 12:21:08 crc kubenswrapper[4816]: I0311 12:21:08.763168 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33811121-46de-4941-bd74-18ecaa2c2827-config-data\") pod \"33811121-46de-4941-bd74-18ecaa2c2827\" (UID: \"33811121-46de-4941-bd74-18ecaa2c2827\") " Mar 11 12:21:08 crc kubenswrapper[4816]: I0311 12:21:08.763283 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4xh6\" (UniqueName: \"kubernetes.io/projected/33811121-46de-4941-bd74-18ecaa2c2827-kube-api-access-t4xh6\") pod \"33811121-46de-4941-bd74-18ecaa2c2827\" (UID: \"33811121-46de-4941-bd74-18ecaa2c2827\") " Mar 11 12:21:08 crc kubenswrapper[4816]: I0311 12:21:08.773625 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33811121-46de-4941-bd74-18ecaa2c2827-kube-api-access-t4xh6" (OuterVolumeSpecName: "kube-api-access-t4xh6") pod "33811121-46de-4941-bd74-18ecaa2c2827" (UID: "33811121-46de-4941-bd74-18ecaa2c2827"). InnerVolumeSpecName "kube-api-access-t4xh6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:21:08 crc kubenswrapper[4816]: I0311 12:21:08.800267 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33811121-46de-4941-bd74-18ecaa2c2827-config-data" (OuterVolumeSpecName: "config-data") pod "33811121-46de-4941-bd74-18ecaa2c2827" (UID: "33811121-46de-4941-bd74-18ecaa2c2827"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:21:08 crc kubenswrapper[4816]: I0311 12:21:08.801398 4816 generic.go:334] "Generic (PLEG): container finished" podID="33811121-46de-4941-bd74-18ecaa2c2827" containerID="63a3d6bc49f6f2dab318870b71436674479a2cc4b62d79ccb00a4a4a963f01a0" exitCode=137 Mar 11 12:21:08 crc kubenswrapper[4816]: I0311 12:21:08.801454 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"33811121-46de-4941-bd74-18ecaa2c2827","Type":"ContainerDied","Data":"63a3d6bc49f6f2dab318870b71436674479a2cc4b62d79ccb00a4a4a963f01a0"} Mar 11 12:21:08 crc kubenswrapper[4816]: I0311 12:21:08.801495 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"33811121-46de-4941-bd74-18ecaa2c2827","Type":"ContainerDied","Data":"cc7188d0b18641404663ed171ec3812667a2d4778de79e666b89e8d42f9ec1e9"} Mar 11 12:21:08 crc kubenswrapper[4816]: I0311 12:21:08.801514 4816 scope.go:117] "RemoveContainer" containerID="63a3d6bc49f6f2dab318870b71436674479a2cc4b62d79ccb00a4a4a963f01a0" Mar 11 12:21:08 crc kubenswrapper[4816]: I0311 12:21:08.801763 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:21:08 crc kubenswrapper[4816]: I0311 12:21:08.809034 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33811121-46de-4941-bd74-18ecaa2c2827-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "33811121-46de-4941-bd74-18ecaa2c2827" (UID: "33811121-46de-4941-bd74-18ecaa2c2827"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:21:08 crc kubenswrapper[4816]: I0311 12:21:08.865523 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33811121-46de-4941-bd74-18ecaa2c2827-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:08 crc kubenswrapper[4816]: I0311 12:21:08.865575 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4xh6\" (UniqueName: \"kubernetes.io/projected/33811121-46de-4941-bd74-18ecaa2c2827-kube-api-access-t4xh6\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:08 crc kubenswrapper[4816]: I0311 12:21:08.865589 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33811121-46de-4941-bd74-18ecaa2c2827-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:08 crc kubenswrapper[4816]: I0311 12:21:08.886812 4816 scope.go:117] "RemoveContainer" containerID="63a3d6bc49f6f2dab318870b71436674479a2cc4b62d79ccb00a4a4a963f01a0" Mar 11 12:21:08 crc kubenswrapper[4816]: E0311 12:21:08.887345 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63a3d6bc49f6f2dab318870b71436674479a2cc4b62d79ccb00a4a4a963f01a0\": container with ID starting with 63a3d6bc49f6f2dab318870b71436674479a2cc4b62d79ccb00a4a4a963f01a0 not found: ID does not exist" containerID="63a3d6bc49f6f2dab318870b71436674479a2cc4b62d79ccb00a4a4a963f01a0" Mar 11 12:21:08 crc kubenswrapper[4816]: I0311 12:21:08.887424 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63a3d6bc49f6f2dab318870b71436674479a2cc4b62d79ccb00a4a4a963f01a0"} err="failed to get container status \"63a3d6bc49f6f2dab318870b71436674479a2cc4b62d79ccb00a4a4a963f01a0\": rpc error: code = NotFound desc = could not find container \"63a3d6bc49f6f2dab318870b71436674479a2cc4b62d79ccb00a4a4a963f01a0\": container with ID 
starting with 63a3d6bc49f6f2dab318870b71436674479a2cc4b62d79ccb00a4a4a963f01a0 not found: ID does not exist" Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.154185 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.168158 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.203343 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 11 12:21:09 crc kubenswrapper[4816]: E0311 12:21:09.207514 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33811121-46de-4941-bd74-18ecaa2c2827" containerName="nova-cell1-novncproxy-novncproxy" Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.207577 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="33811121-46de-4941-bd74-18ecaa2c2827" containerName="nova-cell1-novncproxy-novncproxy" Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.208188 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="33811121-46de-4941-bd74-18ecaa2c2827" containerName="nova-cell1-novncproxy-novncproxy" Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.209895 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.220143 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.220492 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.220797 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.227334 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.277834 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkd68\" (UniqueName: \"kubernetes.io/projected/fd796be0-d1ac-47be-8162-3b1c42febc0a-kube-api-access-jkd68\") pod \"nova-cell1-novncproxy-0\" (UID: \"fd796be0-d1ac-47be-8162-3b1c42febc0a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.277941 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd796be0-d1ac-47be-8162-3b1c42febc0a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fd796be0-d1ac-47be-8162-3b1c42febc0a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.278379 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd796be0-d1ac-47be-8162-3b1c42febc0a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fd796be0-d1ac-47be-8162-3b1c42febc0a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:21:09 crc 
kubenswrapper[4816]: I0311 12:21:09.278455 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd796be0-d1ac-47be-8162-3b1c42febc0a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fd796be0-d1ac-47be-8162-3b1c42febc0a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.278494 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd796be0-d1ac-47be-8162-3b1c42febc0a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fd796be0-d1ac-47be-8162-3b1c42febc0a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.381835 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd796be0-d1ac-47be-8162-3b1c42febc0a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fd796be0-d1ac-47be-8162-3b1c42febc0a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.381946 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd796be0-d1ac-47be-8162-3b1c42febc0a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fd796be0-d1ac-47be-8162-3b1c42febc0a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.381988 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd796be0-d1ac-47be-8162-3b1c42febc0a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fd796be0-d1ac-47be-8162-3b1c42febc0a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 
12:21:09.382040 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkd68\" (UniqueName: \"kubernetes.io/projected/fd796be0-d1ac-47be-8162-3b1c42febc0a-kube-api-access-jkd68\") pod \"nova-cell1-novncproxy-0\" (UID: \"fd796be0-d1ac-47be-8162-3b1c42febc0a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.382085 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd796be0-d1ac-47be-8162-3b1c42febc0a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fd796be0-d1ac-47be-8162-3b1c42febc0a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.388676 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd796be0-d1ac-47be-8162-3b1c42febc0a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fd796be0-d1ac-47be-8162-3b1c42febc0a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.390478 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd796be0-d1ac-47be-8162-3b1c42febc0a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fd796be0-d1ac-47be-8162-3b1c42febc0a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.390991 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd796be0-d1ac-47be-8162-3b1c42febc0a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fd796be0-d1ac-47be-8162-3b1c42febc0a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.400325 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd796be0-d1ac-47be-8162-3b1c42febc0a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fd796be0-d1ac-47be-8162-3b1c42febc0a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.400797 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkd68\" (UniqueName: \"kubernetes.io/projected/fd796be0-d1ac-47be-8162-3b1c42febc0a-kube-api-access-jkd68\") pod \"nova-cell1-novncproxy-0\" (UID: \"fd796be0-d1ac-47be-8162-3b1c42febc0a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.514679 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.514737 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.554580 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.813128 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.814888 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.831980 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 11 12:21:10 crc kubenswrapper[4816]: I0311 12:21:10.071586 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 11 12:21:10 crc kubenswrapper[4816]: I0311 12:21:10.162492 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33811121-46de-4941-bd74-18ecaa2c2827" path="/var/lib/kubelet/pods/33811121-46de-4941-bd74-18ecaa2c2827/volumes" Mar 11 12:21:10 crc kubenswrapper[4816]: I0311 12:21:10.831846 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fd796be0-d1ac-47be-8162-3b1c42febc0a","Type":"ContainerStarted","Data":"de21378c0051d3ac4940fe242c0e851f880805f3d01edc4d6ef2444f52ded95e"} Mar 11 12:21:10 crc kubenswrapper[4816]: I0311 12:21:10.831925 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fd796be0-d1ac-47be-8162-3b1c42febc0a","Type":"ContainerStarted","Data":"73799c30d5d3ab5fe26ad3cf5939299dea4d34493e455f7bcdac484f34941957"} Mar 11 12:21:10 crc kubenswrapper[4816]: I0311 12:21:10.858120 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.858089653 podStartE2EDuration="1.858089653s" podCreationTimestamp="2026-03-11 12:21:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-03-11 12:21:10.852105612 +0000 UTC m=+1357.443369579" watchObservedRunningTime="2026-03-11 12:21:10.858089653 +0000 UTC m=+1357.449353650" Mar 11 12:21:10 crc kubenswrapper[4816]: I0311 12:21:10.860343 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 11 12:21:12 crc kubenswrapper[4816]: I0311 12:21:12.924807 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 11 12:21:12 crc kubenswrapper[4816]: I0311 12:21:12.925997 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 11 12:21:12 crc kubenswrapper[4816]: I0311 12:21:12.926911 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 11 12:21:12 crc kubenswrapper[4816]: I0311 12:21:12.927195 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 11 12:21:12 crc kubenswrapper[4816]: I0311 12:21:12.930816 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 11 12:21:12 crc kubenswrapper[4816]: I0311 12:21:12.932556 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 11 12:21:13 crc kubenswrapper[4816]: I0311 12:21:13.203897 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-fdb8f6449-7h7r8"] Mar 11 12:21:13 crc kubenswrapper[4816]: I0311 12:21:13.206410 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" Mar 11 12:21:13 crc kubenswrapper[4816]: I0311 12:21:13.223852 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fdb8f6449-7h7r8"] Mar 11 12:21:13 crc kubenswrapper[4816]: I0311 12:21:13.294289 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-dns-svc\") pod \"dnsmasq-dns-fdb8f6449-7h7r8\" (UID: \"32a279c7-00a8-4e98-8356-91e219416a22\") " pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" Mar 11 12:21:13 crc kubenswrapper[4816]: I0311 12:21:13.294531 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-dns-swift-storage-0\") pod \"dnsmasq-dns-fdb8f6449-7h7r8\" (UID: \"32a279c7-00a8-4e98-8356-91e219416a22\") " pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" Mar 11 12:21:13 crc kubenswrapper[4816]: I0311 12:21:13.294626 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-config\") pod \"dnsmasq-dns-fdb8f6449-7h7r8\" (UID: \"32a279c7-00a8-4e98-8356-91e219416a22\") " pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" Mar 11 12:21:13 crc kubenswrapper[4816]: I0311 12:21:13.294766 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-ovsdbserver-nb\") pod \"dnsmasq-dns-fdb8f6449-7h7r8\" (UID: \"32a279c7-00a8-4e98-8356-91e219416a22\") " pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" Mar 11 12:21:13 crc kubenswrapper[4816]: I0311 12:21:13.294886 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-ovsdbserver-sb\") pod \"dnsmasq-dns-fdb8f6449-7h7r8\" (UID: \"32a279c7-00a8-4e98-8356-91e219416a22\") " pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" Mar 11 12:21:13 crc kubenswrapper[4816]: I0311 12:21:13.295020 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfg5z\" (UniqueName: \"kubernetes.io/projected/32a279c7-00a8-4e98-8356-91e219416a22-kube-api-access-jfg5z\") pod \"dnsmasq-dns-fdb8f6449-7h7r8\" (UID: \"32a279c7-00a8-4e98-8356-91e219416a22\") " pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" Mar 11 12:21:13 crc kubenswrapper[4816]: I0311 12:21:13.396209 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-ovsdbserver-nb\") pod \"dnsmasq-dns-fdb8f6449-7h7r8\" (UID: \"32a279c7-00a8-4e98-8356-91e219416a22\") " pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" Mar 11 12:21:13 crc kubenswrapper[4816]: I0311 12:21:13.396346 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-ovsdbserver-sb\") pod \"dnsmasq-dns-fdb8f6449-7h7r8\" (UID: \"32a279c7-00a8-4e98-8356-91e219416a22\") " pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" Mar 11 12:21:13 crc kubenswrapper[4816]: I0311 12:21:13.396413 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfg5z\" (UniqueName: \"kubernetes.io/projected/32a279c7-00a8-4e98-8356-91e219416a22-kube-api-access-jfg5z\") pod \"dnsmasq-dns-fdb8f6449-7h7r8\" (UID: \"32a279c7-00a8-4e98-8356-91e219416a22\") " pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" Mar 11 12:21:13 crc kubenswrapper[4816]: I0311 12:21:13.396455 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-dns-svc\") pod \"dnsmasq-dns-fdb8f6449-7h7r8\" (UID: \"32a279c7-00a8-4e98-8356-91e219416a22\") " pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" Mar 11 12:21:13 crc kubenswrapper[4816]: I0311 12:21:13.396499 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-dns-swift-storage-0\") pod \"dnsmasq-dns-fdb8f6449-7h7r8\" (UID: \"32a279c7-00a8-4e98-8356-91e219416a22\") " pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" Mar 11 12:21:13 crc kubenswrapper[4816]: I0311 12:21:13.396522 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-config\") pod \"dnsmasq-dns-fdb8f6449-7h7r8\" (UID: \"32a279c7-00a8-4e98-8356-91e219416a22\") " pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" Mar 11 12:21:13 crc kubenswrapper[4816]: I0311 12:21:13.397699 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-config\") pod \"dnsmasq-dns-fdb8f6449-7h7r8\" (UID: \"32a279c7-00a8-4e98-8356-91e219416a22\") " pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" Mar 11 12:21:13 crc kubenswrapper[4816]: I0311 12:21:13.397713 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-ovsdbserver-sb\") pod \"dnsmasq-dns-fdb8f6449-7h7r8\" (UID: \"32a279c7-00a8-4e98-8356-91e219416a22\") " pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" Mar 11 12:21:13 crc kubenswrapper[4816]: I0311 12:21:13.397824 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-ovsdbserver-nb\") pod 
\"dnsmasq-dns-fdb8f6449-7h7r8\" (UID: \"32a279c7-00a8-4e98-8356-91e219416a22\") " pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" Mar 11 12:21:13 crc kubenswrapper[4816]: I0311 12:21:13.398139 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-dns-swift-storage-0\") pod \"dnsmasq-dns-fdb8f6449-7h7r8\" (UID: \"32a279c7-00a8-4e98-8356-91e219416a22\") " pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" Mar 11 12:21:13 crc kubenswrapper[4816]: I0311 12:21:13.398479 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-dns-svc\") pod \"dnsmasq-dns-fdb8f6449-7h7r8\" (UID: \"32a279c7-00a8-4e98-8356-91e219416a22\") " pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" Mar 11 12:21:13 crc kubenswrapper[4816]: I0311 12:21:13.417694 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfg5z\" (UniqueName: \"kubernetes.io/projected/32a279c7-00a8-4e98-8356-91e219416a22-kube-api-access-jfg5z\") pod \"dnsmasq-dns-fdb8f6449-7h7r8\" (UID: \"32a279c7-00a8-4e98-8356-91e219416a22\") " pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" Mar 11 12:21:13 crc kubenswrapper[4816]: I0311 12:21:13.542971 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" Mar 11 12:21:14 crc kubenswrapper[4816]: I0311 12:21:14.099292 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fdb8f6449-7h7r8"] Mar 11 12:21:14 crc kubenswrapper[4816]: I0311 12:21:14.555675 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:21:14 crc kubenswrapper[4816]: I0311 12:21:14.878145 4816 generic.go:334] "Generic (PLEG): container finished" podID="32a279c7-00a8-4e98-8356-91e219416a22" containerID="1876def9a0f72b0ad981ff600f29fd745c0daa03affcc6a0a2083718b834badc" exitCode=0 Mar 11 12:21:14 crc kubenswrapper[4816]: I0311 12:21:14.878423 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" event={"ID":"32a279c7-00a8-4e98-8356-91e219416a22","Type":"ContainerDied","Data":"1876def9a0f72b0ad981ff600f29fd745c0daa03affcc6a0a2083718b834badc"} Mar 11 12:21:14 crc kubenswrapper[4816]: I0311 12:21:14.880001 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" event={"ID":"32a279c7-00a8-4e98-8356-91e219416a22","Type":"ContainerStarted","Data":"88f0e5edf59a2c15eb9814f01d499e770f690a88f8bf62d0decdbb14e939c9e6"} Mar 11 12:21:15 crc kubenswrapper[4816]: I0311 12:21:15.321360 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:21:15 crc kubenswrapper[4816]: I0311 12:21:15.322098 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="940c2849-ce30-473c-9a55-b4fc35309bb7" containerName="ceilometer-central-agent" containerID="cri-o://f34f311dd5e09eaf0d55b74ce05a1c2288151feba906d4592da42d922bbba2ff" gracePeriod=30 Mar 11 12:21:15 crc kubenswrapper[4816]: I0311 12:21:15.322158 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="940c2849-ce30-473c-9a55-b4fc35309bb7" containerName="proxy-httpd" containerID="cri-o://692a59aab8ed0fb0929a6f8bf9b1c8bc207d5d1a602944a57acd9df588a34763" gracePeriod=30 Mar 11 12:21:15 crc kubenswrapper[4816]: I0311 12:21:15.322223 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="940c2849-ce30-473c-9a55-b4fc35309bb7" containerName="ceilometer-notification-agent" containerID="cri-o://5e35557f09b19997180c50735b550f53beb23be40ddd5bfc3e12122521653c8b" gracePeriod=30 Mar 11 12:21:15 crc kubenswrapper[4816]: I0311 12:21:15.322158 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="940c2849-ce30-473c-9a55-b4fc35309bb7" containerName="sg-core" containerID="cri-o://f66037d7df88f58d61e86637f837290145123e6de95ebca3b16d8bb882106aec" gracePeriod=30 Mar 11 12:21:15 crc kubenswrapper[4816]: I0311 12:21:15.339686 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="940c2849-ce30-473c-9a55-b4fc35309bb7" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 11 12:21:15 crc kubenswrapper[4816]: I0311 12:21:15.948501 4816 generic.go:334] "Generic (PLEG): container finished" podID="940c2849-ce30-473c-9a55-b4fc35309bb7" containerID="692a59aab8ed0fb0929a6f8bf9b1c8bc207d5d1a602944a57acd9df588a34763" exitCode=0 Mar 11 12:21:15 crc kubenswrapper[4816]: I0311 12:21:15.948548 4816 generic.go:334] "Generic (PLEG): container finished" podID="940c2849-ce30-473c-9a55-b4fc35309bb7" containerID="f66037d7df88f58d61e86637f837290145123e6de95ebca3b16d8bb882106aec" exitCode=2 Mar 11 12:21:15 crc kubenswrapper[4816]: I0311 12:21:15.948561 4816 generic.go:334] "Generic (PLEG): container finished" podID="940c2849-ce30-473c-9a55-b4fc35309bb7" containerID="f34f311dd5e09eaf0d55b74ce05a1c2288151feba906d4592da42d922bbba2ff" exitCode=0 Mar 11 12:21:15 crc kubenswrapper[4816]: I0311 12:21:15.948643 
4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"940c2849-ce30-473c-9a55-b4fc35309bb7","Type":"ContainerDied","Data":"692a59aab8ed0fb0929a6f8bf9b1c8bc207d5d1a602944a57acd9df588a34763"} Mar 11 12:21:15 crc kubenswrapper[4816]: I0311 12:21:15.948681 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"940c2849-ce30-473c-9a55-b4fc35309bb7","Type":"ContainerDied","Data":"f66037d7df88f58d61e86637f837290145123e6de95ebca3b16d8bb882106aec"} Mar 11 12:21:15 crc kubenswrapper[4816]: I0311 12:21:15.948695 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"940c2849-ce30-473c-9a55-b4fc35309bb7","Type":"ContainerDied","Data":"f34f311dd5e09eaf0d55b74ce05a1c2288151feba906d4592da42d922bbba2ff"} Mar 11 12:21:15 crc kubenswrapper[4816]: I0311 12:21:15.966270 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" event={"ID":"32a279c7-00a8-4e98-8356-91e219416a22","Type":"ContainerStarted","Data":"373cac1249bba137b237fe973a3b7880bfcca6318c8db162f6ca4526fa918835"} Mar 11 12:21:15 crc kubenswrapper[4816]: I0311 12:21:15.968330 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" Mar 11 12:21:16 crc kubenswrapper[4816]: E0311 12:21:16.095383 4816 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod940c2849_ce30_473c_9a55_b4fc35309bb7.slice/crio-5e35557f09b19997180c50735b550f53beb23be40ddd5bfc3e12122521653c8b.scope\": RecentStats: unable to find data in memory cache]" Mar 11 12:21:16 crc kubenswrapper[4816]: I0311 12:21:16.865800 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" podStartSLOduration=3.865776857 podStartE2EDuration="3.865776857s" 
podCreationTimestamp="2026-03-11 12:21:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:21:16.002745143 +0000 UTC m=+1362.594009100" watchObservedRunningTime="2026-03-11 12:21:16.865776857 +0000 UTC m=+1363.457040824" Mar 11 12:21:16 crc kubenswrapper[4816]: I0311 12:21:16.872554 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 11 12:21:16 crc kubenswrapper[4816]: I0311 12:21:16.872864 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8b44498c-88b3-42e4-b8cd-322579c29a3e" containerName="nova-api-log" containerID="cri-o://ebc1320569afb95200b6afc79a82f2129b48fb5a36a78a5a136bc96434e738f9" gracePeriod=30 Mar 11 12:21:16 crc kubenswrapper[4816]: I0311 12:21:16.873450 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8b44498c-88b3-42e4-b8cd-322579c29a3e" containerName="nova-api-api" containerID="cri-o://e810fb5b31af6c475fbc24ea8c5aeb1d4e563c63735517ab9297f7db17e0ca5a" gracePeriod=30 Mar 11 12:21:16 crc kubenswrapper[4816]: I0311 12:21:16.887835 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:21:16 crc kubenswrapper[4816]: I0311 12:21:16.926271 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8x2sh\" (UniqueName: \"kubernetes.io/projected/940c2849-ce30-473c-9a55-b4fc35309bb7-kube-api-access-8x2sh\") pod \"940c2849-ce30-473c-9a55-b4fc35309bb7\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " Mar 11 12:21:16 crc kubenswrapper[4816]: I0311 12:21:16.926441 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-ceilometer-tls-certs\") pod \"940c2849-ce30-473c-9a55-b4fc35309bb7\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " Mar 11 12:21:16 crc kubenswrapper[4816]: I0311 12:21:16.926486 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-combined-ca-bundle\") pod \"940c2849-ce30-473c-9a55-b4fc35309bb7\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " Mar 11 12:21:16 crc kubenswrapper[4816]: I0311 12:21:16.926572 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/940c2849-ce30-473c-9a55-b4fc35309bb7-run-httpd\") pod \"940c2849-ce30-473c-9a55-b4fc35309bb7\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " Mar 11 12:21:16 crc kubenswrapper[4816]: I0311 12:21:16.926650 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-sg-core-conf-yaml\") pod \"940c2849-ce30-473c-9a55-b4fc35309bb7\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " Mar 11 12:21:16 crc kubenswrapper[4816]: I0311 12:21:16.926676 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-scripts\") pod \"940c2849-ce30-473c-9a55-b4fc35309bb7\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " Mar 11 12:21:16 crc kubenswrapper[4816]: I0311 12:21:16.934316 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-config-data\") pod \"940c2849-ce30-473c-9a55-b4fc35309bb7\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " Mar 11 12:21:16 crc kubenswrapper[4816]: I0311 12:21:16.934430 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/940c2849-ce30-473c-9a55-b4fc35309bb7-log-httpd\") pod \"940c2849-ce30-473c-9a55-b4fc35309bb7\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " Mar 11 12:21:16 crc kubenswrapper[4816]: I0311 12:21:16.936646 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/940c2849-ce30-473c-9a55-b4fc35309bb7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "940c2849-ce30-473c-9a55-b4fc35309bb7" (UID: "940c2849-ce30-473c-9a55-b4fc35309bb7"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:21:16 crc kubenswrapper[4816]: I0311 12:21:16.937102 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/940c2849-ce30-473c-9a55-b4fc35309bb7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "940c2849-ce30-473c-9a55-b4fc35309bb7" (UID: "940c2849-ce30-473c-9a55-b4fc35309bb7"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:21:16 crc kubenswrapper[4816]: I0311 12:21:16.943817 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/940c2849-ce30-473c-9a55-b4fc35309bb7-kube-api-access-8x2sh" (OuterVolumeSpecName: "kube-api-access-8x2sh") pod "940c2849-ce30-473c-9a55-b4fc35309bb7" (UID: "940c2849-ce30-473c-9a55-b4fc35309bb7"). InnerVolumeSpecName "kube-api-access-8x2sh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:21:16 crc kubenswrapper[4816]: I0311 12:21:16.950944 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-scripts" (OuterVolumeSpecName: "scripts") pod "940c2849-ce30-473c-9a55-b4fc35309bb7" (UID: "940c2849-ce30-473c-9a55-b4fc35309bb7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:21:16 crc kubenswrapper[4816]: I0311 12:21:16.992813 4816 generic.go:334] "Generic (PLEG): container finished" podID="940c2849-ce30-473c-9a55-b4fc35309bb7" containerID="5e35557f09b19997180c50735b550f53beb23be40ddd5bfc3e12122521653c8b" exitCode=0 Mar 11 12:21:16 crc kubenswrapper[4816]: I0311 12:21:16.993344 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:21:16 crc kubenswrapper[4816]: I0311 12:21:16.994144 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"940c2849-ce30-473c-9a55-b4fc35309bb7","Type":"ContainerDied","Data":"5e35557f09b19997180c50735b550f53beb23be40ddd5bfc3e12122521653c8b"} Mar 11 12:21:16 crc kubenswrapper[4816]: I0311 12:21:16.994183 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"940c2849-ce30-473c-9a55-b4fc35309bb7","Type":"ContainerDied","Data":"27271a41ca8a8c294fc6aef8e91d6f6174e9af95c25d74da01a3a36774d13fba"} Mar 11 12:21:16 crc kubenswrapper[4816]: I0311 12:21:16.994205 4816 scope.go:117] "RemoveContainer" containerID="692a59aab8ed0fb0929a6f8bf9b1c8bc207d5d1a602944a57acd9df588a34763" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.012501 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "940c2849-ce30-473c-9a55-b4fc35309bb7" (UID: "940c2849-ce30-473c-9a55-b4fc35309bb7"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.038758 4816 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/940c2849-ce30-473c-9a55-b4fc35309bb7-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.038800 4816 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.038811 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.038824 4816 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/940c2849-ce30-473c-9a55-b4fc35309bb7-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.038832 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8x2sh\" (UniqueName: \"kubernetes.io/projected/940c2849-ce30-473c-9a55-b4fc35309bb7-kube-api-access-8x2sh\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.063120 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "940c2849-ce30-473c-9a55-b4fc35309bb7" (UID: "940c2849-ce30-473c-9a55-b4fc35309bb7"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.070188 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "940c2849-ce30-473c-9a55-b4fc35309bb7" (UID: "940c2849-ce30-473c-9a55-b4fc35309bb7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.080820 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-config-data" (OuterVolumeSpecName: "config-data") pod "940c2849-ce30-473c-9a55-b4fc35309bb7" (UID: "940c2849-ce30-473c-9a55-b4fc35309bb7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.116765 4816 scope.go:117] "RemoveContainer" containerID="f66037d7df88f58d61e86637f837290145123e6de95ebca3b16d8bb882106aec" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.141152 4816 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.141324 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.141418 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.141224 4816 scope.go:117] 
"RemoveContainer" containerID="5e35557f09b19997180c50735b550f53beb23be40ddd5bfc3e12122521653c8b" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.168506 4816 scope.go:117] "RemoveContainer" containerID="f34f311dd5e09eaf0d55b74ce05a1c2288151feba906d4592da42d922bbba2ff" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.193459 4816 scope.go:117] "RemoveContainer" containerID="692a59aab8ed0fb0929a6f8bf9b1c8bc207d5d1a602944a57acd9df588a34763" Mar 11 12:21:17 crc kubenswrapper[4816]: E0311 12:21:17.194080 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"692a59aab8ed0fb0929a6f8bf9b1c8bc207d5d1a602944a57acd9df588a34763\": container with ID starting with 692a59aab8ed0fb0929a6f8bf9b1c8bc207d5d1a602944a57acd9df588a34763 not found: ID does not exist" containerID="692a59aab8ed0fb0929a6f8bf9b1c8bc207d5d1a602944a57acd9df588a34763" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.194136 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"692a59aab8ed0fb0929a6f8bf9b1c8bc207d5d1a602944a57acd9df588a34763"} err="failed to get container status \"692a59aab8ed0fb0929a6f8bf9b1c8bc207d5d1a602944a57acd9df588a34763\": rpc error: code = NotFound desc = could not find container \"692a59aab8ed0fb0929a6f8bf9b1c8bc207d5d1a602944a57acd9df588a34763\": container with ID starting with 692a59aab8ed0fb0929a6f8bf9b1c8bc207d5d1a602944a57acd9df588a34763 not found: ID does not exist" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.194169 4816 scope.go:117] "RemoveContainer" containerID="f66037d7df88f58d61e86637f837290145123e6de95ebca3b16d8bb882106aec" Mar 11 12:21:17 crc kubenswrapper[4816]: E0311 12:21:17.194735 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f66037d7df88f58d61e86637f837290145123e6de95ebca3b16d8bb882106aec\": container with ID starting with 
f66037d7df88f58d61e86637f837290145123e6de95ebca3b16d8bb882106aec not found: ID does not exist" containerID="f66037d7df88f58d61e86637f837290145123e6de95ebca3b16d8bb882106aec" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.194860 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f66037d7df88f58d61e86637f837290145123e6de95ebca3b16d8bb882106aec"} err="failed to get container status \"f66037d7df88f58d61e86637f837290145123e6de95ebca3b16d8bb882106aec\": rpc error: code = NotFound desc = could not find container \"f66037d7df88f58d61e86637f837290145123e6de95ebca3b16d8bb882106aec\": container with ID starting with f66037d7df88f58d61e86637f837290145123e6de95ebca3b16d8bb882106aec not found: ID does not exist" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.194973 4816 scope.go:117] "RemoveContainer" containerID="5e35557f09b19997180c50735b550f53beb23be40ddd5bfc3e12122521653c8b" Mar 11 12:21:17 crc kubenswrapper[4816]: E0311 12:21:17.195365 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e35557f09b19997180c50735b550f53beb23be40ddd5bfc3e12122521653c8b\": container with ID starting with 5e35557f09b19997180c50735b550f53beb23be40ddd5bfc3e12122521653c8b not found: ID does not exist" containerID="5e35557f09b19997180c50735b550f53beb23be40ddd5bfc3e12122521653c8b" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.195410 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e35557f09b19997180c50735b550f53beb23be40ddd5bfc3e12122521653c8b"} err="failed to get container status \"5e35557f09b19997180c50735b550f53beb23be40ddd5bfc3e12122521653c8b\": rpc error: code = NotFound desc = could not find container \"5e35557f09b19997180c50735b550f53beb23be40ddd5bfc3e12122521653c8b\": container with ID starting with 5e35557f09b19997180c50735b550f53beb23be40ddd5bfc3e12122521653c8b not found: ID does not 
exist" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.195471 4816 scope.go:117] "RemoveContainer" containerID="f34f311dd5e09eaf0d55b74ce05a1c2288151feba906d4592da42d922bbba2ff" Mar 11 12:21:17 crc kubenswrapper[4816]: E0311 12:21:17.196000 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f34f311dd5e09eaf0d55b74ce05a1c2288151feba906d4592da42d922bbba2ff\": container with ID starting with f34f311dd5e09eaf0d55b74ce05a1c2288151feba906d4592da42d922bbba2ff not found: ID does not exist" containerID="f34f311dd5e09eaf0d55b74ce05a1c2288151feba906d4592da42d922bbba2ff" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.196021 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f34f311dd5e09eaf0d55b74ce05a1c2288151feba906d4592da42d922bbba2ff"} err="failed to get container status \"f34f311dd5e09eaf0d55b74ce05a1c2288151feba906d4592da42d922bbba2ff\": rpc error: code = NotFound desc = could not find container \"f34f311dd5e09eaf0d55b74ce05a1c2288151feba906d4592da42d922bbba2ff\": container with ID starting with f34f311dd5e09eaf0d55b74ce05a1c2288151feba906d4592da42d922bbba2ff not found: ID does not exist" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.333757 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.346163 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.366146 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:21:17 crc kubenswrapper[4816]: E0311 12:21:17.366716 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="940c2849-ce30-473c-9a55-b4fc35309bb7" containerName="proxy-httpd" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.366741 4816 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="940c2849-ce30-473c-9a55-b4fc35309bb7" containerName="proxy-httpd" Mar 11 12:21:17 crc kubenswrapper[4816]: E0311 12:21:17.366752 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="940c2849-ce30-473c-9a55-b4fc35309bb7" containerName="ceilometer-central-agent" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.366759 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="940c2849-ce30-473c-9a55-b4fc35309bb7" containerName="ceilometer-central-agent" Mar 11 12:21:17 crc kubenswrapper[4816]: E0311 12:21:17.366786 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="940c2849-ce30-473c-9a55-b4fc35309bb7" containerName="sg-core" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.366793 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="940c2849-ce30-473c-9a55-b4fc35309bb7" containerName="sg-core" Mar 11 12:21:17 crc kubenswrapper[4816]: E0311 12:21:17.366813 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="940c2849-ce30-473c-9a55-b4fc35309bb7" containerName="ceilometer-notification-agent" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.366820 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="940c2849-ce30-473c-9a55-b4fc35309bb7" containerName="ceilometer-notification-agent" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.367012 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="940c2849-ce30-473c-9a55-b4fc35309bb7" containerName="ceilometer-notification-agent" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.367025 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="940c2849-ce30-473c-9a55-b4fc35309bb7" containerName="sg-core" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.367036 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="940c2849-ce30-473c-9a55-b4fc35309bb7" containerName="ceilometer-central-agent" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.367044 4816 
memory_manager.go:354] "RemoveStaleState removing state" podUID="940c2849-ce30-473c-9a55-b4fc35309bb7" containerName="proxy-httpd" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.369037 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.376281 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.376565 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.376914 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.392082 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.447748 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/389a1019-c47b-449b-ac46-f0271ba70c0b-log-httpd\") pod \"ceilometer-0\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " pod="openstack/ceilometer-0" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.447857 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " pod="openstack/ceilometer-0" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.448068 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-config-data\") pod \"ceilometer-0\" (UID: 
\"389a1019-c47b-449b-ac46-f0271ba70c0b\") " pod="openstack/ceilometer-0" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.448190 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-scripts\") pod \"ceilometer-0\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " pod="openstack/ceilometer-0" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.448383 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2hpm\" (UniqueName: \"kubernetes.io/projected/389a1019-c47b-449b-ac46-f0271ba70c0b-kube-api-access-h2hpm\") pod \"ceilometer-0\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " pod="openstack/ceilometer-0" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.448470 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " pod="openstack/ceilometer-0" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.448708 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " pod="openstack/ceilometer-0" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.448944 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/389a1019-c47b-449b-ac46-f0271ba70c0b-run-httpd\") pod \"ceilometer-0\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " pod="openstack/ceilometer-0" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 
12:21:17.550956 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " pod="openstack/ceilometer-0" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.551030 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-config-data\") pod \"ceilometer-0\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " pod="openstack/ceilometer-0" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.551054 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-scripts\") pod \"ceilometer-0\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " pod="openstack/ceilometer-0" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.551093 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2hpm\" (UniqueName: \"kubernetes.io/projected/389a1019-c47b-449b-ac46-f0271ba70c0b-kube-api-access-h2hpm\") pod \"ceilometer-0\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " pod="openstack/ceilometer-0" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.551120 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " pod="openstack/ceilometer-0" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.551188 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-ceilometer-tls-certs\") 
pod \"ceilometer-0\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " pod="openstack/ceilometer-0" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.551304 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/389a1019-c47b-449b-ac46-f0271ba70c0b-run-httpd\") pod \"ceilometer-0\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " pod="openstack/ceilometer-0" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.551375 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/389a1019-c47b-449b-ac46-f0271ba70c0b-log-httpd\") pod \"ceilometer-0\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " pod="openstack/ceilometer-0" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.552211 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/389a1019-c47b-449b-ac46-f0271ba70c0b-log-httpd\") pod \"ceilometer-0\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " pod="openstack/ceilometer-0" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.552299 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/389a1019-c47b-449b-ac46-f0271ba70c0b-run-httpd\") pod \"ceilometer-0\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " pod="openstack/ceilometer-0" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.558086 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " pod="openstack/ceilometer-0" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.558375 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " pod="openstack/ceilometer-0" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.560378 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-scripts\") pod \"ceilometer-0\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " pod="openstack/ceilometer-0" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.560715 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " pod="openstack/ceilometer-0" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.562001 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-config-data\") pod \"ceilometer-0\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " pod="openstack/ceilometer-0" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.572872 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2hpm\" (UniqueName: \"kubernetes.io/projected/389a1019-c47b-449b-ac46-f0271ba70c0b-kube-api-access-h2hpm\") pod \"ceilometer-0\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " pod="openstack/ceilometer-0" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.706905 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:21:18 crc kubenswrapper[4816]: I0311 12:21:18.006365 4816 generic.go:334] "Generic (PLEG): container finished" podID="8b44498c-88b3-42e4-b8cd-322579c29a3e" containerID="ebc1320569afb95200b6afc79a82f2129b48fb5a36a78a5a136bc96434e738f9" exitCode=143 Mar 11 12:21:18 crc kubenswrapper[4816]: I0311 12:21:18.006473 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8b44498c-88b3-42e4-b8cd-322579c29a3e","Type":"ContainerDied","Data":"ebc1320569afb95200b6afc79a82f2129b48fb5a36a78a5a136bc96434e738f9"} Mar 11 12:21:18 crc kubenswrapper[4816]: I0311 12:21:18.145093 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="940c2849-ce30-473c-9a55-b4fc35309bb7" path="/var/lib/kubelet/pods/940c2849-ce30-473c-9a55-b4fc35309bb7/volumes" Mar 11 12:21:18 crc kubenswrapper[4816]: I0311 12:21:18.262831 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:21:18 crc kubenswrapper[4816]: W0311 12:21:18.263275 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod389a1019_c47b_449b_ac46_f0271ba70c0b.slice/crio-831f866c847ed0c4a4e75849b87d63d375222a7a188ecc44b5169bd7010ae778 WatchSource:0}: Error finding container 831f866c847ed0c4a4e75849b87d63d375222a7a188ecc44b5169bd7010ae778: Status 404 returned error can't find the container with id 831f866c847ed0c4a4e75849b87d63d375222a7a188ecc44b5169bd7010ae778 Mar 11 12:21:19 crc kubenswrapper[4816]: I0311 12:21:19.029026 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"389a1019-c47b-449b-ac46-f0271ba70c0b","Type":"ContainerStarted","Data":"831f866c847ed0c4a4e75849b87d63d375222a7a188ecc44b5169bd7010ae778"} Mar 11 12:21:19 crc kubenswrapper[4816]: I0311 12:21:19.478370 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 
12:21:19 crc kubenswrapper[4816]: I0311 12:21:19.555478 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:21:19 crc kubenswrapper[4816]: I0311 12:21:19.593214 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.043170 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"389a1019-c47b-449b-ac46-f0271ba70c0b","Type":"ContainerStarted","Data":"3c33e4e96ad95d72d477f29cdd83ed17043f7147e78c943eda376d648d31d9b9"} Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.043671 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"389a1019-c47b-449b-ac46-f0271ba70c0b","Type":"ContainerStarted","Data":"86830a024f06c2328c22d5b921acb795fe3b73a3eede1e8d875dfee3806bd2ea"} Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.067010 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.362334 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-wsfdf"] Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.365944 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-wsfdf" Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.370839 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.371928 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.382707 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-wsfdf"] Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.425892 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7hxl\" (UniqueName: \"kubernetes.io/projected/36fadc66-c846-46c0-a002-efeb7656f2b8-kube-api-access-v7hxl\") pod \"nova-cell1-cell-mapping-wsfdf\" (UID: \"36fadc66-c846-46c0-a002-efeb7656f2b8\") " pod="openstack/nova-cell1-cell-mapping-wsfdf" Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.426380 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36fadc66-c846-46c0-a002-efeb7656f2b8-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-wsfdf\" (UID: \"36fadc66-c846-46c0-a002-efeb7656f2b8\") " pod="openstack/nova-cell1-cell-mapping-wsfdf" Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.426453 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36fadc66-c846-46c0-a002-efeb7656f2b8-config-data\") pod \"nova-cell1-cell-mapping-wsfdf\" (UID: \"36fadc66-c846-46c0-a002-efeb7656f2b8\") " pod="openstack/nova-cell1-cell-mapping-wsfdf" Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.426499 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/36fadc66-c846-46c0-a002-efeb7656f2b8-scripts\") pod \"nova-cell1-cell-mapping-wsfdf\" (UID: \"36fadc66-c846-46c0-a002-efeb7656f2b8\") " pod="openstack/nova-cell1-cell-mapping-wsfdf" Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.528311 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36fadc66-c846-46c0-a002-efeb7656f2b8-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-wsfdf\" (UID: \"36fadc66-c846-46c0-a002-efeb7656f2b8\") " pod="openstack/nova-cell1-cell-mapping-wsfdf" Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.528446 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36fadc66-c846-46c0-a002-efeb7656f2b8-config-data\") pod \"nova-cell1-cell-mapping-wsfdf\" (UID: \"36fadc66-c846-46c0-a002-efeb7656f2b8\") " pod="openstack/nova-cell1-cell-mapping-wsfdf" Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.528482 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36fadc66-c846-46c0-a002-efeb7656f2b8-scripts\") pod \"nova-cell1-cell-mapping-wsfdf\" (UID: \"36fadc66-c846-46c0-a002-efeb7656f2b8\") " pod="openstack/nova-cell1-cell-mapping-wsfdf" Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.528550 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7hxl\" (UniqueName: \"kubernetes.io/projected/36fadc66-c846-46c0-a002-efeb7656f2b8-kube-api-access-v7hxl\") pod \"nova-cell1-cell-mapping-wsfdf\" (UID: \"36fadc66-c846-46c0-a002-efeb7656f2b8\") " pod="openstack/nova-cell1-cell-mapping-wsfdf" Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.533286 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/36fadc66-c846-46c0-a002-efeb7656f2b8-scripts\") pod \"nova-cell1-cell-mapping-wsfdf\" (UID: \"36fadc66-c846-46c0-a002-efeb7656f2b8\") " pod="openstack/nova-cell1-cell-mapping-wsfdf" Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.535900 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36fadc66-c846-46c0-a002-efeb7656f2b8-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-wsfdf\" (UID: \"36fadc66-c846-46c0-a002-efeb7656f2b8\") " pod="openstack/nova-cell1-cell-mapping-wsfdf" Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.537680 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36fadc66-c846-46c0-a002-efeb7656f2b8-config-data\") pod \"nova-cell1-cell-mapping-wsfdf\" (UID: \"36fadc66-c846-46c0-a002-efeb7656f2b8\") " pod="openstack/nova-cell1-cell-mapping-wsfdf" Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.548023 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.554330 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7hxl\" (UniqueName: \"kubernetes.io/projected/36fadc66-c846-46c0-a002-efeb7656f2b8-kube-api-access-v7hxl\") pod \"nova-cell1-cell-mapping-wsfdf\" (UID: \"36fadc66-c846-46c0-a002-efeb7656f2b8\") " pod="openstack/nova-cell1-cell-mapping-wsfdf" Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.630712 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b44498c-88b3-42e4-b8cd-322579c29a3e-logs\") pod \"8b44498c-88b3-42e4-b8cd-322579c29a3e\" (UID: \"8b44498c-88b3-42e4-b8cd-322579c29a3e\") " Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.630886 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b44498c-88b3-42e4-b8cd-322579c29a3e-config-data\") pod \"8b44498c-88b3-42e4-b8cd-322579c29a3e\" (UID: \"8b44498c-88b3-42e4-b8cd-322579c29a3e\") " Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.631028 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42kkf\" (UniqueName: \"kubernetes.io/projected/8b44498c-88b3-42e4-b8cd-322579c29a3e-kube-api-access-42kkf\") pod \"8b44498c-88b3-42e4-b8cd-322579c29a3e\" (UID: \"8b44498c-88b3-42e4-b8cd-322579c29a3e\") " Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.631157 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b44498c-88b3-42e4-b8cd-322579c29a3e-combined-ca-bundle\") pod \"8b44498c-88b3-42e4-b8cd-322579c29a3e\" (UID: \"8b44498c-88b3-42e4-b8cd-322579c29a3e\") " Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.634922 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/8b44498c-88b3-42e4-b8cd-322579c29a3e-logs" (OuterVolumeSpecName: "logs") pod "8b44498c-88b3-42e4-b8cd-322579c29a3e" (UID: "8b44498c-88b3-42e4-b8cd-322579c29a3e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.646582 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b44498c-88b3-42e4-b8cd-322579c29a3e-kube-api-access-42kkf" (OuterVolumeSpecName: "kube-api-access-42kkf") pod "8b44498c-88b3-42e4-b8cd-322579c29a3e" (UID: "8b44498c-88b3-42e4-b8cd-322579c29a3e"). InnerVolumeSpecName "kube-api-access-42kkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.667922 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b44498c-88b3-42e4-b8cd-322579c29a3e-config-data" (OuterVolumeSpecName: "config-data") pod "8b44498c-88b3-42e4-b8cd-322579c29a3e" (UID: "8b44498c-88b3-42e4-b8cd-322579c29a3e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.681423 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b44498c-88b3-42e4-b8cd-322579c29a3e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b44498c-88b3-42e4-b8cd-322579c29a3e" (UID: "8b44498c-88b3-42e4-b8cd-322579c29a3e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.734160 4816 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b44498c-88b3-42e4-b8cd-322579c29a3e-logs\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.734194 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b44498c-88b3-42e4-b8cd-322579c29a3e-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.734205 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42kkf\" (UniqueName: \"kubernetes.io/projected/8b44498c-88b3-42e4-b8cd-322579c29a3e-kube-api-access-42kkf\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.734216 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b44498c-88b3-42e4-b8cd-322579c29a3e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.808679 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-wsfdf" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.073056 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"389a1019-c47b-449b-ac46-f0271ba70c0b","Type":"ContainerStarted","Data":"189218a0b9eca174d0a87d53dc63a64ae6c4741afd3bf140d2544540d81d6125"} Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.077389 4816 generic.go:334] "Generic (PLEG): container finished" podID="8b44498c-88b3-42e4-b8cd-322579c29a3e" containerID="e810fb5b31af6c475fbc24ea8c5aeb1d4e563c63735517ab9297f7db17e0ca5a" exitCode=0 Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.077505 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.077510 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8b44498c-88b3-42e4-b8cd-322579c29a3e","Type":"ContainerDied","Data":"e810fb5b31af6c475fbc24ea8c5aeb1d4e563c63735517ab9297f7db17e0ca5a"} Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.077586 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8b44498c-88b3-42e4-b8cd-322579c29a3e","Type":"ContainerDied","Data":"3161af8bf1555f75f7e3fe8c5b6c7028f30e608b5088c5375d09c6a61566d4c9"} Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.077613 4816 scope.go:117] "RemoveContainer" containerID="e810fb5b31af6c475fbc24ea8c5aeb1d4e563c63735517ab9297f7db17e0ca5a" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.120388 4816 scope.go:117] "RemoveContainer" containerID="ebc1320569afb95200b6afc79a82f2129b48fb5a36a78a5a136bc96434e738f9" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.145343 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.174905 4816 scope.go:117] "RemoveContainer" containerID="e810fb5b31af6c475fbc24ea8c5aeb1d4e563c63735517ab9297f7db17e0ca5a" Mar 11 12:21:22 crc kubenswrapper[4816]: E0311 12:21:21.175467 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e810fb5b31af6c475fbc24ea8c5aeb1d4e563c63735517ab9297f7db17e0ca5a\": container with ID starting with e810fb5b31af6c475fbc24ea8c5aeb1d4e563c63735517ab9297f7db17e0ca5a not found: ID does not exist" containerID="e810fb5b31af6c475fbc24ea8c5aeb1d4e563c63735517ab9297f7db17e0ca5a" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.175521 4816 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e810fb5b31af6c475fbc24ea8c5aeb1d4e563c63735517ab9297f7db17e0ca5a"} err="failed to get container status \"e810fb5b31af6c475fbc24ea8c5aeb1d4e563c63735517ab9297f7db17e0ca5a\": rpc error: code = NotFound desc = could not find container \"e810fb5b31af6c475fbc24ea8c5aeb1d4e563c63735517ab9297f7db17e0ca5a\": container with ID starting with e810fb5b31af6c475fbc24ea8c5aeb1d4e563c63735517ab9297f7db17e0ca5a not found: ID does not exist" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.175566 4816 scope.go:117] "RemoveContainer" containerID="ebc1320569afb95200b6afc79a82f2129b48fb5a36a78a5a136bc96434e738f9" Mar 11 12:21:22 crc kubenswrapper[4816]: E0311 12:21:21.175874 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebc1320569afb95200b6afc79a82f2129b48fb5a36a78a5a136bc96434e738f9\": container with ID starting with ebc1320569afb95200b6afc79a82f2129b48fb5a36a78a5a136bc96434e738f9 not found: ID does not exist" containerID="ebc1320569afb95200b6afc79a82f2129b48fb5a36a78a5a136bc96434e738f9" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.175889 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebc1320569afb95200b6afc79a82f2129b48fb5a36a78a5a136bc96434e738f9"} err="failed to get container status \"ebc1320569afb95200b6afc79a82f2129b48fb5a36a78a5a136bc96434e738f9\": rpc error: code = NotFound desc = could not find container \"ebc1320569afb95200b6afc79a82f2129b48fb5a36a78a5a136bc96434e738f9\": container with ID starting with ebc1320569afb95200b6afc79a82f2129b48fb5a36a78a5a136bc96434e738f9 not found: ID does not exist" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.176712 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.187354 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 11 
12:21:22 crc kubenswrapper[4816]: E0311 12:21:21.187971 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b44498c-88b3-42e4-b8cd-322579c29a3e" containerName="nova-api-api" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.187991 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b44498c-88b3-42e4-b8cd-322579c29a3e" containerName="nova-api-api" Mar 11 12:21:22 crc kubenswrapper[4816]: E0311 12:21:21.188033 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b44498c-88b3-42e4-b8cd-322579c29a3e" containerName="nova-api-log" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.188040 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b44498c-88b3-42e4-b8cd-322579c29a3e" containerName="nova-api-log" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.188287 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b44498c-88b3-42e4-b8cd-322579c29a3e" containerName="nova-api-api" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.188308 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b44498c-88b3-42e4-b8cd-322579c29a3e" containerName="nova-api-log" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.189553 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.195342 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.245727 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8dh5\" (UniqueName: \"kubernetes.io/projected/7279e91c-fd54-4a52-a247-c5e38a231907-kube-api-access-w8dh5\") pod \"nova-api-0\" (UID: \"7279e91c-fd54-4a52-a247-c5e38a231907\") " pod="openstack/nova-api-0" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.245783 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7279e91c-fd54-4a52-a247-c5e38a231907-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7279e91c-fd54-4a52-a247-c5e38a231907\") " pod="openstack/nova-api-0" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.245813 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7279e91c-fd54-4a52-a247-c5e38a231907-logs\") pod \"nova-api-0\" (UID: \"7279e91c-fd54-4a52-a247-c5e38a231907\") " pod="openstack/nova-api-0" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.245840 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7279e91c-fd54-4a52-a247-c5e38a231907-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7279e91c-fd54-4a52-a247-c5e38a231907\") " pod="openstack/nova-api-0" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.246038 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7279e91c-fd54-4a52-a247-c5e38a231907-config-data\") pod \"nova-api-0\" (UID: 
\"7279e91c-fd54-4a52-a247-c5e38a231907\") " pod="openstack/nova-api-0" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.246074 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7279e91c-fd54-4a52-a247-c5e38a231907-public-tls-certs\") pod \"nova-api-0\" (UID: \"7279e91c-fd54-4a52-a247-c5e38a231907\") " pod="openstack/nova-api-0" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.348653 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8dh5\" (UniqueName: \"kubernetes.io/projected/7279e91c-fd54-4a52-a247-c5e38a231907-kube-api-access-w8dh5\") pod \"nova-api-0\" (UID: \"7279e91c-fd54-4a52-a247-c5e38a231907\") " pod="openstack/nova-api-0" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.349152 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7279e91c-fd54-4a52-a247-c5e38a231907-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7279e91c-fd54-4a52-a247-c5e38a231907\") " pod="openstack/nova-api-0" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.349183 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7279e91c-fd54-4a52-a247-c5e38a231907-logs\") pod \"nova-api-0\" (UID: \"7279e91c-fd54-4a52-a247-c5e38a231907\") " pod="openstack/nova-api-0" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.349208 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7279e91c-fd54-4a52-a247-c5e38a231907-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7279e91c-fd54-4a52-a247-c5e38a231907\") " pod="openstack/nova-api-0" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.349281 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7279e91c-fd54-4a52-a247-c5e38a231907-config-data\") pod \"nova-api-0\" (UID: \"7279e91c-fd54-4a52-a247-c5e38a231907\") " pod="openstack/nova-api-0" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.349316 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7279e91c-fd54-4a52-a247-c5e38a231907-public-tls-certs\") pod \"nova-api-0\" (UID: \"7279e91c-fd54-4a52-a247-c5e38a231907\") " pod="openstack/nova-api-0" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.350510 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7279e91c-fd54-4a52-a247-c5e38a231907-logs\") pod \"nova-api-0\" (UID: \"7279e91c-fd54-4a52-a247-c5e38a231907\") " pod="openstack/nova-api-0" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.363730 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7279e91c-fd54-4a52-a247-c5e38a231907-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7279e91c-fd54-4a52-a247-c5e38a231907\") " pod="openstack/nova-api-0" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.381945 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.382240 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.382409 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.393740 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7279e91c-fd54-4a52-a247-c5e38a231907-public-tls-certs\") pod 
\"nova-api-0\" (UID: \"7279e91c-fd54-4a52-a247-c5e38a231907\") " pod="openstack/nova-api-0" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.395861 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7279e91c-fd54-4a52-a247-c5e38a231907-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7279e91c-fd54-4a52-a247-c5e38a231907\") " pod="openstack/nova-api-0" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.396333 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7279e91c-fd54-4a52-a247-c5e38a231907-config-data\") pod \"nova-api-0\" (UID: \"7279e91c-fd54-4a52-a247-c5e38a231907\") " pod="openstack/nova-api-0" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.416918 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8dh5\" (UniqueName: \"kubernetes.io/projected/7279e91c-fd54-4a52-a247-c5e38a231907-kube-api-access-w8dh5\") pod \"nova-api-0\" (UID: \"7279e91c-fd54-4a52-a247-c5e38a231907\") " pod="openstack/nova-api-0" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.462855 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-wsfdf"] Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.507928 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:22.103392 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-wsfdf" event={"ID":"36fadc66-c846-46c0-a002-efeb7656f2b8","Type":"ContainerStarted","Data":"adf484d20700d25957189d351eb669acaae4683a20326267761afe30c6a7e50c"} Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:22.103907 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-wsfdf" event={"ID":"36fadc66-c846-46c0-a002-efeb7656f2b8","Type":"ContainerStarted","Data":"f1ca638575d3d6823fa339abfb04a9bb46bfeaa2c8671cd04523b6370d4416be"} Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:22.126533 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-wsfdf" podStartSLOduration=2.12651104 podStartE2EDuration="2.12651104s" podCreationTimestamp="2026-03-11 12:21:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:21:22.124447441 +0000 UTC m=+1368.715711408" watchObservedRunningTime="2026-03-11 12:21:22.12651104 +0000 UTC m=+1368.717775007" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:22.145591 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b44498c-88b3-42e4-b8cd-322579c29a3e" path="/var/lib/kubelet/pods/8b44498c-88b3-42e4-b8cd-322579c29a3e/volumes" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:22.501071 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 11 12:21:23 crc kubenswrapper[4816]: I0311 12:21:23.127349 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"389a1019-c47b-449b-ac46-f0271ba70c0b","Type":"ContainerStarted","Data":"d506ef0b1a20cb0137c7e713819501bcf9a1c0dd99e6ec71affc5c1084fb1441"} Mar 11 12:21:23 crc kubenswrapper[4816]: I0311 
12:21:23.128175 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="389a1019-c47b-449b-ac46-f0271ba70c0b" containerName="ceilometer-central-agent" containerID="cri-o://86830a024f06c2328c22d5b921acb795fe3b73a3eede1e8d875dfee3806bd2ea" gracePeriod=30 Mar 11 12:21:23 crc kubenswrapper[4816]: I0311 12:21:23.128374 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="389a1019-c47b-449b-ac46-f0271ba70c0b" containerName="proxy-httpd" containerID="cri-o://d506ef0b1a20cb0137c7e713819501bcf9a1c0dd99e6ec71affc5c1084fb1441" gracePeriod=30 Mar 11 12:21:23 crc kubenswrapper[4816]: I0311 12:21:23.128409 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="389a1019-c47b-449b-ac46-f0271ba70c0b" containerName="ceilometer-notification-agent" containerID="cri-o://3c33e4e96ad95d72d477f29cdd83ed17043f7147e78c943eda376d648d31d9b9" gracePeriod=30 Mar 11 12:21:23 crc kubenswrapper[4816]: I0311 12:21:23.128491 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="389a1019-c47b-449b-ac46-f0271ba70c0b" containerName="sg-core" containerID="cri-o://189218a0b9eca174d0a87d53dc63a64ae6c4741afd3bf140d2544540d81d6125" gracePeriod=30 Mar 11 12:21:23 crc kubenswrapper[4816]: I0311 12:21:23.128579 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 11 12:21:23 crc kubenswrapper[4816]: I0311 12:21:23.134771 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7279e91c-fd54-4a52-a247-c5e38a231907","Type":"ContainerStarted","Data":"fa6e70cb872c0bb963f57dccb1e8cfa3a80de411767619fafe0156ebe1500381"} Mar 11 12:21:23 crc kubenswrapper[4816]: I0311 12:21:23.134855 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"7279e91c-fd54-4a52-a247-c5e38a231907","Type":"ContainerStarted","Data":"744232eb02ed0a62b4c386367a237348d207c0352aee779d03b38dd46cecc95a"} Mar 11 12:21:23 crc kubenswrapper[4816]: I0311 12:21:23.134878 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7279e91c-fd54-4a52-a247-c5e38a231907","Type":"ContainerStarted","Data":"1479f433dd0af53602fa7b4358ac16a8893fc6ce4f3fc3758931ae0187bafc3e"} Mar 11 12:21:23 crc kubenswrapper[4816]: I0311 12:21:23.174542 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.851458799 podStartE2EDuration="6.174513366s" podCreationTimestamp="2026-03-11 12:21:17 +0000 UTC" firstStartedPulling="2026-03-11 12:21:18.266427899 +0000 UTC m=+1364.857691866" lastFinishedPulling="2026-03-11 12:21:22.589482466 +0000 UTC m=+1369.180746433" observedRunningTime="2026-03-11 12:21:23.158831786 +0000 UTC m=+1369.750095773" watchObservedRunningTime="2026-03-11 12:21:23.174513366 +0000 UTC m=+1369.765777343" Mar 11 12:21:23 crc kubenswrapper[4816]: I0311 12:21:23.196701 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.1966701730000002 podStartE2EDuration="2.196670173s" podCreationTimestamp="2026-03-11 12:21:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:21:23.188274292 +0000 UTC m=+1369.779538249" watchObservedRunningTime="2026-03-11 12:21:23.196670173 +0000 UTC m=+1369.787934140" Mar 11 12:21:23 crc kubenswrapper[4816]: I0311 12:21:23.545695 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" Mar 11 12:21:23 crc kubenswrapper[4816]: I0311 12:21:23.634094 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69b4446475-bsnbn"] Mar 11 12:21:23 crc 
kubenswrapper[4816]: I0311 12:21:23.635015 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-69b4446475-bsnbn" podUID="1370549e-42a3-450d-a28d-47d4a0764f56" containerName="dnsmasq-dns" containerID="cri-o://c5bd6af011971fbc8f1597775b66953c97da9697d9d038a8ec30a1201d7a28f6" gracePeriod=10 Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.152802 4816 generic.go:334] "Generic (PLEG): container finished" podID="1370549e-42a3-450d-a28d-47d4a0764f56" containerID="c5bd6af011971fbc8f1597775b66953c97da9697d9d038a8ec30a1201d7a28f6" exitCode=0 Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.166683 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69b4446475-bsnbn" event={"ID":"1370549e-42a3-450d-a28d-47d4a0764f56","Type":"ContainerDied","Data":"c5bd6af011971fbc8f1597775b66953c97da9697d9d038a8ec30a1201d7a28f6"} Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.166741 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69b4446475-bsnbn" event={"ID":"1370549e-42a3-450d-a28d-47d4a0764f56","Type":"ContainerDied","Data":"31af9c3f8588e04ef95c2018485a2e29382e231f1e73609996971ecefe64ea0d"} Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.166757 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31af9c3f8588e04ef95c2018485a2e29382e231f1e73609996971ecefe64ea0d" Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.175470 4816 generic.go:334] "Generic (PLEG): container finished" podID="389a1019-c47b-449b-ac46-f0271ba70c0b" containerID="d506ef0b1a20cb0137c7e713819501bcf9a1c0dd99e6ec71affc5c1084fb1441" exitCode=0 Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.175515 4816 generic.go:334] "Generic (PLEG): container finished" podID="389a1019-c47b-449b-ac46-f0271ba70c0b" containerID="189218a0b9eca174d0a87d53dc63a64ae6c4741afd3bf140d2544540d81d6125" exitCode=2 Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 
12:21:24.175526 4816 generic.go:334] "Generic (PLEG): container finished" podID="389a1019-c47b-449b-ac46-f0271ba70c0b" containerID="3c33e4e96ad95d72d477f29cdd83ed17043f7147e78c943eda376d648d31d9b9" exitCode=0 Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.175731 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"389a1019-c47b-449b-ac46-f0271ba70c0b","Type":"ContainerDied","Data":"d506ef0b1a20cb0137c7e713819501bcf9a1c0dd99e6ec71affc5c1084fb1441"} Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.175801 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"389a1019-c47b-449b-ac46-f0271ba70c0b","Type":"ContainerDied","Data":"189218a0b9eca174d0a87d53dc63a64ae6c4741afd3bf140d2544540d81d6125"} Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.175814 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"389a1019-c47b-449b-ac46-f0271ba70c0b","Type":"ContainerDied","Data":"3c33e4e96ad95d72d477f29cdd83ed17043f7147e78c943eda376d648d31d9b9"} Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.216842 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69b4446475-bsnbn" Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.343602 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-ovsdbserver-nb\") pod \"1370549e-42a3-450d-a28d-47d4a0764f56\" (UID: \"1370549e-42a3-450d-a28d-47d4a0764f56\") " Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.344113 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-config\") pod \"1370549e-42a3-450d-a28d-47d4a0764f56\" (UID: \"1370549e-42a3-450d-a28d-47d4a0764f56\") " Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.344299 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-dns-swift-storage-0\") pod \"1370549e-42a3-450d-a28d-47d4a0764f56\" (UID: \"1370549e-42a3-450d-a28d-47d4a0764f56\") " Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.344323 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-dns-svc\") pod \"1370549e-42a3-450d-a28d-47d4a0764f56\" (UID: \"1370549e-42a3-450d-a28d-47d4a0764f56\") " Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.344371 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgzdl\" (UniqueName: \"kubernetes.io/projected/1370549e-42a3-450d-a28d-47d4a0764f56-kube-api-access-jgzdl\") pod \"1370549e-42a3-450d-a28d-47d4a0764f56\" (UID: \"1370549e-42a3-450d-a28d-47d4a0764f56\") " Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.344444 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-ovsdbserver-sb\") pod \"1370549e-42a3-450d-a28d-47d4a0764f56\" (UID: \"1370549e-42a3-450d-a28d-47d4a0764f56\") " Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.364481 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1370549e-42a3-450d-a28d-47d4a0764f56-kube-api-access-jgzdl" (OuterVolumeSpecName: "kube-api-access-jgzdl") pod "1370549e-42a3-450d-a28d-47d4a0764f56" (UID: "1370549e-42a3-450d-a28d-47d4a0764f56"). InnerVolumeSpecName "kube-api-access-jgzdl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.407934 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1370549e-42a3-450d-a28d-47d4a0764f56" (UID: "1370549e-42a3-450d-a28d-47d4a0764f56"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.425617 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1370549e-42a3-450d-a28d-47d4a0764f56" (UID: "1370549e-42a3-450d-a28d-47d4a0764f56"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.431097 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-config" (OuterVolumeSpecName: "config") pod "1370549e-42a3-450d-a28d-47d4a0764f56" (UID: "1370549e-42a3-450d-a28d-47d4a0764f56"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.441657 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1370549e-42a3-450d-a28d-47d4a0764f56" (UID: "1370549e-42a3-450d-a28d-47d4a0764f56"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.447118 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.447242 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.447309 4816 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.447375 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgzdl\" (UniqueName: \"kubernetes.io/projected/1370549e-42a3-450d-a28d-47d4a0764f56-kube-api-access-jgzdl\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.447426 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.460183 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1370549e-42a3-450d-a28d-47d4a0764f56" (UID: "1370549e-42a3-450d-a28d-47d4a0764f56"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.550043 4816 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:25 crc kubenswrapper[4816]: I0311 12:21:25.187060 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69b4446475-bsnbn" Mar 11 12:21:25 crc kubenswrapper[4816]: I0311 12:21:25.234348 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69b4446475-bsnbn"] Mar 11 12:21:25 crc kubenswrapper[4816]: I0311 12:21:25.248203 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-69b4446475-bsnbn"] Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.159850 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1370549e-42a3-450d-a28d-47d4a0764f56" path="/var/lib/kubelet/pods/1370549e-42a3-450d-a28d-47d4a0764f56/volumes" Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.207870 4816 generic.go:334] "Generic (PLEG): container finished" podID="389a1019-c47b-449b-ac46-f0271ba70c0b" containerID="86830a024f06c2328c22d5b921acb795fe3b73a3eede1e8d875dfee3806bd2ea" exitCode=0 Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.208107 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"389a1019-c47b-449b-ac46-f0271ba70c0b","Type":"ContainerDied","Data":"86830a024f06c2328c22d5b921acb795fe3b73a3eede1e8d875dfee3806bd2ea"} Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.513745 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.596394 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/389a1019-c47b-449b-ac46-f0271ba70c0b-log-httpd\") pod \"389a1019-c47b-449b-ac46-f0271ba70c0b\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.596489 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-combined-ca-bundle\") pod \"389a1019-c47b-449b-ac46-f0271ba70c0b\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.596562 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2hpm\" (UniqueName: \"kubernetes.io/projected/389a1019-c47b-449b-ac46-f0271ba70c0b-kube-api-access-h2hpm\") pod \"389a1019-c47b-449b-ac46-f0271ba70c0b\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.596734 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/389a1019-c47b-449b-ac46-f0271ba70c0b-run-httpd\") pod \"389a1019-c47b-449b-ac46-f0271ba70c0b\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.596784 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-config-data\") pod \"389a1019-c47b-449b-ac46-f0271ba70c0b\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.596819 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-sg-core-conf-yaml\") pod \"389a1019-c47b-449b-ac46-f0271ba70c0b\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.596851 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-scripts\") pod \"389a1019-c47b-449b-ac46-f0271ba70c0b\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.596879 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-ceilometer-tls-certs\") pod \"389a1019-c47b-449b-ac46-f0271ba70c0b\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.597938 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/389a1019-c47b-449b-ac46-f0271ba70c0b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "389a1019-c47b-449b-ac46-f0271ba70c0b" (UID: "389a1019-c47b-449b-ac46-f0271ba70c0b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.598171 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/389a1019-c47b-449b-ac46-f0271ba70c0b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "389a1019-c47b-449b-ac46-f0271ba70c0b" (UID: "389a1019-c47b-449b-ac46-f0271ba70c0b"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.608328 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/389a1019-c47b-449b-ac46-f0271ba70c0b-kube-api-access-h2hpm" (OuterVolumeSpecName: "kube-api-access-h2hpm") pod "389a1019-c47b-449b-ac46-f0271ba70c0b" (UID: "389a1019-c47b-449b-ac46-f0271ba70c0b"). InnerVolumeSpecName "kube-api-access-h2hpm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.616106 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-scripts" (OuterVolumeSpecName: "scripts") pod "389a1019-c47b-449b-ac46-f0271ba70c0b" (UID: "389a1019-c47b-449b-ac46-f0271ba70c0b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.633827 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "389a1019-c47b-449b-ac46-f0271ba70c0b" (UID: "389a1019-c47b-449b-ac46-f0271ba70c0b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.659437 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "389a1019-c47b-449b-ac46-f0271ba70c0b" (UID: "389a1019-c47b-449b-ac46-f0271ba70c0b"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.685783 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "389a1019-c47b-449b-ac46-f0271ba70c0b" (UID: "389a1019-c47b-449b-ac46-f0271ba70c0b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.699862 4816 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/389a1019-c47b-449b-ac46-f0271ba70c0b-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.699919 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.699943 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2hpm\" (UniqueName: \"kubernetes.io/projected/389a1019-c47b-449b-ac46-f0271ba70c0b-kube-api-access-h2hpm\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.699980 4816 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/389a1019-c47b-449b-ac46-f0271ba70c0b-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.699999 4816 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.700018 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.700034 4816 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.723540 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-config-data" (OuterVolumeSpecName: "config-data") pod "389a1019-c47b-449b-ac46-f0271ba70c0b" (UID: "389a1019-c47b-449b-ac46-f0271ba70c0b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.802188 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.226285 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"389a1019-c47b-449b-ac46-f0271ba70c0b","Type":"ContainerDied","Data":"831f866c847ed0c4a4e75849b87d63d375222a7a188ecc44b5169bd7010ae778"} Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.226773 4816 scope.go:117] "RemoveContainer" containerID="d506ef0b1a20cb0137c7e713819501bcf9a1c0dd99e6ec71affc5c1084fb1441" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.226377 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.264538 4816 scope.go:117] "RemoveContainer" containerID="189218a0b9eca174d0a87d53dc63a64ae6c4741afd3bf140d2544540d81d6125" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.290068 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.303267 4816 scope.go:117] "RemoveContainer" containerID="3c33e4e96ad95d72d477f29cdd83ed17043f7147e78c943eda376d648d31d9b9" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.305376 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.324030 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:21:27 crc kubenswrapper[4816]: E0311 12:21:27.324652 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="389a1019-c47b-449b-ac46-f0271ba70c0b" containerName="proxy-httpd" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.324679 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="389a1019-c47b-449b-ac46-f0271ba70c0b" containerName="proxy-httpd" Mar 11 12:21:27 crc kubenswrapper[4816]: E0311 12:21:27.324708 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="389a1019-c47b-449b-ac46-f0271ba70c0b" containerName="ceilometer-notification-agent" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.324718 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="389a1019-c47b-449b-ac46-f0271ba70c0b" containerName="ceilometer-notification-agent" Mar 11 12:21:27 crc kubenswrapper[4816]: E0311 12:21:27.324730 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1370549e-42a3-450d-a28d-47d4a0764f56" containerName="dnsmasq-dns" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.324739 4816 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1370549e-42a3-450d-a28d-47d4a0764f56" containerName="dnsmasq-dns" Mar 11 12:21:27 crc kubenswrapper[4816]: E0311 12:21:27.324763 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="389a1019-c47b-449b-ac46-f0271ba70c0b" containerName="ceilometer-central-agent" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.324771 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="389a1019-c47b-449b-ac46-f0271ba70c0b" containerName="ceilometer-central-agent" Mar 11 12:21:27 crc kubenswrapper[4816]: E0311 12:21:27.324800 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="389a1019-c47b-449b-ac46-f0271ba70c0b" containerName="sg-core" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.324808 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="389a1019-c47b-449b-ac46-f0271ba70c0b" containerName="sg-core" Mar 11 12:21:27 crc kubenswrapper[4816]: E0311 12:21:27.324825 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1370549e-42a3-450d-a28d-47d4a0764f56" containerName="init" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.324833 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1370549e-42a3-450d-a28d-47d4a0764f56" containerName="init" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.325030 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="389a1019-c47b-449b-ac46-f0271ba70c0b" containerName="proxy-httpd" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.325052 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="1370549e-42a3-450d-a28d-47d4a0764f56" containerName="dnsmasq-dns" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.325070 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="389a1019-c47b-449b-ac46-f0271ba70c0b" containerName="ceilometer-notification-agent" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.325080 4816 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="389a1019-c47b-449b-ac46-f0271ba70c0b" containerName="ceilometer-central-agent" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.325099 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="389a1019-c47b-449b-ac46-f0271ba70c0b" containerName="sg-core" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.327778 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.330564 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.333767 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.337446 4816 scope.go:117] "RemoveContainer" containerID="86830a024f06c2328c22d5b921acb795fe3b73a3eede1e8d875dfee3806bd2ea" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.340763 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.363958 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.417268 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-config-data\") pod \"ceilometer-0\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " pod="openstack/ceilometer-0" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.417348 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " 
pod="openstack/ceilometer-0" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.417379 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-scripts\") pod \"ceilometer-0\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " pod="openstack/ceilometer-0" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.417396 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bedb612d-0e22-4025-9151-d0cf7bc4ee42-log-httpd\") pod \"ceilometer-0\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " pod="openstack/ceilometer-0" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.417430 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5r6l\" (UniqueName: \"kubernetes.io/projected/bedb612d-0e22-4025-9151-d0cf7bc4ee42-kube-api-access-g5r6l\") pod \"ceilometer-0\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " pod="openstack/ceilometer-0" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.417466 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bedb612d-0e22-4025-9151-d0cf7bc4ee42-run-httpd\") pod \"ceilometer-0\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " pod="openstack/ceilometer-0" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.417896 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " pod="openstack/ceilometer-0" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.417995 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " pod="openstack/ceilometer-0" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.519624 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bedb612d-0e22-4025-9151-d0cf7bc4ee42-run-httpd\") pod \"ceilometer-0\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " pod="openstack/ceilometer-0" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.519771 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " pod="openstack/ceilometer-0" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.519805 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " pod="openstack/ceilometer-0" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.519827 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-config-data\") pod \"ceilometer-0\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " pod="openstack/ceilometer-0" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.519862 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-sg-core-conf-yaml\") pod \"ceilometer-0\" 
(UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " pod="openstack/ceilometer-0" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.519883 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-scripts\") pod \"ceilometer-0\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " pod="openstack/ceilometer-0" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.519899 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bedb612d-0e22-4025-9151-d0cf7bc4ee42-log-httpd\") pod \"ceilometer-0\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " pod="openstack/ceilometer-0" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.519929 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5r6l\" (UniqueName: \"kubernetes.io/projected/bedb612d-0e22-4025-9151-d0cf7bc4ee42-kube-api-access-g5r6l\") pod \"ceilometer-0\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " pod="openstack/ceilometer-0" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.521471 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bedb612d-0e22-4025-9151-d0cf7bc4ee42-run-httpd\") pod \"ceilometer-0\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " pod="openstack/ceilometer-0" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.521494 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bedb612d-0e22-4025-9151-d0cf7bc4ee42-log-httpd\") pod \"ceilometer-0\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " pod="openstack/ceilometer-0" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.525921 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " pod="openstack/ceilometer-0" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.525923 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " pod="openstack/ceilometer-0" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.527626 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-config-data\") pod \"ceilometer-0\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " pod="openstack/ceilometer-0" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.527645 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " pod="openstack/ceilometer-0" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.541605 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-scripts\") pod \"ceilometer-0\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " pod="openstack/ceilometer-0" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.543419 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5r6l\" (UniqueName: \"kubernetes.io/projected/bedb612d-0e22-4025-9151-d0cf7bc4ee42-kube-api-access-g5r6l\") pod \"ceilometer-0\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " pod="openstack/ceilometer-0" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.650419 4816 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:21:28 crc kubenswrapper[4816]: I0311 12:21:28.159923 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="389a1019-c47b-449b-ac46-f0271ba70c0b" path="/var/lib/kubelet/pods/389a1019-c47b-449b-ac46-f0271ba70c0b/volumes" Mar 11 12:21:28 crc kubenswrapper[4816]: I0311 12:21:28.163845 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:21:28 crc kubenswrapper[4816]: I0311 12:21:28.244990 4816 generic.go:334] "Generic (PLEG): container finished" podID="36fadc66-c846-46c0-a002-efeb7656f2b8" containerID="adf484d20700d25957189d351eb669acaae4683a20326267761afe30c6a7e50c" exitCode=0 Mar 11 12:21:28 crc kubenswrapper[4816]: I0311 12:21:28.245303 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-wsfdf" event={"ID":"36fadc66-c846-46c0-a002-efeb7656f2b8","Type":"ContainerDied","Data":"adf484d20700d25957189d351eb669acaae4683a20326267761afe30c6a7e50c"} Mar 11 12:21:28 crc kubenswrapper[4816]: I0311 12:21:28.247798 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bedb612d-0e22-4025-9151-d0cf7bc4ee42","Type":"ContainerStarted","Data":"c1f12afb3ed2335d5b28ac089b50b4a7d4f0e38f3d3c1e7e1f537108eabd58b9"} Mar 11 12:21:29 crc kubenswrapper[4816]: I0311 12:21:29.272049 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bedb612d-0e22-4025-9151-d0cf7bc4ee42","Type":"ContainerStarted","Data":"84c64e2c11b5a33088d3e50d684b62246b9937fb898429fa525cc6fb739d9015"} Mar 11 12:21:29 crc kubenswrapper[4816]: I0311 12:21:29.720027 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-wsfdf" Mar 11 12:21:29 crc kubenswrapper[4816]: I0311 12:21:29.785030 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7hxl\" (UniqueName: \"kubernetes.io/projected/36fadc66-c846-46c0-a002-efeb7656f2b8-kube-api-access-v7hxl\") pod \"36fadc66-c846-46c0-a002-efeb7656f2b8\" (UID: \"36fadc66-c846-46c0-a002-efeb7656f2b8\") " Mar 11 12:21:29 crc kubenswrapper[4816]: I0311 12:21:29.785302 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36fadc66-c846-46c0-a002-efeb7656f2b8-combined-ca-bundle\") pod \"36fadc66-c846-46c0-a002-efeb7656f2b8\" (UID: \"36fadc66-c846-46c0-a002-efeb7656f2b8\") " Mar 11 12:21:29 crc kubenswrapper[4816]: I0311 12:21:29.785339 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36fadc66-c846-46c0-a002-efeb7656f2b8-config-data\") pod \"36fadc66-c846-46c0-a002-efeb7656f2b8\" (UID: \"36fadc66-c846-46c0-a002-efeb7656f2b8\") " Mar 11 12:21:29 crc kubenswrapper[4816]: I0311 12:21:29.785372 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36fadc66-c846-46c0-a002-efeb7656f2b8-scripts\") pod \"36fadc66-c846-46c0-a002-efeb7656f2b8\" (UID: \"36fadc66-c846-46c0-a002-efeb7656f2b8\") " Mar 11 12:21:29 crc kubenswrapper[4816]: I0311 12:21:29.791234 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36fadc66-c846-46c0-a002-efeb7656f2b8-kube-api-access-v7hxl" (OuterVolumeSpecName: "kube-api-access-v7hxl") pod "36fadc66-c846-46c0-a002-efeb7656f2b8" (UID: "36fadc66-c846-46c0-a002-efeb7656f2b8"). InnerVolumeSpecName "kube-api-access-v7hxl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:21:29 crc kubenswrapper[4816]: I0311 12:21:29.814896 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36fadc66-c846-46c0-a002-efeb7656f2b8-scripts" (OuterVolumeSpecName: "scripts") pod "36fadc66-c846-46c0-a002-efeb7656f2b8" (UID: "36fadc66-c846-46c0-a002-efeb7656f2b8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:21:29 crc kubenswrapper[4816]: I0311 12:21:29.846379 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36fadc66-c846-46c0-a002-efeb7656f2b8-config-data" (OuterVolumeSpecName: "config-data") pod "36fadc66-c846-46c0-a002-efeb7656f2b8" (UID: "36fadc66-c846-46c0-a002-efeb7656f2b8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:21:29 crc kubenswrapper[4816]: I0311 12:21:29.846488 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36fadc66-c846-46c0-a002-efeb7656f2b8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "36fadc66-c846-46c0-a002-efeb7656f2b8" (UID: "36fadc66-c846-46c0-a002-efeb7656f2b8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:21:29 crc kubenswrapper[4816]: I0311 12:21:29.888045 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36fadc66-c846-46c0-a002-efeb7656f2b8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:29 crc kubenswrapper[4816]: I0311 12:21:29.888086 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36fadc66-c846-46c0-a002-efeb7656f2b8-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:29 crc kubenswrapper[4816]: I0311 12:21:29.888101 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36fadc66-c846-46c0-a002-efeb7656f2b8-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:29 crc kubenswrapper[4816]: I0311 12:21:29.888113 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7hxl\" (UniqueName: \"kubernetes.io/projected/36fadc66-c846-46c0-a002-efeb7656f2b8-kube-api-access-v7hxl\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:30 crc kubenswrapper[4816]: I0311 12:21:30.294899 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bedb612d-0e22-4025-9151-d0cf7bc4ee42","Type":"ContainerStarted","Data":"a8b3b1241d87a2bc94cda4c45011262eeb879b9fb212362f754599d92ce27242"} Mar 11 12:21:30 crc kubenswrapper[4816]: I0311 12:21:30.297232 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-wsfdf" event={"ID":"36fadc66-c846-46c0-a002-efeb7656f2b8","Type":"ContainerDied","Data":"f1ca638575d3d6823fa339abfb04a9bb46bfeaa2c8671cd04523b6370d4416be"} Mar 11 12:21:30 crc kubenswrapper[4816]: I0311 12:21:30.297284 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1ca638575d3d6823fa339abfb04a9bb46bfeaa2c8671cd04523b6370d4416be" Mar 11 12:21:30 crc kubenswrapper[4816]: I0311 
12:21:30.297376 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-wsfdf" Mar 11 12:21:30 crc kubenswrapper[4816]: I0311 12:21:30.454144 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 11 12:21:30 crc kubenswrapper[4816]: I0311 12:21:30.454504 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7279e91c-fd54-4a52-a247-c5e38a231907" containerName="nova-api-log" containerID="cri-o://744232eb02ed0a62b4c386367a237348d207c0352aee779d03b38dd46cecc95a" gracePeriod=30 Mar 11 12:21:30 crc kubenswrapper[4816]: I0311 12:21:30.454542 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7279e91c-fd54-4a52-a247-c5e38a231907" containerName="nova-api-api" containerID="cri-o://fa6e70cb872c0bb963f57dccb1e8cfa3a80de411767619fafe0156ebe1500381" gracePeriod=30 Mar 11 12:21:30 crc kubenswrapper[4816]: I0311 12:21:30.479630 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 11 12:21:30 crc kubenswrapper[4816]: I0311 12:21:30.482417 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="49c3f447-334e-4147-b877-22a0ce6e3345" containerName="nova-scheduler-scheduler" containerID="cri-o://4ce5b26a7642dbb3c4b2d4c21f23040b0afe51a33212a23837a87602f659ac7d" gracePeriod=30 Mar 11 12:21:30 crc kubenswrapper[4816]: I0311 12:21:30.596318 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 12:21:30 crc kubenswrapper[4816]: I0311 12:21:30.597167 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3b8751c6-ef60-400a-b4e3-0042d63c2d83" containerName="nova-metadata-log" containerID="cri-o://8e786968694e08cb0fddd905eefb0efe274d795cc622741f10ed840a98693097" gracePeriod=30 Mar 11 
12:21:30 crc kubenswrapper[4816]: I0311 12:21:30.597411 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3b8751c6-ef60-400a-b4e3-0042d63c2d83" containerName="nova-metadata-metadata" containerID="cri-o://65a6f4699baa07bb2587be04129499415c2d2f177b58bb5a282876ee282e965a" gracePeriod=30 Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.158782 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.215897 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7279e91c-fd54-4a52-a247-c5e38a231907-public-tls-certs\") pod \"7279e91c-fd54-4a52-a247-c5e38a231907\" (UID: \"7279e91c-fd54-4a52-a247-c5e38a231907\") " Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.215988 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7279e91c-fd54-4a52-a247-c5e38a231907-internal-tls-certs\") pod \"7279e91c-fd54-4a52-a247-c5e38a231907\" (UID: \"7279e91c-fd54-4a52-a247-c5e38a231907\") " Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.216130 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7279e91c-fd54-4a52-a247-c5e38a231907-config-data\") pod \"7279e91c-fd54-4a52-a247-c5e38a231907\" (UID: \"7279e91c-fd54-4a52-a247-c5e38a231907\") " Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.216213 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7279e91c-fd54-4a52-a247-c5e38a231907-logs\") pod \"7279e91c-fd54-4a52-a247-c5e38a231907\" (UID: \"7279e91c-fd54-4a52-a247-c5e38a231907\") " Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.216363 4816 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8dh5\" (UniqueName: \"kubernetes.io/projected/7279e91c-fd54-4a52-a247-c5e38a231907-kube-api-access-w8dh5\") pod \"7279e91c-fd54-4a52-a247-c5e38a231907\" (UID: \"7279e91c-fd54-4a52-a247-c5e38a231907\") " Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.216441 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7279e91c-fd54-4a52-a247-c5e38a231907-combined-ca-bundle\") pod \"7279e91c-fd54-4a52-a247-c5e38a231907\" (UID: \"7279e91c-fd54-4a52-a247-c5e38a231907\") " Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.219021 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7279e91c-fd54-4a52-a247-c5e38a231907-logs" (OuterVolumeSpecName: "logs") pod "7279e91c-fd54-4a52-a247-c5e38a231907" (UID: "7279e91c-fd54-4a52-a247-c5e38a231907"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.225843 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7279e91c-fd54-4a52-a247-c5e38a231907-kube-api-access-w8dh5" (OuterVolumeSpecName: "kube-api-access-w8dh5") pod "7279e91c-fd54-4a52-a247-c5e38a231907" (UID: "7279e91c-fd54-4a52-a247-c5e38a231907"). InnerVolumeSpecName "kube-api-access-w8dh5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.257683 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7279e91c-fd54-4a52-a247-c5e38a231907-config-data" (OuterVolumeSpecName: "config-data") pod "7279e91c-fd54-4a52-a247-c5e38a231907" (UID: "7279e91c-fd54-4a52-a247-c5e38a231907"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.259597 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7279e91c-fd54-4a52-a247-c5e38a231907-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7279e91c-fd54-4a52-a247-c5e38a231907" (UID: "7279e91c-fd54-4a52-a247-c5e38a231907"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.295507 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7279e91c-fd54-4a52-a247-c5e38a231907-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7279e91c-fd54-4a52-a247-c5e38a231907" (UID: "7279e91c-fd54-4a52-a247-c5e38a231907"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.297340 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7279e91c-fd54-4a52-a247-c5e38a231907-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7279e91c-fd54-4a52-a247-c5e38a231907" (UID: "7279e91c-fd54-4a52-a247-c5e38a231907"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.312303 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bedb612d-0e22-4025-9151-d0cf7bc4ee42","Type":"ContainerStarted","Data":"824fd644293ef663ba362cace1b788aa52143866b3de49d3b2f15202714957b5"} Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.319556 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7279e91c-fd54-4a52-a247-c5e38a231907-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.319914 4816 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7279e91c-fd54-4a52-a247-c5e38a231907-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.319993 4816 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7279e91c-fd54-4a52-a247-c5e38a231907-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.320113 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7279e91c-fd54-4a52-a247-c5e38a231907-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.320186 4816 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7279e91c-fd54-4a52-a247-c5e38a231907-logs\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.320419 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8dh5\" (UniqueName: \"kubernetes.io/projected/7279e91c-fd54-4a52-a247-c5e38a231907-kube-api-access-w8dh5\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:31 crc kubenswrapper[4816]: 
I0311 12:21:31.323120 4816 generic.go:334] "Generic (PLEG): container finished" podID="3b8751c6-ef60-400a-b4e3-0042d63c2d83" containerID="8e786968694e08cb0fddd905eefb0efe274d795cc622741f10ed840a98693097" exitCode=143 Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.323202 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3b8751c6-ef60-400a-b4e3-0042d63c2d83","Type":"ContainerDied","Data":"8e786968694e08cb0fddd905eefb0efe274d795cc622741f10ed840a98693097"} Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.328530 4816 generic.go:334] "Generic (PLEG): container finished" podID="49c3f447-334e-4147-b877-22a0ce6e3345" containerID="4ce5b26a7642dbb3c4b2d4c21f23040b0afe51a33212a23837a87602f659ac7d" exitCode=0 Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.328626 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"49c3f447-334e-4147-b877-22a0ce6e3345","Type":"ContainerDied","Data":"4ce5b26a7642dbb3c4b2d4c21f23040b0afe51a33212a23837a87602f659ac7d"} Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.334310 4816 generic.go:334] "Generic (PLEG): container finished" podID="7279e91c-fd54-4a52-a247-c5e38a231907" containerID="fa6e70cb872c0bb963f57dccb1e8cfa3a80de411767619fafe0156ebe1500381" exitCode=0 Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.334336 4816 generic.go:334] "Generic (PLEG): container finished" podID="7279e91c-fd54-4a52-a247-c5e38a231907" containerID="744232eb02ed0a62b4c386367a237348d207c0352aee779d03b38dd46cecc95a" exitCode=143 Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.334356 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7279e91c-fd54-4a52-a247-c5e38a231907","Type":"ContainerDied","Data":"fa6e70cb872c0bb963f57dccb1e8cfa3a80de411767619fafe0156ebe1500381"} Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.334380 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"7279e91c-fd54-4a52-a247-c5e38a231907","Type":"ContainerDied","Data":"744232eb02ed0a62b4c386367a237348d207c0352aee779d03b38dd46cecc95a"} Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.334383 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.334402 4816 scope.go:117] "RemoveContainer" containerID="fa6e70cb872c0bb963f57dccb1e8cfa3a80de411767619fafe0156ebe1500381" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.334390 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7279e91c-fd54-4a52-a247-c5e38a231907","Type":"ContainerDied","Data":"1479f433dd0af53602fa7b4358ac16a8893fc6ce4f3fc3758931ae0187bafc3e"} Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.422938 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.435269 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.446389 4816 scope.go:117] "RemoveContainer" containerID="744232eb02ed0a62b4c386367a237348d207c0352aee779d03b38dd46cecc95a" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.452983 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 11 12:21:31 crc kubenswrapper[4816]: E0311 12:21:31.453548 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7279e91c-fd54-4a52-a247-c5e38a231907" containerName="nova-api-log" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.453568 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="7279e91c-fd54-4a52-a247-c5e38a231907" containerName="nova-api-log" Mar 11 12:21:31 crc kubenswrapper[4816]: E0311 12:21:31.453594 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36fadc66-c846-46c0-a002-efeb7656f2b8" 
containerName="nova-manage" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.453601 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="36fadc66-c846-46c0-a002-efeb7656f2b8" containerName="nova-manage" Mar 11 12:21:31 crc kubenswrapper[4816]: E0311 12:21:31.453622 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7279e91c-fd54-4a52-a247-c5e38a231907" containerName="nova-api-api" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.453628 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="7279e91c-fd54-4a52-a247-c5e38a231907" containerName="nova-api-api" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.453877 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="7279e91c-fd54-4a52-a247-c5e38a231907" containerName="nova-api-api" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.453907 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="36fadc66-c846-46c0-a002-efeb7656f2b8" containerName="nova-manage" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.453922 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="7279e91c-fd54-4a52-a247-c5e38a231907" containerName="nova-api-log" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.455226 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.460069 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.460396 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.460542 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.475462 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.506324 4816 scope.go:117] "RemoveContainer" containerID="fa6e70cb872c0bb963f57dccb1e8cfa3a80de411767619fafe0156ebe1500381" Mar 11 12:21:31 crc kubenswrapper[4816]: E0311 12:21:31.510139 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa6e70cb872c0bb963f57dccb1e8cfa3a80de411767619fafe0156ebe1500381\": container with ID starting with fa6e70cb872c0bb963f57dccb1e8cfa3a80de411767619fafe0156ebe1500381 not found: ID does not exist" containerID="fa6e70cb872c0bb963f57dccb1e8cfa3a80de411767619fafe0156ebe1500381" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.510217 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa6e70cb872c0bb963f57dccb1e8cfa3a80de411767619fafe0156ebe1500381"} err="failed to get container status \"fa6e70cb872c0bb963f57dccb1e8cfa3a80de411767619fafe0156ebe1500381\": rpc error: code = NotFound desc = could not find container \"fa6e70cb872c0bb963f57dccb1e8cfa3a80de411767619fafe0156ebe1500381\": container with ID starting with fa6e70cb872c0bb963f57dccb1e8cfa3a80de411767619fafe0156ebe1500381 not found: ID does not exist" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.510275 4816 
scope.go:117] "RemoveContainer" containerID="744232eb02ed0a62b4c386367a237348d207c0352aee779d03b38dd46cecc95a" Mar 11 12:21:31 crc kubenswrapper[4816]: E0311 12:21:31.512323 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"744232eb02ed0a62b4c386367a237348d207c0352aee779d03b38dd46cecc95a\": container with ID starting with 744232eb02ed0a62b4c386367a237348d207c0352aee779d03b38dd46cecc95a not found: ID does not exist" containerID="744232eb02ed0a62b4c386367a237348d207c0352aee779d03b38dd46cecc95a" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.512374 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"744232eb02ed0a62b4c386367a237348d207c0352aee779d03b38dd46cecc95a"} err="failed to get container status \"744232eb02ed0a62b4c386367a237348d207c0352aee779d03b38dd46cecc95a\": rpc error: code = NotFound desc = could not find container \"744232eb02ed0a62b4c386367a237348d207c0352aee779d03b38dd46cecc95a\": container with ID starting with 744232eb02ed0a62b4c386367a237348d207c0352aee779d03b38dd46cecc95a not found: ID does not exist" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.512398 4816 scope.go:117] "RemoveContainer" containerID="fa6e70cb872c0bb963f57dccb1e8cfa3a80de411767619fafe0156ebe1500381" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.513025 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa6e70cb872c0bb963f57dccb1e8cfa3a80de411767619fafe0156ebe1500381"} err="failed to get container status \"fa6e70cb872c0bb963f57dccb1e8cfa3a80de411767619fafe0156ebe1500381\": rpc error: code = NotFound desc = could not find container \"fa6e70cb872c0bb963f57dccb1e8cfa3a80de411767619fafe0156ebe1500381\": container with ID starting with fa6e70cb872c0bb963f57dccb1e8cfa3a80de411767619fafe0156ebe1500381 not found: ID does not exist" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 
12:21:31.513077 4816 scope.go:117] "RemoveContainer" containerID="744232eb02ed0a62b4c386367a237348d207c0352aee779d03b38dd46cecc95a" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.513432 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"744232eb02ed0a62b4c386367a237348d207c0352aee779d03b38dd46cecc95a"} err="failed to get container status \"744232eb02ed0a62b4c386367a237348d207c0352aee779d03b38dd46cecc95a\": rpc error: code = NotFound desc = could not find container \"744232eb02ed0a62b4c386367a237348d207c0352aee779d03b38dd46cecc95a\": container with ID starting with 744232eb02ed0a62b4c386367a237348d207c0352aee779d03b38dd46cecc95a not found: ID does not exist" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.525050 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d28745d2-082d-4c99-90f0-b6c4696fb1a2-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\") " pod="openstack/nova-api-0" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.525205 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d28745d2-082d-4c99-90f0-b6c4696fb1a2-config-data\") pod \"nova-api-0\" (UID: \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\") " pod="openstack/nova-api-0" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.525241 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d28745d2-082d-4c99-90f0-b6c4696fb1a2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\") " pod="openstack/nova-api-0" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.525382 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-cqhs8\" (UniqueName: \"kubernetes.io/projected/d28745d2-082d-4c99-90f0-b6c4696fb1a2-kube-api-access-cqhs8\") pod \"nova-api-0\" (UID: \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\") " pod="openstack/nova-api-0" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.525419 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d28745d2-082d-4c99-90f0-b6c4696fb1a2-logs\") pod \"nova-api-0\" (UID: \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\") " pod="openstack/nova-api-0" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.525460 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d28745d2-082d-4c99-90f0-b6c4696fb1a2-public-tls-certs\") pod \"nova-api-0\" (UID: \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\") " pod="openstack/nova-api-0" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.627449 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d28745d2-082d-4c99-90f0-b6c4696fb1a2-config-data\") pod \"nova-api-0\" (UID: \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\") " pod="openstack/nova-api-0" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.628175 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d28745d2-082d-4c99-90f0-b6c4696fb1a2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\") " pod="openstack/nova-api-0" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.628272 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqhs8\" (UniqueName: \"kubernetes.io/projected/d28745d2-082d-4c99-90f0-b6c4696fb1a2-kube-api-access-cqhs8\") pod \"nova-api-0\" (UID: \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\") " 
pod="openstack/nova-api-0" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.628304 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d28745d2-082d-4c99-90f0-b6c4696fb1a2-logs\") pod \"nova-api-0\" (UID: \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\") " pod="openstack/nova-api-0" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.628344 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d28745d2-082d-4c99-90f0-b6c4696fb1a2-public-tls-certs\") pod \"nova-api-0\" (UID: \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\") " pod="openstack/nova-api-0" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.628363 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d28745d2-082d-4c99-90f0-b6c4696fb1a2-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\") " pod="openstack/nova-api-0" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.629456 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d28745d2-082d-4c99-90f0-b6c4696fb1a2-logs\") pod \"nova-api-0\" (UID: \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\") " pod="openstack/nova-api-0" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.632512 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d28745d2-082d-4c99-90f0-b6c4696fb1a2-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\") " pod="openstack/nova-api-0" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.632946 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d28745d2-082d-4c99-90f0-b6c4696fb1a2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\") " pod="openstack/nova-api-0" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.634272 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d28745d2-082d-4c99-90f0-b6c4696fb1a2-config-data\") pod \"nova-api-0\" (UID: \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\") " pod="openstack/nova-api-0" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.636907 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d28745d2-082d-4c99-90f0-b6c4696fb1a2-public-tls-certs\") pod \"nova-api-0\" (UID: \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\") " pod="openstack/nova-api-0" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.660722 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqhs8\" (UniqueName: \"kubernetes.io/projected/d28745d2-082d-4c99-90f0-b6c4696fb1a2-kube-api-access-cqhs8\") pod \"nova-api-0\" (UID: \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\") " pod="openstack/nova-api-0" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.703481 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.735854 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49c3f447-334e-4147-b877-22a0ce6e3345-combined-ca-bundle\") pod \"49c3f447-334e-4147-b877-22a0ce6e3345\" (UID: \"49c3f447-334e-4147-b877-22a0ce6e3345\") " Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.736235 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49c3f447-334e-4147-b877-22a0ce6e3345-config-data\") pod \"49c3f447-334e-4147-b877-22a0ce6e3345\" (UID: \"49c3f447-334e-4147-b877-22a0ce6e3345\") " Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.736515 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84md9\" (UniqueName: \"kubernetes.io/projected/49c3f447-334e-4147-b877-22a0ce6e3345-kube-api-access-84md9\") pod \"49c3f447-334e-4147-b877-22a0ce6e3345\" (UID: \"49c3f447-334e-4147-b877-22a0ce6e3345\") " Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.743331 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c3f447-334e-4147-b877-22a0ce6e3345-kube-api-access-84md9" (OuterVolumeSpecName: "kube-api-access-84md9") pod "49c3f447-334e-4147-b877-22a0ce6e3345" (UID: "49c3f447-334e-4147-b877-22a0ce6e3345"). InnerVolumeSpecName "kube-api-access-84md9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.770734 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c3f447-334e-4147-b877-22a0ce6e3345-config-data" (OuterVolumeSpecName: "config-data") pod "49c3f447-334e-4147-b877-22a0ce6e3345" (UID: "49c3f447-334e-4147-b877-22a0ce6e3345"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.787631 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c3f447-334e-4147-b877-22a0ce6e3345-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "49c3f447-334e-4147-b877-22a0ce6e3345" (UID: "49c3f447-334e-4147-b877-22a0ce6e3345"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.794218 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.842129 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49c3f447-334e-4147-b877-22a0ce6e3345-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.842172 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49c3f447-334e-4147-b877-22a0ce6e3345-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.842182 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84md9\" (UniqueName: \"kubernetes.io/projected/49c3f447-334e-4147-b877-22a0ce6e3345-kube-api-access-84md9\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:32 crc kubenswrapper[4816]: I0311 12:21:32.141952 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7279e91c-fd54-4a52-a247-c5e38a231907" path="/var/lib/kubelet/pods/7279e91c-fd54-4a52-a247-c5e38a231907/volumes" Mar 11 12:21:32 crc kubenswrapper[4816]: I0311 12:21:32.277242 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 11 12:21:32 crc kubenswrapper[4816]: W0311 12:21:32.289500 4816 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd28745d2_082d_4c99_90f0_b6c4696fb1a2.slice/crio-8a5fef237ae36daf657628ae1e951a8f33300f04ba146b0b7c82c1251a514014 WatchSource:0}: Error finding container 8a5fef237ae36daf657628ae1e951a8f33300f04ba146b0b7c82c1251a514014: Status 404 returned error can't find the container with id 8a5fef237ae36daf657628ae1e951a8f33300f04ba146b0b7c82c1251a514014 Mar 11 12:21:32 crc kubenswrapper[4816]: I0311 12:21:32.351458 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d28745d2-082d-4c99-90f0-b6c4696fb1a2","Type":"ContainerStarted","Data":"8a5fef237ae36daf657628ae1e951a8f33300f04ba146b0b7c82c1251a514014"} Mar 11 12:21:32 crc kubenswrapper[4816]: I0311 12:21:32.354018 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"49c3f447-334e-4147-b877-22a0ce6e3345","Type":"ContainerDied","Data":"80c23e1f7785724059b85c847854192b3471a718a42ed80849445c1edfb1f7c4"} Mar 11 12:21:32 crc kubenswrapper[4816]: I0311 12:21:32.354106 4816 scope.go:117] "RemoveContainer" containerID="4ce5b26a7642dbb3c4b2d4c21f23040b0afe51a33212a23837a87602f659ac7d" Mar 11 12:21:32 crc kubenswrapper[4816]: I0311 12:21:32.354145 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 11 12:21:32 crc kubenswrapper[4816]: I0311 12:21:32.398897 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 11 12:21:32 crc kubenswrapper[4816]: I0311 12:21:32.415516 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 11 12:21:32 crc kubenswrapper[4816]: I0311 12:21:32.428191 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 11 12:21:32 crc kubenswrapper[4816]: E0311 12:21:32.428785 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49c3f447-334e-4147-b877-22a0ce6e3345" containerName="nova-scheduler-scheduler" Mar 11 12:21:32 crc kubenswrapper[4816]: I0311 12:21:32.429428 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="49c3f447-334e-4147-b877-22a0ce6e3345" containerName="nova-scheduler-scheduler" Mar 11 12:21:32 crc kubenswrapper[4816]: I0311 12:21:32.430226 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="49c3f447-334e-4147-b877-22a0ce6e3345" containerName="nova-scheduler-scheduler" Mar 11 12:21:32 crc kubenswrapper[4816]: I0311 12:21:32.431405 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 11 12:21:32 crc kubenswrapper[4816]: I0311 12:21:32.439218 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 11 12:21:32 crc kubenswrapper[4816]: I0311 12:21:32.445745 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 11 12:21:32 crc kubenswrapper[4816]: I0311 12:21:32.579632 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41f4b502-b85f-488c-b55b-27a31479df68-config-data\") pod \"nova-scheduler-0\" (UID: \"41f4b502-b85f-488c-b55b-27a31479df68\") " pod="openstack/nova-scheduler-0" Mar 11 12:21:32 crc kubenswrapper[4816]: I0311 12:21:32.579875 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41f4b502-b85f-488c-b55b-27a31479df68-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"41f4b502-b85f-488c-b55b-27a31479df68\") " pod="openstack/nova-scheduler-0" Mar 11 12:21:32 crc kubenswrapper[4816]: I0311 12:21:32.579926 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvr8x\" (UniqueName: \"kubernetes.io/projected/41f4b502-b85f-488c-b55b-27a31479df68-kube-api-access-cvr8x\") pod \"nova-scheduler-0\" (UID: \"41f4b502-b85f-488c-b55b-27a31479df68\") " pod="openstack/nova-scheduler-0" Mar 11 12:21:32 crc kubenswrapper[4816]: I0311 12:21:32.682472 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41f4b502-b85f-488c-b55b-27a31479df68-config-data\") pod \"nova-scheduler-0\" (UID: \"41f4b502-b85f-488c-b55b-27a31479df68\") " pod="openstack/nova-scheduler-0" Mar 11 12:21:32 crc kubenswrapper[4816]: I0311 12:21:32.683072 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41f4b502-b85f-488c-b55b-27a31479df68-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"41f4b502-b85f-488c-b55b-27a31479df68\") " pod="openstack/nova-scheduler-0" Mar 11 12:21:32 crc kubenswrapper[4816]: I0311 12:21:32.683097 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvr8x\" (UniqueName: \"kubernetes.io/projected/41f4b502-b85f-488c-b55b-27a31479df68-kube-api-access-cvr8x\") pod \"nova-scheduler-0\" (UID: \"41f4b502-b85f-488c-b55b-27a31479df68\") " pod="openstack/nova-scheduler-0" Mar 11 12:21:32 crc kubenswrapper[4816]: I0311 12:21:32.690416 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41f4b502-b85f-488c-b55b-27a31479df68-config-data\") pod \"nova-scheduler-0\" (UID: \"41f4b502-b85f-488c-b55b-27a31479df68\") " pod="openstack/nova-scheduler-0" Mar 11 12:21:32 crc kubenswrapper[4816]: I0311 12:21:32.690844 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41f4b502-b85f-488c-b55b-27a31479df68-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"41f4b502-b85f-488c-b55b-27a31479df68\") " pod="openstack/nova-scheduler-0" Mar 11 12:21:32 crc kubenswrapper[4816]: I0311 12:21:32.701917 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvr8x\" (UniqueName: \"kubernetes.io/projected/41f4b502-b85f-488c-b55b-27a31479df68-kube-api-access-cvr8x\") pod \"nova-scheduler-0\" (UID: \"41f4b502-b85f-488c-b55b-27a31479df68\") " pod="openstack/nova-scheduler-0" Mar 11 12:21:32 crc kubenswrapper[4816]: I0311 12:21:32.756336 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 11 12:21:33 crc kubenswrapper[4816]: I0311 12:21:33.265603 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 11 12:21:33 crc kubenswrapper[4816]: W0311 12:21:33.271926 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41f4b502_b85f_488c_b55b_27a31479df68.slice/crio-0b682ef5a1cffec39d115886fe70f340b2f710836dc8fb73d7380c331ca3d440 WatchSource:0}: Error finding container 0b682ef5a1cffec39d115886fe70f340b2f710836dc8fb73d7380c331ca3d440: Status 404 returned error can't find the container with id 0b682ef5a1cffec39d115886fe70f340b2f710836dc8fb73d7380c331ca3d440 Mar 11 12:21:33 crc kubenswrapper[4816]: I0311 12:21:33.370780 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"41f4b502-b85f-488c-b55b-27a31479df68","Type":"ContainerStarted","Data":"0b682ef5a1cffec39d115886fe70f340b2f710836dc8fb73d7380c331ca3d440"} Mar 11 12:21:33 crc kubenswrapper[4816]: I0311 12:21:33.375412 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bedb612d-0e22-4025-9151-d0cf7bc4ee42","Type":"ContainerStarted","Data":"a80048ca909856187d3fa5dac7b542ba5ca3c8dbcb582537e0f884c753db4809"} Mar 11 12:21:33 crc kubenswrapper[4816]: I0311 12:21:33.376158 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 11 12:21:33 crc kubenswrapper[4816]: I0311 12:21:33.381541 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d28745d2-082d-4c99-90f0-b6c4696fb1a2","Type":"ContainerStarted","Data":"f7560f8d6f98f14204afbbce69a7ff86d5f07a2d1a84e68d20701b7c7e5ce84d"} Mar 11 12:21:33 crc kubenswrapper[4816]: I0311 12:21:33.381599 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"d28745d2-082d-4c99-90f0-b6c4696fb1a2","Type":"ContainerStarted","Data":"9fc317ca9311d71a32a61a06236255eddc3a32782036027513c2583e902eb2de"} Mar 11 12:21:33 crc kubenswrapper[4816]: I0311 12:21:33.419823 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.643419558 podStartE2EDuration="6.419803313s" podCreationTimestamp="2026-03-11 12:21:27 +0000 UTC" firstStartedPulling="2026-03-11 12:21:28.174205934 +0000 UTC m=+1374.765469911" lastFinishedPulling="2026-03-11 12:21:32.950589699 +0000 UTC m=+1379.541853666" observedRunningTime="2026-03-11 12:21:33.404818773 +0000 UTC m=+1379.996082760" watchObservedRunningTime="2026-03-11 12:21:33.419803313 +0000 UTC m=+1380.011067280" Mar 11 12:21:33 crc kubenswrapper[4816]: I0311 12:21:33.442717 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.442692891 podStartE2EDuration="2.442692891s" podCreationTimestamp="2026-03-11 12:21:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:21:33.437497982 +0000 UTC m=+1380.028761969" watchObservedRunningTime="2026-03-11 12:21:33.442692891 +0000 UTC m=+1380.033956848" Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.155502 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c3f447-334e-4147-b877-22a0ce6e3345" path="/var/lib/kubelet/pods/49c3f447-334e-4147-b877-22a0ce6e3345/volumes" Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.287625 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.328600 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b8751c6-ef60-400a-b4e3-0042d63c2d83-combined-ca-bundle\") pod \"3b8751c6-ef60-400a-b4e3-0042d63c2d83\" (UID: \"3b8751c6-ef60-400a-b4e3-0042d63c2d83\") " Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.328714 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b8751c6-ef60-400a-b4e3-0042d63c2d83-config-data\") pod \"3b8751c6-ef60-400a-b4e3-0042d63c2d83\" (UID: \"3b8751c6-ef60-400a-b4e3-0042d63c2d83\") " Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.329225 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b8751c6-ef60-400a-b4e3-0042d63c2d83-logs\") pod \"3b8751c6-ef60-400a-b4e3-0042d63c2d83\" (UID: \"3b8751c6-ef60-400a-b4e3-0042d63c2d83\") " Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.329383 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzfsx\" (UniqueName: \"kubernetes.io/projected/3b8751c6-ef60-400a-b4e3-0042d63c2d83-kube-api-access-bzfsx\") pod \"3b8751c6-ef60-400a-b4e3-0042d63c2d83\" (UID: \"3b8751c6-ef60-400a-b4e3-0042d63c2d83\") " Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.329423 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b8751c6-ef60-400a-b4e3-0042d63c2d83-nova-metadata-tls-certs\") pod \"3b8751c6-ef60-400a-b4e3-0042d63c2d83\" (UID: \"3b8751c6-ef60-400a-b4e3-0042d63c2d83\") " Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.335979 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/3b8751c6-ef60-400a-b4e3-0042d63c2d83-logs" (OuterVolumeSpecName: "logs") pod "3b8751c6-ef60-400a-b4e3-0042d63c2d83" (UID: "3b8751c6-ef60-400a-b4e3-0042d63c2d83"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.363517 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b8751c6-ef60-400a-b4e3-0042d63c2d83-kube-api-access-bzfsx" (OuterVolumeSpecName: "kube-api-access-bzfsx") pod "3b8751c6-ef60-400a-b4e3-0042d63c2d83" (UID: "3b8751c6-ef60-400a-b4e3-0042d63c2d83"). InnerVolumeSpecName "kube-api-access-bzfsx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.403934 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"41f4b502-b85f-488c-b55b-27a31479df68","Type":"ContainerStarted","Data":"60b94a07b73cb13c7f413f3784714ffd08edfbf819bae0acb651dd949e911744"} Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.414988 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b8751c6-ef60-400a-b4e3-0042d63c2d83-config-data" (OuterVolumeSpecName: "config-data") pod "3b8751c6-ef60-400a-b4e3-0042d63c2d83" (UID: "3b8751c6-ef60-400a-b4e3-0042d63c2d83"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.417121 4816 generic.go:334] "Generic (PLEG): container finished" podID="3b8751c6-ef60-400a-b4e3-0042d63c2d83" containerID="65a6f4699baa07bb2587be04129499415c2d2f177b58bb5a282876ee282e965a" exitCode=0 Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.417438 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3b8751c6-ef60-400a-b4e3-0042d63c2d83","Type":"ContainerDied","Data":"65a6f4699baa07bb2587be04129499415c2d2f177b58bb5a282876ee282e965a"} Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.417501 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3b8751c6-ef60-400a-b4e3-0042d63c2d83","Type":"ContainerDied","Data":"3d6f4a92fab1ae4820eecc5176239bbe544418957d5b8b49929c39dc6ee8800c"} Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.417524 4816 scope.go:117] "RemoveContainer" containerID="65a6f4699baa07bb2587be04129499415c2d2f177b58bb5a282876ee282e965a" Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.417644 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.420039 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b8751c6-ef60-400a-b4e3-0042d63c2d83-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3b8751c6-ef60-400a-b4e3-0042d63c2d83" (UID: "3b8751c6-ef60-400a-b4e3-0042d63c2d83"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.435432 4816 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b8751c6-ef60-400a-b4e3-0042d63c2d83-logs\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.436374 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzfsx\" (UniqueName: \"kubernetes.io/projected/3b8751c6-ef60-400a-b4e3-0042d63c2d83-kube-api-access-bzfsx\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.436459 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b8751c6-ef60-400a-b4e3-0042d63c2d83-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.436914 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b8751c6-ef60-400a-b4e3-0042d63c2d83-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.460513 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.4604915800000002 podStartE2EDuration="2.46049158s" podCreationTimestamp="2026-03-11 12:21:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:21:34.437949602 +0000 UTC m=+1381.029213569" watchObservedRunningTime="2026-03-11 12:21:34.46049158 +0000 UTC m=+1381.051755547" Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.467237 4816 scope.go:117] "RemoveContainer" containerID="8e786968694e08cb0fddd905eefb0efe274d795cc622741f10ed840a98693097" Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.470954 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/3b8751c6-ef60-400a-b4e3-0042d63c2d83-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "3b8751c6-ef60-400a-b4e3-0042d63c2d83" (UID: "3b8751c6-ef60-400a-b4e3-0042d63c2d83"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.503582 4816 scope.go:117] "RemoveContainer" containerID="65a6f4699baa07bb2587be04129499415c2d2f177b58bb5a282876ee282e965a"
Mar 11 12:21:34 crc kubenswrapper[4816]: E0311 12:21:34.504161 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65a6f4699baa07bb2587be04129499415c2d2f177b58bb5a282876ee282e965a\": container with ID starting with 65a6f4699baa07bb2587be04129499415c2d2f177b58bb5a282876ee282e965a not found: ID does not exist" containerID="65a6f4699baa07bb2587be04129499415c2d2f177b58bb5a282876ee282e965a"
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.504205 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65a6f4699baa07bb2587be04129499415c2d2f177b58bb5a282876ee282e965a"} err="failed to get container status \"65a6f4699baa07bb2587be04129499415c2d2f177b58bb5a282876ee282e965a\": rpc error: code = NotFound desc = could not find container \"65a6f4699baa07bb2587be04129499415c2d2f177b58bb5a282876ee282e965a\": container with ID starting with 65a6f4699baa07bb2587be04129499415c2d2f177b58bb5a282876ee282e965a not found: ID does not exist"
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.504232 4816 scope.go:117] "RemoveContainer" containerID="8e786968694e08cb0fddd905eefb0efe274d795cc622741f10ed840a98693097"
Mar 11 12:21:34 crc kubenswrapper[4816]: E0311 12:21:34.504588 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e786968694e08cb0fddd905eefb0efe274d795cc622741f10ed840a98693097\": container with ID starting with 8e786968694e08cb0fddd905eefb0efe274d795cc622741f10ed840a98693097 not found: ID does not exist" containerID="8e786968694e08cb0fddd905eefb0efe274d795cc622741f10ed840a98693097"
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.504654 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e786968694e08cb0fddd905eefb0efe274d795cc622741f10ed840a98693097"} err="failed to get container status \"8e786968694e08cb0fddd905eefb0efe274d795cc622741f10ed840a98693097\": rpc error: code = NotFound desc = could not find container \"8e786968694e08cb0fddd905eefb0efe274d795cc622741f10ed840a98693097\": container with ID starting with 8e786968694e08cb0fddd905eefb0efe274d795cc622741f10ed840a98693097 not found: ID does not exist"
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.539454 4816 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b8751c6-ef60-400a-b4e3-0042d63c2d83-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.760303 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.776804 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.803654 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 11 12:21:34 crc kubenswrapper[4816]: E0311 12:21:34.804328 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b8751c6-ef60-400a-b4e3-0042d63c2d83" containerName="nova-metadata-metadata"
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.804352 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b8751c6-ef60-400a-b4e3-0042d63c2d83" containerName="nova-metadata-metadata"
Mar 11 12:21:34 crc kubenswrapper[4816]: E0311 12:21:34.804391 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b8751c6-ef60-400a-b4e3-0042d63c2d83" containerName="nova-metadata-log"
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.804399 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b8751c6-ef60-400a-b4e3-0042d63c2d83" containerName="nova-metadata-log"
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.804643 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b8751c6-ef60-400a-b4e3-0042d63c2d83" containerName="nova-metadata-metadata"
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.804674 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b8751c6-ef60-400a-b4e3-0042d63c2d83" containerName="nova-metadata-log"
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.806026 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.808408 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.810601 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.810879 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.851400 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d73d9d0-5632-47a3-93e0-899f64f51011-config-data\") pod \"nova-metadata-0\" (UID: \"7d73d9d0-5632-47a3-93e0-899f64f51011\") " pod="openstack/nova-metadata-0"
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.851887 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d73d9d0-5632-47a3-93e0-899f64f51011-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7d73d9d0-5632-47a3-93e0-899f64f51011\") " pod="openstack/nova-metadata-0"
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.852016 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nb22f\" (UniqueName: \"kubernetes.io/projected/7d73d9d0-5632-47a3-93e0-899f64f51011-kube-api-access-nb22f\") pod \"nova-metadata-0\" (UID: \"7d73d9d0-5632-47a3-93e0-899f64f51011\") " pod="openstack/nova-metadata-0"
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.852183 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d73d9d0-5632-47a3-93e0-899f64f51011-logs\") pod \"nova-metadata-0\" (UID: \"7d73d9d0-5632-47a3-93e0-899f64f51011\") " pod="openstack/nova-metadata-0"
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.852315 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d73d9d0-5632-47a3-93e0-899f64f51011-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7d73d9d0-5632-47a3-93e0-899f64f51011\") " pod="openstack/nova-metadata-0"
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.954280 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d73d9d0-5632-47a3-93e0-899f64f51011-logs\") pod \"nova-metadata-0\" (UID: \"7d73d9d0-5632-47a3-93e0-899f64f51011\") " pod="openstack/nova-metadata-0"
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.954357 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d73d9d0-5632-47a3-93e0-899f64f51011-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7d73d9d0-5632-47a3-93e0-899f64f51011\") " pod="openstack/nova-metadata-0"
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.954419 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d73d9d0-5632-47a3-93e0-899f64f51011-config-data\") pod \"nova-metadata-0\" (UID: \"7d73d9d0-5632-47a3-93e0-899f64f51011\") " pod="openstack/nova-metadata-0"
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.954571 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d73d9d0-5632-47a3-93e0-899f64f51011-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7d73d9d0-5632-47a3-93e0-899f64f51011\") " pod="openstack/nova-metadata-0"
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.954615 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nb22f\" (UniqueName: \"kubernetes.io/projected/7d73d9d0-5632-47a3-93e0-899f64f51011-kube-api-access-nb22f\") pod \"nova-metadata-0\" (UID: \"7d73d9d0-5632-47a3-93e0-899f64f51011\") " pod="openstack/nova-metadata-0"
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.957335 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d73d9d0-5632-47a3-93e0-899f64f51011-logs\") pod \"nova-metadata-0\" (UID: \"7d73d9d0-5632-47a3-93e0-899f64f51011\") " pod="openstack/nova-metadata-0"
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.970525 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d73d9d0-5632-47a3-93e0-899f64f51011-config-data\") pod \"nova-metadata-0\" (UID: \"7d73d9d0-5632-47a3-93e0-899f64f51011\") " pod="openstack/nova-metadata-0"
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.970971 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d73d9d0-5632-47a3-93e0-899f64f51011-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7d73d9d0-5632-47a3-93e0-899f64f51011\") " pod="openstack/nova-metadata-0"
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.971972 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d73d9d0-5632-47a3-93e0-899f64f51011-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7d73d9d0-5632-47a3-93e0-899f64f51011\") " pod="openstack/nova-metadata-0"
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.973774 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nb22f\" (UniqueName: \"kubernetes.io/projected/7d73d9d0-5632-47a3-93e0-899f64f51011-kube-api-access-nb22f\") pod \"nova-metadata-0\" (UID: \"7d73d9d0-5632-47a3-93e0-899f64f51011\") " pod="openstack/nova-metadata-0"
Mar 11 12:21:35 crc kubenswrapper[4816]: I0311 12:21:35.127771 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 11 12:21:35 crc kubenswrapper[4816]: I0311 12:21:35.610243 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 11 12:21:36 crc kubenswrapper[4816]: I0311 12:21:36.147226 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b8751c6-ef60-400a-b4e3-0042d63c2d83" path="/var/lib/kubelet/pods/3b8751c6-ef60-400a-b4e3-0042d63c2d83/volumes"
Mar 11 12:21:36 crc kubenswrapper[4816]: I0311 12:21:36.439595 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7d73d9d0-5632-47a3-93e0-899f64f51011","Type":"ContainerStarted","Data":"05538dd985ad20fb55582d69a35b969743ae902043cfc0d0fe6e1bf963056eb2"}
Mar 11 12:21:36 crc kubenswrapper[4816]: I0311 12:21:36.439686 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7d73d9d0-5632-47a3-93e0-899f64f51011","Type":"ContainerStarted","Data":"494b7c934e67413331c33cbc35a1ab84e1195c496bffebeeb4ea4a3917bff191"}
Mar 11 12:21:36 crc kubenswrapper[4816]: I0311 12:21:36.439706 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7d73d9d0-5632-47a3-93e0-899f64f51011","Type":"ContainerStarted","Data":"c81825bf2b4be781ea36bdb64201016c8530a7353fa6d58d50264ccf72608bde"}
Mar 11 12:21:36 crc kubenswrapper[4816]: I0311 12:21:36.479766 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.479738296 podStartE2EDuration="2.479738296s" podCreationTimestamp="2026-03-11 12:21:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:21:36.466752364 +0000 UTC m=+1383.058016331" watchObservedRunningTime="2026-03-11 12:21:36.479738296 +0000 UTC m=+1383.071002263"
Mar 11 12:21:37 crc kubenswrapper[4816]: I0311 12:21:37.757197 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 11 12:21:39 crc kubenswrapper[4816]: I0311 12:21:39.515831 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 11 12:21:39 crc kubenswrapper[4816]: I0311 12:21:39.516829 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 11 12:21:40 crc kubenswrapper[4816]: I0311 12:21:40.128603 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 11 12:21:40 crc kubenswrapper[4816]: I0311 12:21:40.128769 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 11 12:21:41 crc kubenswrapper[4816]: I0311 12:21:41.556934 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-c9c6p"]
Mar 11 12:21:41 crc kubenswrapper[4816]: I0311 12:21:41.559842 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c9c6p"
Mar 11 12:21:41 crc kubenswrapper[4816]: I0311 12:21:41.580723 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c9c6p"]
Mar 11 12:21:41 crc kubenswrapper[4816]: I0311 12:21:41.635611 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7eab4337-089e-4a7c-b1b2-0d902c26f9bb-utilities\") pod \"redhat-operators-c9c6p\" (UID: \"7eab4337-089e-4a7c-b1b2-0d902c26f9bb\") " pod="openshift-marketplace/redhat-operators-c9c6p"
Mar 11 12:21:41 crc kubenswrapper[4816]: I0311 12:21:41.635975 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7eab4337-089e-4a7c-b1b2-0d902c26f9bb-catalog-content\") pod \"redhat-operators-c9c6p\" (UID: \"7eab4337-089e-4a7c-b1b2-0d902c26f9bb\") " pod="openshift-marketplace/redhat-operators-c9c6p"
Mar 11 12:21:41 crc kubenswrapper[4816]: I0311 12:21:41.636119 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m78hw\" (UniqueName: \"kubernetes.io/projected/7eab4337-089e-4a7c-b1b2-0d902c26f9bb-kube-api-access-m78hw\") pod \"redhat-operators-c9c6p\" (UID: \"7eab4337-089e-4a7c-b1b2-0d902c26f9bb\") " pod="openshift-marketplace/redhat-operators-c9c6p"
Mar 11 12:21:41 crc kubenswrapper[4816]: I0311 12:21:41.738352 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7eab4337-089e-4a7c-b1b2-0d902c26f9bb-catalog-content\") pod \"redhat-operators-c9c6p\" (UID: \"7eab4337-089e-4a7c-b1b2-0d902c26f9bb\") " pod="openshift-marketplace/redhat-operators-c9c6p"
Mar 11 12:21:41 crc kubenswrapper[4816]: I0311 12:21:41.738419 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m78hw\" (UniqueName: \"kubernetes.io/projected/7eab4337-089e-4a7c-b1b2-0d902c26f9bb-kube-api-access-m78hw\") pod \"redhat-operators-c9c6p\" (UID: \"7eab4337-089e-4a7c-b1b2-0d902c26f9bb\") " pod="openshift-marketplace/redhat-operators-c9c6p"
Mar 11 12:21:41 crc kubenswrapper[4816]: I0311 12:21:41.738501 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7eab4337-089e-4a7c-b1b2-0d902c26f9bb-utilities\") pod \"redhat-operators-c9c6p\" (UID: \"7eab4337-089e-4a7c-b1b2-0d902c26f9bb\") " pod="openshift-marketplace/redhat-operators-c9c6p"
Mar 11 12:21:41 crc kubenswrapper[4816]: I0311 12:21:41.738920 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7eab4337-089e-4a7c-b1b2-0d902c26f9bb-catalog-content\") pod \"redhat-operators-c9c6p\" (UID: \"7eab4337-089e-4a7c-b1b2-0d902c26f9bb\") " pod="openshift-marketplace/redhat-operators-c9c6p"
Mar 11 12:21:41 crc kubenswrapper[4816]: I0311 12:21:41.739011 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7eab4337-089e-4a7c-b1b2-0d902c26f9bb-utilities\") pod \"redhat-operators-c9c6p\" (UID: \"7eab4337-089e-4a7c-b1b2-0d902c26f9bb\") " pod="openshift-marketplace/redhat-operators-c9c6p"
Mar 11 12:21:41 crc kubenswrapper[4816]: I0311 12:21:41.759889 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m78hw\" (UniqueName: \"kubernetes.io/projected/7eab4337-089e-4a7c-b1b2-0d902c26f9bb-kube-api-access-m78hw\") pod \"redhat-operators-c9c6p\" (UID: \"7eab4337-089e-4a7c-b1b2-0d902c26f9bb\") " pod="openshift-marketplace/redhat-operators-c9c6p"
Mar 11 12:21:41 crc kubenswrapper[4816]: I0311 12:21:41.795761 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 11 12:21:41 crc kubenswrapper[4816]: I0311 12:21:41.795839 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 11 12:21:41 crc kubenswrapper[4816]: I0311 12:21:41.889803 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c9c6p"
Mar 11 12:21:42 crc kubenswrapper[4816]: I0311 12:21:42.397485 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c9c6p"]
Mar 11 12:21:42 crc kubenswrapper[4816]: I0311 12:21:42.530610 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c9c6p" event={"ID":"7eab4337-089e-4a7c-b1b2-0d902c26f9bb","Type":"ContainerStarted","Data":"dde17337a3d028447d9ee51ec451117399c6017330c2d2d25cb0f2b2b3ec87e9"}
Mar 11 12:21:42 crc kubenswrapper[4816]: I0311 12:21:42.758017 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 11 12:21:42 crc kubenswrapper[4816]: I0311 12:21:42.796349 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 11 12:21:42 crc kubenswrapper[4816]: I0311 12:21:42.817592 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d28745d2-082d-4c99-90f0-b6c4696fb1a2" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.208:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 11 12:21:42 crc kubenswrapper[4816]: I0311 12:21:42.817646 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d28745d2-082d-4c99-90f0-b6c4696fb1a2" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.208:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 11 12:21:43 crc kubenswrapper[4816]: I0311 12:21:43.546212 4816 generic.go:334] "Generic (PLEG): container finished" podID="7eab4337-089e-4a7c-b1b2-0d902c26f9bb" containerID="61b458c6b53b9f29ec8711e90f8d5237e5c40ab1fc6dcac7ce59ef8e8e0ce3d0" exitCode=0
Mar 11 12:21:43 crc kubenswrapper[4816]: I0311 12:21:43.546290 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c9c6p" event={"ID":"7eab4337-089e-4a7c-b1b2-0d902c26f9bb","Type":"ContainerDied","Data":"61b458c6b53b9f29ec8711e90f8d5237e5c40ab1fc6dcac7ce59ef8e8e0ce3d0"}
Mar 11 12:21:43 crc kubenswrapper[4816]: I0311 12:21:43.549951 4816 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 11 12:21:43 crc kubenswrapper[4816]: I0311 12:21:43.589164 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 11 12:21:45 crc kubenswrapper[4816]: I0311 12:21:45.129078 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 11 12:21:45 crc kubenswrapper[4816]: I0311 12:21:45.129592 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 11 12:21:45 crc kubenswrapper[4816]: I0311 12:21:45.581356 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c9c6p" event={"ID":"7eab4337-089e-4a7c-b1b2-0d902c26f9bb","Type":"ContainerStarted","Data":"351d1b2af6835e229f75039e47bb93ab25d68cb59a137fa24a8a5d6c16d38e04"}
Mar 11 12:21:46 crc kubenswrapper[4816]: I0311 12:21:46.139472 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7d73d9d0-5632-47a3-93e0-899f64f51011" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.210:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 11 12:21:46 crc kubenswrapper[4816]: I0311 12:21:46.139564 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7d73d9d0-5632-47a3-93e0-899f64f51011" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.210:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 11 12:21:46 crc kubenswrapper[4816]: I0311 12:21:46.597543 4816 generic.go:334] "Generic (PLEG): container finished" podID="7eab4337-089e-4a7c-b1b2-0d902c26f9bb" containerID="351d1b2af6835e229f75039e47bb93ab25d68cb59a137fa24a8a5d6c16d38e04" exitCode=0
Mar 11 12:21:46 crc kubenswrapper[4816]: I0311 12:21:46.597625 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c9c6p" event={"ID":"7eab4337-089e-4a7c-b1b2-0d902c26f9bb","Type":"ContainerDied","Data":"351d1b2af6835e229f75039e47bb93ab25d68cb59a137fa24a8a5d6c16d38e04"}
Mar 11 12:21:47 crc kubenswrapper[4816]: I0311 12:21:47.614166 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c9c6p" event={"ID":"7eab4337-089e-4a7c-b1b2-0d902c26f9bb","Type":"ContainerStarted","Data":"07cab6208632be720f7bd7e72d37d12b39444b2dbdbdd97343950ac444c3b44b"}
Mar 11 12:21:47 crc kubenswrapper[4816]: I0311 12:21:47.645178 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-c9c6p" podStartSLOduration=3.1857693129999998 podStartE2EDuration="6.645148207s" podCreationTimestamp="2026-03-11 12:21:41 +0000 UTC" firstStartedPulling="2026-03-11 12:21:43.549435199 +0000 UTC m=+1390.140699176" lastFinishedPulling="2026-03-11 12:21:47.008814103 +0000 UTC m=+1393.600078070" observedRunningTime="2026-03-11 12:21:47.640430322 +0000 UTC m=+1394.231694339" watchObservedRunningTime="2026-03-11 12:21:47.645148207 +0000 UTC m=+1394.236412214"
Mar 11 12:21:51 crc kubenswrapper[4816]: I0311 12:21:51.808302 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 11 12:21:51 crc kubenswrapper[4816]: I0311 12:21:51.809558 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 11 12:21:51 crc kubenswrapper[4816]: I0311 12:21:51.811788 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 11 12:21:51 crc kubenswrapper[4816]: I0311 12:21:51.820149 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 11 12:21:51 crc kubenswrapper[4816]: I0311 12:21:51.890214 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-c9c6p"
Mar 11 12:21:51 crc kubenswrapper[4816]: I0311 12:21:51.890330 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-c9c6p"
Mar 11 12:21:52 crc kubenswrapper[4816]: I0311 12:21:52.679873 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 11 12:21:52 crc kubenswrapper[4816]: I0311 12:21:52.687256 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 11 12:21:52 crc kubenswrapper[4816]: I0311 12:21:52.959241 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-c9c6p" podUID="7eab4337-089e-4a7c-b1b2-0d902c26f9bb" containerName="registry-server" probeResult="failure" output=<
Mar 11 12:21:52 crc kubenswrapper[4816]: timeout: failed to connect service ":50051" within 1s
Mar 11 12:21:52 crc kubenswrapper[4816]: >
Mar 11 12:21:55 crc kubenswrapper[4816]: I0311 12:21:55.135535 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 11 12:21:55 crc kubenswrapper[4816]: I0311 12:21:55.136218 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 11 12:21:55 crc kubenswrapper[4816]: I0311 12:21:55.141591 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 11 12:21:55 crc kubenswrapper[4816]: I0311 12:21:55.142713 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 11 12:21:57 crc kubenswrapper[4816]: I0311 12:21:57.669243 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Mar 11 12:22:00 crc kubenswrapper[4816]: I0311 12:22:00.163216 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553862-ldg69"]
Mar 11 12:22:00 crc kubenswrapper[4816]: I0311 12:22:00.165850 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553862-ldg69"
Mar 11 12:22:00 crc kubenswrapper[4816]: I0311 12:22:00.169511 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h"
Mar 11 12:22:00 crc kubenswrapper[4816]: I0311 12:22:00.171091 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 11 12:22:00 crc kubenswrapper[4816]: I0311 12:22:00.171812 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 11 12:22:00 crc kubenswrapper[4816]: I0311 12:22:00.183643 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553862-ldg69"]
Mar 11 12:22:00 crc kubenswrapper[4816]: I0311 12:22:00.283597 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vv9b\" (UniqueName: \"kubernetes.io/projected/787da494-4b4f-4a96-9e39-45179c456dc0-kube-api-access-8vv9b\") pod \"auto-csr-approver-29553862-ldg69\" (UID: \"787da494-4b4f-4a96-9e39-45179c456dc0\") " pod="openshift-infra/auto-csr-approver-29553862-ldg69"
Mar 11 12:22:00 crc kubenswrapper[4816]: I0311 12:22:00.387205 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vv9b\" (UniqueName: \"kubernetes.io/projected/787da494-4b4f-4a96-9e39-45179c456dc0-kube-api-access-8vv9b\") pod \"auto-csr-approver-29553862-ldg69\" (UID: \"787da494-4b4f-4a96-9e39-45179c456dc0\") " pod="openshift-infra/auto-csr-approver-29553862-ldg69"
Mar 11 12:22:00 crc kubenswrapper[4816]: I0311 12:22:00.426283 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vv9b\" (UniqueName: \"kubernetes.io/projected/787da494-4b4f-4a96-9e39-45179c456dc0-kube-api-access-8vv9b\") pod \"auto-csr-approver-29553862-ldg69\" (UID: \"787da494-4b4f-4a96-9e39-45179c456dc0\") " pod="openshift-infra/auto-csr-approver-29553862-ldg69"
Mar 11 12:22:00 crc kubenswrapper[4816]: I0311 12:22:00.491102 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553862-ldg69"
Mar 11 12:22:01 crc kubenswrapper[4816]: W0311 12:22:01.033369 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod787da494_4b4f_4a96_9e39_45179c456dc0.slice/crio-d45894c0b9e08495461a2e93ed4ff33fe9f511a52146f27eba8ae8e2789a5bb8 WatchSource:0}: Error finding container d45894c0b9e08495461a2e93ed4ff33fe9f511a52146f27eba8ae8e2789a5bb8: Status 404 returned error can't find the container with id d45894c0b9e08495461a2e93ed4ff33fe9f511a52146f27eba8ae8e2789a5bb8
Mar 11 12:22:01 crc kubenswrapper[4816]: I0311 12:22:01.045630 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553862-ldg69"]
Mar 11 12:22:01 crc kubenswrapper[4816]: I0311 12:22:01.779046 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553862-ldg69" event={"ID":"787da494-4b4f-4a96-9e39-45179c456dc0","Type":"ContainerStarted","Data":"d45894c0b9e08495461a2e93ed4ff33fe9f511a52146f27eba8ae8e2789a5bb8"}
Mar 11 12:22:01 crc kubenswrapper[4816]: I0311 12:22:01.960473 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-c9c6p"
Mar 11 12:22:02 crc kubenswrapper[4816]: I0311 12:22:02.028089 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-c9c6p"
Mar 11 12:22:02 crc kubenswrapper[4816]: I0311 12:22:02.205962 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c9c6p"]
Mar 11 12:22:02 crc kubenswrapper[4816]: I0311 12:22:02.793379 4816 generic.go:334] "Generic (PLEG): container finished" podID="787da494-4b4f-4a96-9e39-45179c456dc0" containerID="d0874559d26089e67dcd3126789f0cf0dc3ed1323323af96fe7e8ee67fbd532f" exitCode=0
Mar 11 12:22:02 crc kubenswrapper[4816]: I0311 12:22:02.793450 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553862-ldg69" event={"ID":"787da494-4b4f-4a96-9e39-45179c456dc0","Type":"ContainerDied","Data":"d0874559d26089e67dcd3126789f0cf0dc3ed1323323af96fe7e8ee67fbd532f"}
Mar 11 12:22:03 crc kubenswrapper[4816]: I0311 12:22:03.804541 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-c9c6p" podUID="7eab4337-089e-4a7c-b1b2-0d902c26f9bb" containerName="registry-server" containerID="cri-o://07cab6208632be720f7bd7e72d37d12b39444b2dbdbdd97343950ac444c3b44b" gracePeriod=2
Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.293466 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553862-ldg69"
Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.301421 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c9c6p"
Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.391861 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m78hw\" (UniqueName: \"kubernetes.io/projected/7eab4337-089e-4a7c-b1b2-0d902c26f9bb-kube-api-access-m78hw\") pod \"7eab4337-089e-4a7c-b1b2-0d902c26f9bb\" (UID: \"7eab4337-089e-4a7c-b1b2-0d902c26f9bb\") "
Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.392074 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7eab4337-089e-4a7c-b1b2-0d902c26f9bb-utilities\") pod \"7eab4337-089e-4a7c-b1b2-0d902c26f9bb\" (UID: \"7eab4337-089e-4a7c-b1b2-0d902c26f9bb\") "
Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.392223 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vv9b\" (UniqueName: \"kubernetes.io/projected/787da494-4b4f-4a96-9e39-45179c456dc0-kube-api-access-8vv9b\") pod \"787da494-4b4f-4a96-9e39-45179c456dc0\" (UID: \"787da494-4b4f-4a96-9e39-45179c456dc0\") "
Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.393119 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7eab4337-089e-4a7c-b1b2-0d902c26f9bb-utilities" (OuterVolumeSpecName: "utilities") pod "7eab4337-089e-4a7c-b1b2-0d902c26f9bb" (UID: "7eab4337-089e-4a7c-b1b2-0d902c26f9bb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.393343 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7eab4337-089e-4a7c-b1b2-0d902c26f9bb-catalog-content\") pod \"7eab4337-089e-4a7c-b1b2-0d902c26f9bb\" (UID: \"7eab4337-089e-4a7c-b1b2-0d902c26f9bb\") "
Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.396421 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7eab4337-089e-4a7c-b1b2-0d902c26f9bb-utilities\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.410240 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7eab4337-089e-4a7c-b1b2-0d902c26f9bb-kube-api-access-m78hw" (OuterVolumeSpecName: "kube-api-access-m78hw") pod "7eab4337-089e-4a7c-b1b2-0d902c26f9bb" (UID: "7eab4337-089e-4a7c-b1b2-0d902c26f9bb"). InnerVolumeSpecName "kube-api-access-m78hw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.419635 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/787da494-4b4f-4a96-9e39-45179c456dc0-kube-api-access-8vv9b" (OuterVolumeSpecName: "kube-api-access-8vv9b") pod "787da494-4b4f-4a96-9e39-45179c456dc0" (UID: "787da494-4b4f-4a96-9e39-45179c456dc0"). InnerVolumeSpecName "kube-api-access-8vv9b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.498316 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vv9b\" (UniqueName: \"kubernetes.io/projected/787da494-4b4f-4a96-9e39-45179c456dc0-kube-api-access-8vv9b\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.498353 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m78hw\" (UniqueName: \"kubernetes.io/projected/7eab4337-089e-4a7c-b1b2-0d902c26f9bb-kube-api-access-m78hw\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.534557 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7eab4337-089e-4a7c-b1b2-0d902c26f9bb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7eab4337-089e-4a7c-b1b2-0d902c26f9bb" (UID: "7eab4337-089e-4a7c-b1b2-0d902c26f9bb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.600708 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7eab4337-089e-4a7c-b1b2-0d902c26f9bb-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.822416 4816 generic.go:334] "Generic (PLEG): container finished" podID="7eab4337-089e-4a7c-b1b2-0d902c26f9bb" containerID="07cab6208632be720f7bd7e72d37d12b39444b2dbdbdd97343950ac444c3b44b" exitCode=0
Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.822525 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c9c6p" event={"ID":"7eab4337-089e-4a7c-b1b2-0d902c26f9bb","Type":"ContainerDied","Data":"07cab6208632be720f7bd7e72d37d12b39444b2dbdbdd97343950ac444c3b44b"}
Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.822555 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c9c6p"
Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.822594 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c9c6p" event={"ID":"7eab4337-089e-4a7c-b1b2-0d902c26f9bb","Type":"ContainerDied","Data":"dde17337a3d028447d9ee51ec451117399c6017330c2d2d25cb0f2b2b3ec87e9"}
Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.822621 4816 scope.go:117] "RemoveContainer" containerID="07cab6208632be720f7bd7e72d37d12b39444b2dbdbdd97343950ac444c3b44b"
Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.827993 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553862-ldg69" event={"ID":"787da494-4b4f-4a96-9e39-45179c456dc0","Type":"ContainerDied","Data":"d45894c0b9e08495461a2e93ed4ff33fe9f511a52146f27eba8ae8e2789a5bb8"}
Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.828551 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d45894c0b9e08495461a2e93ed4ff33fe9f511a52146f27eba8ae8e2789a5bb8"
Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.828064 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553862-ldg69" Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.866282 4816 scope.go:117] "RemoveContainer" containerID="351d1b2af6835e229f75039e47bb93ab25d68cb59a137fa24a8a5d6c16d38e04" Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.875816 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c9c6p"] Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.886669 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-c9c6p"] Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.905218 4816 scope.go:117] "RemoveContainer" containerID="61b458c6b53b9f29ec8711e90f8d5237e5c40ab1fc6dcac7ce59ef8e8e0ce3d0" Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.934495 4816 scope.go:117] "RemoveContainer" containerID="07cab6208632be720f7bd7e72d37d12b39444b2dbdbdd97343950ac444c3b44b" Mar 11 12:22:04 crc kubenswrapper[4816]: E0311 12:22:04.935458 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07cab6208632be720f7bd7e72d37d12b39444b2dbdbdd97343950ac444c3b44b\": container with ID starting with 07cab6208632be720f7bd7e72d37d12b39444b2dbdbdd97343950ac444c3b44b not found: ID does not exist" containerID="07cab6208632be720f7bd7e72d37d12b39444b2dbdbdd97343950ac444c3b44b" Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.935529 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07cab6208632be720f7bd7e72d37d12b39444b2dbdbdd97343950ac444c3b44b"} err="failed to get container status \"07cab6208632be720f7bd7e72d37d12b39444b2dbdbdd97343950ac444c3b44b\": rpc error: code = NotFound desc = could not find container \"07cab6208632be720f7bd7e72d37d12b39444b2dbdbdd97343950ac444c3b44b\": container with ID starting with 07cab6208632be720f7bd7e72d37d12b39444b2dbdbdd97343950ac444c3b44b not found: ID 
does not exist" Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.935567 4816 scope.go:117] "RemoveContainer" containerID="351d1b2af6835e229f75039e47bb93ab25d68cb59a137fa24a8a5d6c16d38e04" Mar 11 12:22:04 crc kubenswrapper[4816]: E0311 12:22:04.935869 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"351d1b2af6835e229f75039e47bb93ab25d68cb59a137fa24a8a5d6c16d38e04\": container with ID starting with 351d1b2af6835e229f75039e47bb93ab25d68cb59a137fa24a8a5d6c16d38e04 not found: ID does not exist" containerID="351d1b2af6835e229f75039e47bb93ab25d68cb59a137fa24a8a5d6c16d38e04" Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.935890 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"351d1b2af6835e229f75039e47bb93ab25d68cb59a137fa24a8a5d6c16d38e04"} err="failed to get container status \"351d1b2af6835e229f75039e47bb93ab25d68cb59a137fa24a8a5d6c16d38e04\": rpc error: code = NotFound desc = could not find container \"351d1b2af6835e229f75039e47bb93ab25d68cb59a137fa24a8a5d6c16d38e04\": container with ID starting with 351d1b2af6835e229f75039e47bb93ab25d68cb59a137fa24a8a5d6c16d38e04 not found: ID does not exist" Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.935905 4816 scope.go:117] "RemoveContainer" containerID="61b458c6b53b9f29ec8711e90f8d5237e5c40ab1fc6dcac7ce59ef8e8e0ce3d0" Mar 11 12:22:04 crc kubenswrapper[4816]: E0311 12:22:04.936545 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61b458c6b53b9f29ec8711e90f8d5237e5c40ab1fc6dcac7ce59ef8e8e0ce3d0\": container with ID starting with 61b458c6b53b9f29ec8711e90f8d5237e5c40ab1fc6dcac7ce59ef8e8e0ce3d0 not found: ID does not exist" containerID="61b458c6b53b9f29ec8711e90f8d5237e5c40ab1fc6dcac7ce59ef8e8e0ce3d0" Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.936574 4816 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61b458c6b53b9f29ec8711e90f8d5237e5c40ab1fc6dcac7ce59ef8e8e0ce3d0"} err="failed to get container status \"61b458c6b53b9f29ec8711e90f8d5237e5c40ab1fc6dcac7ce59ef8e8e0ce3d0\": rpc error: code = NotFound desc = could not find container \"61b458c6b53b9f29ec8711e90f8d5237e5c40ab1fc6dcac7ce59ef8e8e0ce3d0\": container with ID starting with 61b458c6b53b9f29ec8711e90f8d5237e5c40ab1fc6dcac7ce59ef8e8e0ce3d0 not found: ID does not exist" Mar 11 12:22:05 crc kubenswrapper[4816]: I0311 12:22:05.395365 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553856-7k69r"] Mar 11 12:22:05 crc kubenswrapper[4816]: I0311 12:22:05.411501 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553856-7k69r"] Mar 11 12:22:06 crc kubenswrapper[4816]: I0311 12:22:06.140808 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79fb6b17-9d8a-4f10-8a93-a3e65f470a27" path="/var/lib/kubelet/pods/79fb6b17-9d8a-4f10-8a93-a3e65f470a27/volumes" Mar 11 12:22:06 crc kubenswrapper[4816]: I0311 12:22:06.141783 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7eab4337-089e-4a7c-b1b2-0d902c26f9bb" path="/var/lib/kubelet/pods/7eab4337-089e-4a7c-b1b2-0d902c26f9bb/volumes" Mar 11 12:22:09 crc kubenswrapper[4816]: I0311 12:22:09.514571 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:22:09 crc kubenswrapper[4816]: I0311 12:22:09.515486 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:22:09 crc kubenswrapper[4816]: I0311 12:22:09.515552 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:22:09 crc kubenswrapper[4816]: I0311 12:22:09.516284 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"20e5352a1f18de3da65279dced0572d988bf4c64c45f769d6d0ae47f9c2cef9a"} pod="openshift-machine-config-operator/machine-config-daemon-b4v82" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 12:22:09 crc kubenswrapper[4816]: I0311 12:22:09.516571 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" containerID="cri-o://20e5352a1f18de3da65279dced0572d988bf4c64c45f769d6d0ae47f9c2cef9a" gracePeriod=600 Mar 11 12:22:09 crc kubenswrapper[4816]: I0311 12:22:09.901577 4816 generic.go:334] "Generic (PLEG): container finished" podID="7fdff21c-644f-4443-a268-f98c91ea120a" containerID="20e5352a1f18de3da65279dced0572d988bf4c64c45f769d6d0ae47f9c2cef9a" exitCode=0 Mar 11 12:22:09 crc kubenswrapper[4816]: I0311 12:22:09.901665 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerDied","Data":"20e5352a1f18de3da65279dced0572d988bf4c64c45f769d6d0ae47f9c2cef9a"} Mar 11 12:22:09 crc kubenswrapper[4816]: I0311 12:22:09.902206 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" 
event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerStarted","Data":"0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3"} Mar 11 12:22:09 crc kubenswrapper[4816]: I0311 12:22:09.902265 4816 scope.go:117] "RemoveContainer" containerID="92bc406893843c03ac9aa6138b10c838c501d62aa37baf4b9b92254baf796e96" Mar 11 12:22:17 crc kubenswrapper[4816]: I0311 12:22:17.856880 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-4bcf-account-create-update-nv5hk"] Mar 11 12:22:17 crc kubenswrapper[4816]: E0311 12:22:17.858925 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7eab4337-089e-4a7c-b1b2-0d902c26f9bb" containerName="extract-utilities" Mar 11 12:22:17 crc kubenswrapper[4816]: I0311 12:22:17.858998 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eab4337-089e-4a7c-b1b2-0d902c26f9bb" containerName="extract-utilities" Mar 11 12:22:17 crc kubenswrapper[4816]: E0311 12:22:17.859068 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="787da494-4b4f-4a96-9e39-45179c456dc0" containerName="oc" Mar 11 12:22:17 crc kubenswrapper[4816]: I0311 12:22:17.859126 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="787da494-4b4f-4a96-9e39-45179c456dc0" containerName="oc" Mar 11 12:22:17 crc kubenswrapper[4816]: E0311 12:22:17.859187 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7eab4337-089e-4a7c-b1b2-0d902c26f9bb" containerName="extract-content" Mar 11 12:22:17 crc kubenswrapper[4816]: I0311 12:22:17.859234 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eab4337-089e-4a7c-b1b2-0d902c26f9bb" containerName="extract-content" Mar 11 12:22:17 crc kubenswrapper[4816]: E0311 12:22:17.859365 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7eab4337-089e-4a7c-b1b2-0d902c26f9bb" containerName="registry-server" Mar 11 12:22:17 crc kubenswrapper[4816]: I0311 12:22:17.859426 4816 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7eab4337-089e-4a7c-b1b2-0d902c26f9bb" containerName="registry-server" Mar 11 12:22:17 crc kubenswrapper[4816]: I0311 12:22:17.859696 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="787da494-4b4f-4a96-9e39-45179c456dc0" containerName="oc" Mar 11 12:22:17 crc kubenswrapper[4816]: I0311 12:22:17.859766 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="7eab4337-089e-4a7c-b1b2-0d902c26f9bb" containerName="registry-server" Mar 11 12:22:17 crc kubenswrapper[4816]: I0311 12:22:17.860549 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-4bcf-account-create-update-nv5hk" Mar 11 12:22:17 crc kubenswrapper[4816]: I0311 12:22:17.873861 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 11 12:22:17 crc kubenswrapper[4816]: I0311 12:22:17.901651 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-4bcf-account-create-update-nv5hk"] Mar 11 12:22:17 crc kubenswrapper[4816]: I0311 12:22:17.945292 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-4bcf-account-create-update-gkcsc"] Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:17.997975 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-4bcf-account-create-update-gkcsc"] Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.021980 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grm8x\" (UniqueName: \"kubernetes.io/projected/b1dd25da-51d6-45f0-b70c-f1baa17d2da3-kube-api-access-grm8x\") pod \"cinder-4bcf-account-create-update-nv5hk\" (UID: \"b1dd25da-51d6-45f0-b70c-f1baa17d2da3\") " pod="openstack/cinder-4bcf-account-create-update-nv5hk" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.022049 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/b1dd25da-51d6-45f0-b70c-f1baa17d2da3-operator-scripts\") pod \"cinder-4bcf-account-create-update-nv5hk\" (UID: \"b1dd25da-51d6-45f0-b70c-f1baa17d2da3\") " pod="openstack/cinder-4bcf-account-create-update-nv5hk" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.092890 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.093162 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="502b3843-8246-4715-9735-dfc0336caacb" containerName="openstackclient" containerID="cri-o://fd6533a10f6d22b4d1d7a2a73ad8cc4591438b77aefeced48dbf3b4526cf28f0" gracePeriod=2 Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.137117 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grm8x\" (UniqueName: \"kubernetes.io/projected/b1dd25da-51d6-45f0-b70c-f1baa17d2da3-kube-api-access-grm8x\") pod \"cinder-4bcf-account-create-update-nv5hk\" (UID: \"b1dd25da-51d6-45f0-b70c-f1baa17d2da3\") " pod="openstack/cinder-4bcf-account-create-update-nv5hk" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.137497 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1dd25da-51d6-45f0-b70c-f1baa17d2da3-operator-scripts\") pod \"cinder-4bcf-account-create-update-nv5hk\" (UID: \"b1dd25da-51d6-45f0-b70c-f1baa17d2da3\") " pod="openstack/cinder-4bcf-account-create-update-nv5hk" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.138429 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1dd25da-51d6-45f0-b70c-f1baa17d2da3-operator-scripts\") pod \"cinder-4bcf-account-create-update-nv5hk\" (UID: \"b1dd25da-51d6-45f0-b70c-f1baa17d2da3\") " pod="openstack/cinder-4bcf-account-create-update-nv5hk" Mar 11 12:22:18 
crc kubenswrapper[4816]: I0311 12:22:18.179108 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16" path="/var/lib/kubelet/pods/f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16/volumes" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.180289 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.180341 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-snf5b"] Mar 11 12:22:18 crc kubenswrapper[4816]: E0311 12:22:18.180753 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="502b3843-8246-4715-9735-dfc0336caacb" containerName="openstackclient" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.180781 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="502b3843-8246-4715-9735-dfc0336caacb" containerName="openstackclient" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.181532 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="502b3843-8246-4715-9735-dfc0336caacb" containerName="openstackclient" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.182331 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-snf5b" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.195004 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grm8x\" (UniqueName: \"kubernetes.io/projected/b1dd25da-51d6-45f0-b70c-f1baa17d2da3-kube-api-access-grm8x\") pod \"cinder-4bcf-account-create-update-nv5hk\" (UID: \"b1dd25da-51d6-45f0-b70c-f1baa17d2da3\") " pod="openstack/cinder-4bcf-account-create-update-nv5hk" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.205564 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-963f-account-create-update-9hnkv"] Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.209014 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.215844 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-963f-account-create-update-9hnkv" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.221876 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.235504 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-4bcf-account-create-update-nv5hk" Mar 11 12:22:18 crc kubenswrapper[4816]: E0311 12:22:18.244589 4816 configmap.go:193] Couldn't get configMap openstack/ovncontroller-metrics-config: configmap "ovncontroller-metrics-config" not found Mar 11 12:22:18 crc kubenswrapper[4816]: E0311 12:22:18.244666 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/91cdfd54-2ee7-490e-bf3f-563406e59cda-config podName:91cdfd54-2ee7-490e-bf3f-563406e59cda nodeName:}" failed. No retries permitted until 2026-03-11 12:22:18.744645431 +0000 UTC m=+1425.335909398 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/91cdfd54-2ee7-490e-bf3f-563406e59cda-config") pod "ovn-controller-metrics-r8xbm" (UID: "91cdfd54-2ee7-490e-bf3f-563406e59cda") : configmap "ovncontroller-metrics-config" not found Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.266977 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-963f-account-create-update-w2lrf"] Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.347322 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn92x\" (UniqueName: \"kubernetes.io/projected/5e637fcd-e45c-479c-856d-086d642af3bb-kube-api-access-hn92x\") pod \"neutron-963f-account-create-update-9hnkv\" (UID: \"5e637fcd-e45c-479c-856d-086d642af3bb\") " pod="openstack/neutron-963f-account-create-update-9hnkv" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.347394 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf9zw\" (UniqueName: \"kubernetes.io/projected/3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb-kube-api-access-vf9zw\") pod \"root-account-create-update-snf5b\" (UID: \"3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb\") " pod="openstack/root-account-create-update-snf5b" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.347451 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb-operator-scripts\") pod \"root-account-create-update-snf5b\" (UID: \"3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb\") " pod="openstack/root-account-create-update-snf5b" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.347486 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/5e637fcd-e45c-479c-856d-086d642af3bb-operator-scripts\") pod \"neutron-963f-account-create-update-9hnkv\" (UID: \"5e637fcd-e45c-479c-856d-086d642af3bb\") " pod="openstack/neutron-963f-account-create-update-9hnkv" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.360152 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-963f-account-create-update-w2lrf"] Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.425381 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-snf5b"] Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.451002 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn92x\" (UniqueName: \"kubernetes.io/projected/5e637fcd-e45c-479c-856d-086d642af3bb-kube-api-access-hn92x\") pod \"neutron-963f-account-create-update-9hnkv\" (UID: \"5e637fcd-e45c-479c-856d-086d642af3bb\") " pod="openstack/neutron-963f-account-create-update-9hnkv" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.451098 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vf9zw\" (UniqueName: \"kubernetes.io/projected/3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb-kube-api-access-vf9zw\") pod \"root-account-create-update-snf5b\" (UID: \"3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb\") " pod="openstack/root-account-create-update-snf5b" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.451161 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb-operator-scripts\") pod \"root-account-create-update-snf5b\" (UID: \"3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb\") " pod="openstack/root-account-create-update-snf5b" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.451218 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/5e637fcd-e45c-479c-856d-086d642af3bb-operator-scripts\") pod \"neutron-963f-account-create-update-9hnkv\" (UID: \"5e637fcd-e45c-479c-856d-086d642af3bb\") " pod="openstack/neutron-963f-account-create-update-9hnkv" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.451976 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e637fcd-e45c-479c-856d-086d642af3bb-operator-scripts\") pod \"neutron-963f-account-create-update-9hnkv\" (UID: \"5e637fcd-e45c-479c-856d-086d642af3bb\") " pod="openstack/neutron-963f-account-create-update-9hnkv" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.452494 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb-operator-scripts\") pod \"root-account-create-update-snf5b\" (UID: \"3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb\") " pod="openstack/root-account-create-update-snf5b" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.460186 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-963f-account-create-update-9hnkv"] Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.502441 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-4a8d-account-create-update-2lrkx"] Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.504154 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-4a8d-account-create-update-2lrkx" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.545149 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.572990 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn92x\" (UniqueName: \"kubernetes.io/projected/5e637fcd-e45c-479c-856d-086d642af3bb-kube-api-access-hn92x\") pod \"neutron-963f-account-create-update-9hnkv\" (UID: \"5e637fcd-e45c-479c-856d-086d642af3bb\") " pod="openstack/neutron-963f-account-create-update-9hnkv" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.588234 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf9zw\" (UniqueName: \"kubernetes.io/projected/3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb-kube-api-access-vf9zw\") pod \"root-account-create-update-snf5b\" (UID: \"3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb\") " pod="openstack/root-account-create-update-snf5b" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.593211 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-4a8d-account-create-update-2lrkx"] Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.652767 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-4a8d-account-create-update-gxhxz"] Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.655929 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c47c9b57-0735-415f-a1a1-4b3096e3fbcf-operator-scripts\") pod \"barbican-4a8d-account-create-update-2lrkx\" (UID: \"c47c9b57-0735-415f-a1a1-4b3096e3fbcf\") " pod="openstack/barbican-4a8d-account-create-update-2lrkx" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.656102 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8msfc\" (UniqueName: \"kubernetes.io/projected/c47c9b57-0735-415f-a1a1-4b3096e3fbcf-kube-api-access-8msfc\") pod \"barbican-4a8d-account-create-update-2lrkx\" (UID: \"c47c9b57-0735-415f-a1a1-4b3096e3fbcf\") " pod="openstack/barbican-4a8d-account-create-update-2lrkx" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.699936 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-4a8d-account-create-update-gxhxz"] Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.700481 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-snf5b" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.709329 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.720033 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-963f-account-create-update-9hnkv" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.743608 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.743905 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="c71feeeb-a44d-42ec-a4c7-ddbf9a76f825" containerName="ovn-northd" containerID="cri-o://8a2953b83fad75911a9aa3b9b53086764c650fc4022cbafe1b2e60fde2fe5be7" gracePeriod=30 Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.746355 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="c71feeeb-a44d-42ec-a4c7-ddbf9a76f825" containerName="openstack-network-exporter" containerID="cri-o://6e5d751e1033e9d4aef5824d4c13d38308132b4b6b9a60ec26d78186a278dab7" gracePeriod=30 Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.757907 4816 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c47c9b57-0735-415f-a1a1-4b3096e3fbcf-operator-scripts\") pod \"barbican-4a8d-account-create-update-2lrkx\" (UID: \"c47c9b57-0735-415f-a1a1-4b3096e3fbcf\") " pod="openstack/barbican-4a8d-account-create-update-2lrkx" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.758053 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8msfc\" (UniqueName: \"kubernetes.io/projected/c47c9b57-0735-415f-a1a1-4b3096e3fbcf-kube-api-access-8msfc\") pod \"barbican-4a8d-account-create-update-2lrkx\" (UID: \"c47c9b57-0735-415f-a1a1-4b3096e3fbcf\") " pod="openstack/barbican-4a8d-account-create-update-2lrkx" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.759217 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c47c9b57-0735-415f-a1a1-4b3096e3fbcf-operator-scripts\") pod \"barbican-4a8d-account-create-update-2lrkx\" (UID: \"c47c9b57-0735-415f-a1a1-4b3096e3fbcf\") " pod="openstack/barbican-4a8d-account-create-update-2lrkx" Mar 11 12:22:18 crc kubenswrapper[4816]: E0311 12:22:18.759293 4816 configmap.go:193] Couldn't get configMap openstack/ovncontroller-metrics-config: configmap "ovncontroller-metrics-config" not found Mar 11 12:22:18 crc kubenswrapper[4816]: E0311 12:22:18.759336 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/91cdfd54-2ee7-490e-bf3f-563406e59cda-config podName:91cdfd54-2ee7-490e-bf3f-563406e59cda nodeName:}" failed. No retries permitted until 2026-03-11 12:22:19.759322972 +0000 UTC m=+1426.350586939 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/91cdfd54-2ee7-490e-bf3f-563406e59cda-config") pod "ovn-controller-metrics-r8xbm" (UID: "91cdfd54-2ee7-490e-bf3f-563406e59cda") : configmap "ovncontroller-metrics-config" not found Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.781310 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-7l6hp"] Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.795376 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-7l6hp"] Mar 11 12:22:18 crc kubenswrapper[4816]: E0311 12:22:18.883731 4816 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 11 12:22:18 crc kubenswrapper[4816]: E0311 12:22:18.883835 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-config-data podName:26aea2df-f497-478d-b953-060189ef2569 nodeName:}" failed. No retries permitted until 2026-03-11 12:22:19.383803276 +0000 UTC m=+1425.975067243 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-config-data") pod "rabbitmq-server-0" (UID: "26aea2df-f497-478d-b953-060189ef2569") : configmap "rabbitmq-config-data" not found
Mar 11 12:22:19 crc kubenswrapper[4816]: I0311 12:22:19.006379 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8msfc\" (UniqueName: \"kubernetes.io/projected/c47c9b57-0735-415f-a1a1-4b3096e3fbcf-kube-api-access-8msfc\") pod \"barbican-4a8d-account-create-update-2lrkx\" (UID: \"c47c9b57-0735-415f-a1a1-4b3096e3fbcf\") " pod="openstack/barbican-4a8d-account-create-update-2lrkx"
Mar 11 12:22:19 crc kubenswrapper[4816]: I0311 12:22:19.012336 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-fjmnw"]
Mar 11 12:22:19 crc kubenswrapper[4816]: I0311 12:22:19.153355 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-fjmnw"]
Mar 11 12:22:19 crc kubenswrapper[4816]: I0311 12:22:19.178595 4816 generic.go:334] "Generic (PLEG): container finished" podID="c71feeeb-a44d-42ec-a4c7-ddbf9a76f825" containerID="6e5d751e1033e9d4aef5824d4c13d38308132b4b6b9a60ec26d78186a278dab7" exitCode=2
Mar 11 12:22:19 crc kubenswrapper[4816]: I0311 12:22:19.178657 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825","Type":"ContainerDied","Data":"6e5d751e1033e9d4aef5824d4c13d38308132b4b6b9a60ec26d78186a278dab7"}
Mar 11 12:22:19 crc kubenswrapper[4816]: I0311 12:22:19.217361 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-c8d3-account-create-update-85zqd"]
Mar 11 12:22:19 crc kubenswrapper[4816]: I0311 12:22:19.240866 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-c8d3-account-create-update-85zqd"]
Mar 11 12:22:19 crc kubenswrapper[4816]: I0311 12:22:19.252441 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-4a8d-account-create-update-2lrkx"
Mar 11 12:22:19 crc kubenswrapper[4816]: I0311 12:22:19.298996 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-84rn8"]
Mar 11 12:22:19 crc kubenswrapper[4816]: E0311 12:22:19.417684 4816 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found
Mar 11 12:22:19 crc kubenswrapper[4816]: E0311 12:22:19.417765 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-config-data podName:26aea2df-f497-478d-b953-060189ef2569 nodeName:}" failed. No retries permitted until 2026-03-11 12:22:20.41774754 +0000 UTC m=+1427.009011507 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-config-data") pod "rabbitmq-server-0" (UID: "26aea2df-f497-478d-b953-060189ef2569") : configmap "rabbitmq-config-data" not found
Mar 11 12:22:19 crc kubenswrapper[4816]: I0311 12:22:19.424174 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-tnhfq"]
Mar 11 12:22:19 crc kubenswrapper[4816]: I0311 12:22:19.490382 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-r8xbm"]
Mar 11 12:22:19 crc kubenswrapper[4816]: I0311 12:22:19.491222 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-r8xbm" podUID="91cdfd54-2ee7-490e-bf3f-563406e59cda" containerName="openstack-network-exporter" containerID="cri-o://be5c0e05e1987846058e7b0cb0a3139e1568599a10f5067e16f3de74b6995fb8" gracePeriod=30
Mar 11 12:22:19 crc kubenswrapper[4816]: I0311 12:22:19.599870 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-4b4ms"]
Mar 11 12:22:19 crc kubenswrapper[4816]: I0311 12:22:19.709899 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-91ce-account-create-update-n8mz8"]
Mar 11 12:22:19 crc kubenswrapper[4816]: I0311 12:22:19.758521 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-4b4ms"]
Mar 11 12:22:19 crc kubenswrapper[4816]: I0311 12:22:19.775981 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-91ce-account-create-update-n8mz8"]
Mar 11 12:22:19 crc kubenswrapper[4816]: I0311 12:22:19.788931 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-tdv64"]
Mar 11 12:22:19 crc kubenswrapper[4816]: I0311 12:22:19.804287 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-tdv64"]
Mar 11 12:22:19 crc kubenswrapper[4816]: I0311 12:22:19.821162 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-53ba-account-create-update-2vf2k"]
Mar 11 12:22:19 crc kubenswrapper[4816]: E0311 12:22:19.843476 4816 configmap.go:193] Couldn't get configMap openstack/ovncontroller-metrics-config: configmap "ovncontroller-metrics-config" not found
Mar 11 12:22:19 crc kubenswrapper[4816]: E0311 12:22:19.843560 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/91cdfd54-2ee7-490e-bf3f-563406e59cda-config podName:91cdfd54-2ee7-490e-bf3f-563406e59cda nodeName:}" failed. No retries permitted until 2026-03-11 12:22:21.843539828 +0000 UTC m=+1428.434803795 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/91cdfd54-2ee7-490e-bf3f-563406e59cda-config") pod "ovn-controller-metrics-r8xbm" (UID: "91cdfd54-2ee7-490e-bf3f-563406e59cda") : configmap "ovncontroller-metrics-config" not found
Mar 11 12:22:19 crc kubenswrapper[4816]: I0311 12:22:19.927581 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-53ba-account-create-update-2vf2k"]
Mar 11 12:22:20 crc kubenswrapper[4816]: W0311 12:22:20.050824 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1dd25da_51d6_45f0_b70c_f1baa17d2da3.slice/crio-1d267594122bbe4cf05c9b26645399ea847bc2099ad10ee8bb693c8e2675f8e5 WatchSource:0}: Error finding container 1d267594122bbe4cf05c9b26645399ea847bc2099ad10ee8bb693c8e2675f8e5: Status 404 returned error can't find the container with id 1d267594122bbe4cf05c9b26645399ea847bc2099ad10ee8bb693c8e2675f8e5
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.067817 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-rjxsf"]
Mar 11 12:22:20 crc kubenswrapper[4816]: E0311 12:22:20.093080 4816 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 11 12:22:20 crc kubenswrapper[4816]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash
Mar 11 12:22:20 crc kubenswrapper[4816]:
Mar 11 12:22:20 crc kubenswrapper[4816]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh
Mar 11 12:22:20 crc kubenswrapper[4816]:
Mar 11 12:22:20 crc kubenswrapper[4816]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."}
Mar 11 12:22:20 crc kubenswrapper[4816]:
Mar 11 12:22:20 crc kubenswrapper[4816]: MYSQL_CMD="mysql -h -u root -P 3306"
Mar 11 12:22:20 crc kubenswrapper[4816]:
Mar 11 12:22:20 crc kubenswrapper[4816]: if [ -n "cinder" ]; then
Mar 11 12:22:20 crc kubenswrapper[4816]: GRANT_DATABASE="cinder"
Mar 11 12:22:20 crc kubenswrapper[4816]: else
Mar 11 12:22:20 crc kubenswrapper[4816]: GRANT_DATABASE="*"
Mar 11 12:22:20 crc kubenswrapper[4816]: fi
Mar 11 12:22:20 crc kubenswrapper[4816]:
Mar 11 12:22:20 crc kubenswrapper[4816]: # going for maximum compatibility here:
Mar 11 12:22:20 crc kubenswrapper[4816]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used
Mar 11 12:22:20 crc kubenswrapper[4816]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not
Mar 11 12:22:20 crc kubenswrapper[4816]: # 3. create user with CREATE but then do all password and TLS with ALTER to
Mar 11 12:22:20 crc kubenswrapper[4816]: # support updates
Mar 11 12:22:20 crc kubenswrapper[4816]:
Mar 11 12:22:20 crc kubenswrapper[4816]: $MYSQL_CMD < logger="UnhandledError"
Mar 11 12:22:20 crc kubenswrapper[4816]: E0311 12:22:20.096389 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"cinder-db-secret\\\" not found\"" pod="openstack/cinder-4bcf-account-create-update-nv5hk" podUID="b1dd25da-51d6-45f0-b70c-f1baa17d2da3"
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.100636 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-rjxsf"]
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.132365 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.133214 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="fe419fb1-1901-4fd4-9d9c-8884651e3ad9" containerName="openstack-network-exporter" containerID="cri-o://4c01622c11d3f3812a2eae31ec2decc063cf1fe9d275e29cfb942cdc480ba8db" gracePeriod=300
Mar 11 12:22:20 crc kubenswrapper[4816]: E0311 12:22:20.285066 4816 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 11 12:22:20 crc kubenswrapper[4816]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash
Mar 11 12:22:20 crc kubenswrapper[4816]:
Mar 11 12:22:20 crc kubenswrapper[4816]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh
Mar 11 12:22:20 crc kubenswrapper[4816]:
Mar 11 12:22:20 crc kubenswrapper[4816]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."}
Mar 11 12:22:20 crc kubenswrapper[4816]:
Mar 11 12:22:20 crc kubenswrapper[4816]: MYSQL_CMD="mysql -h -u root -P 3306"
Mar 11 12:22:20 crc kubenswrapper[4816]:
Mar 11 12:22:20 crc kubenswrapper[4816]: if [ -n "cinder" ]; then
Mar 11 12:22:20 crc kubenswrapper[4816]: GRANT_DATABASE="cinder"
Mar 11 12:22:20 crc kubenswrapper[4816]: else
Mar 11 12:22:20 crc kubenswrapper[4816]: GRANT_DATABASE="*"
Mar 11 12:22:20 crc kubenswrapper[4816]: fi
Mar 11 12:22:20 crc kubenswrapper[4816]:
Mar 11 12:22:20 crc kubenswrapper[4816]: # going for maximum compatibility here:
Mar 11 12:22:20 crc kubenswrapper[4816]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used
Mar 11 12:22:20 crc kubenswrapper[4816]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not
Mar 11 12:22:20 crc kubenswrapper[4816]: # 3. create user with CREATE but then do all password and TLS with ALTER to
Mar 11 12:22:20 crc kubenswrapper[4816]: # support updates
Mar 11 12:22:20 crc kubenswrapper[4816]:
Mar 11 12:22:20 crc kubenswrapper[4816]: $MYSQL_CMD < logger="UnhandledError"
Mar 11 12:22:20 crc kubenswrapper[4816]: E0311 12:22:20.285515 4816 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-84rn8" message=<
Mar 11 12:22:20 crc kubenswrapper[4816]: Exiting ovn-controller (1) [ OK ]
Mar 11 12:22:20 crc kubenswrapper[4816]: >
Mar 11 12:22:20 crc kubenswrapper[4816]: E0311 12:22:20.285563 4816 kuberuntime_container.go:691] "PreStop hook failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " pod="openstack/ovn-controller-84rn8" podUID="2de58390-335b-40cc-8461-d931d3b22e41" containerName="ovn-controller" containerID="cri-o://b67798b7f6eede8770ea6cbb3808f928e4bdbe9cdbf08abe0db324318159dd17"
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.285616 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-84rn8" podUID="2de58390-335b-40cc-8461-d931d3b22e41" containerName="ovn-controller" containerID="cri-o://b67798b7f6eede8770ea6cbb3808f928e4bdbe9cdbf08abe0db324318159dd17" gracePeriod=30
Mar 11 12:22:20 crc kubenswrapper[4816]: E0311 12:22:20.286517 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"cinder-db-secret\\\" not found\"" pod="openstack/cinder-4bcf-account-create-update-nv5hk" podUID="b1dd25da-51d6-45f0-b70c-f1baa17d2da3"
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.303746 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-r8xbm_91cdfd54-2ee7-490e-bf3f-563406e59cda/openstack-network-exporter/0.log"
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.303809 4816 generic.go:334] "Generic (PLEG): container finished" podID="91cdfd54-2ee7-490e-bf3f-563406e59cda" containerID="be5c0e05e1987846058e7b0cb0a3139e1568599a10f5067e16f3de74b6995fb8" exitCode=2
Mar 11 12:22:20 crc kubenswrapper[4816]: E0311 12:22:20.474838 4816 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found
Mar 11 12:22:20 crc kubenswrapper[4816]: E0311 12:22:20.474928 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-config-data podName:26aea2df-f497-478d-b953-060189ef2569 nodeName:}" failed. No retries permitted until 2026-03-11 12:22:22.474902089 +0000 UTC m=+1429.066166056 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-config-data") pod "rabbitmq-server-0" (UID: "26aea2df-f497-478d-b953-060189ef2569") : configmap "rabbitmq-config-data" not found
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.521624 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="fe419fb1-1901-4fd4-9d9c-8884651e3ad9" containerName="ovsdbserver-sb" containerID="cri-o://ee8f2b910a2d52b32d76649fbccb57d3440b0a1d624504112ddbe71af6ca7889" gracePeriod=300
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.615312 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2772ef82-fe14-4f4d-8349-8ee515e39979" path="/var/lib/kubelet/pods/2772ef82-fe14-4f4d-8349-8ee515e39979/volumes"
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.616559 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27a1317c-41a6-4589-949b-e422d7fe8837" path="/var/lib/kubelet/pods/27a1317c-41a6-4589-949b-e422d7fe8837/volumes"
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.650350 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="288dd774-6e04-45d2-b786-c7f2be7fbeae" path="/var/lib/kubelet/pods/288dd774-6e04-45d2-b786-c7f2be7fbeae/volumes"
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.654523 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35fe8af0-2f02-4d81-ae03-9d399900494c" path="/var/lib/kubelet/pods/35fe8af0-2f02-4d81-ae03-9d399900494c/volumes"
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.655692 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ae20611-891b-49ee-b5b8-0dad8af80906" path="/var/lib/kubelet/pods/3ae20611-891b-49ee-b5b8-0dad8af80906/volumes"
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.670856 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c643aa04-ce8d-4c3b-befc-ecdf63e35de8" path="/var/lib/kubelet/pods/c643aa04-ce8d-4c3b-befc-ecdf63e35de8/volumes"
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.673918 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e82cb42a-5dbf-43d1-a71c-18b3e6d252d6" path="/var/lib/kubelet/pods/e82cb42a-5dbf-43d1-a71c-18b3e6d252d6/volumes"
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.674753 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee11077d-39aa-44c4-9cf3-a8a80647bc50" path="/var/lib/kubelet/pods/ee11077d-39aa-44c4-9cf3-a8a80647bc50/volumes"
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.675470 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f92c8acc-1a4a-4f28-a123-2f5b8b6905af" path="/var/lib/kubelet/pods/f92c8acc-1a4a-4f28-a123-2f5b8b6905af/volumes"
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.688540 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fee1eb20-6fbe-4e59-a434-54c2e8a6165d" path="/var/lib/kubelet/pods/fee1eb20-6fbe-4e59-a434-54c2e8a6165d/volumes"
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.689327 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-9nggr"]
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.689358 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4bcf-account-create-update-nv5hk" event={"ID":"b1dd25da-51d6-45f0-b70c-f1baa17d2da3","Type":"ContainerStarted","Data":"1d267594122bbe4cf05c9b26645399ea847bc2099ad10ee8bb693c8e2675f8e5"}
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.689825 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-r8xbm" event={"ID":"91cdfd54-2ee7-490e-bf3f-563406e59cda","Type":"ContainerDied","Data":"be5c0e05e1987846058e7b0cb0a3139e1568599a10f5067e16f3de74b6995fb8"}
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.689848 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-9nggr"]
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.689929 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.689952 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.689967 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-377b-account-create-update-gb4b2"]
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.689978 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-n98v5"]
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.689990 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-n98v5"]
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.690001 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fdb8f6449-7h7r8"]
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.690017 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-377b-account-create-update-gb4b2"]
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.690029 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6867c6dbc5-lzgfd"]
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.690046 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.690059 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5ffd6fb588-7hftz"]
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.690105 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"]
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.690120 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-wsfdf"]
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.690130 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-wsfdf"]
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.690140 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-xm9d9"]
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.690149 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-xm9d9"]
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.690160 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-4bcf-account-create-update-nv5hk"]
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.690170 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-qt9tz"]
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.690180 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-qt9tz"]
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.690204 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-6c5b6658f-tdgsh"]
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.690468 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-6c5b6658f-tdgsh" podUID="3e6d90d2-e7e3-4245-b3a6-042621e01a67" containerName="proxy-httpd" containerID="cri-o://ea5c353eabccdde33e08d88c70444e4944a8f2019a7db074ae615e6ef96ee3ff" gracePeriod=30
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.690924 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="594ad696-b727-4153-979f-d32ccdc1fe83" containerName="cinder-scheduler" containerID="cri-o://b969ee005f965c2a4f02537599e354572cbc91b2ebbe38115a382a8ec4f6b2ac" gracePeriod=30
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.691114 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" podUID="32a279c7-00a8-4e98-8356-91e219416a22" containerName="dnsmasq-dns" containerID="cri-o://373cac1249bba137b237fe973a3b7880bfcca6318c8db162f6ca4526fa918835" gracePeriod=10
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.692315 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="1c94c19c-3ccb-43cc-ab41-92baa3141f73" containerName="cinder-api-log" containerID="cri-o://691c4f9d45de04f6bb32f82d9d22154b130edce7e7b8b75479f100df834dbbad" gracePeriod=30
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.692469 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-5ffd6fb588-7hftz" podUID="7bd939d8-3b22-4496-acea-ac527f3e5149" containerName="placement-log" containerID="cri-o://3acd68e155620ecc4260fb5ba2dfe8af8d211b5066fc4c67c7f8658e47beb43f" gracePeriod=30
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.692879 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="account-server" containerID="cri-o://e84af5bcfa14831e3963b52fe73c49f1f89ea652b5b69cd65dfb4008756c4c2d" gracePeriod=30
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.693285 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-6c5b6658f-tdgsh" podUID="3e6d90d2-e7e3-4245-b3a6-042621e01a67" containerName="proxy-server" containerID="cri-o://526e39d56a3ef06aabde599a52928183d785fb0defd865027d97973b83934000" gracePeriod=30
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.693358 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="e16e7d30-3235-44f2-81b4-c0c828071bbb" containerName="openstack-network-exporter" containerID="cri-o://5e227ce28f5de77017097c97e0a28037dfd14090da88c0fa20d1f53e10f8268b" gracePeriod=300
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.693415 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="594ad696-b727-4153-979f-d32ccdc1fe83" containerName="probe" containerID="cri-o://4e77bdf5f0e95052069948c26d832a542a6227e380d1cfa3a0483957659bccc8" gracePeriod=30
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.693510 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="object-expirer" containerID="cri-o://cfd9d9bff16dc1b372451554525cfb877c302a88a1df111a3dec64d0abe2d5dd" gracePeriod=30
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.693575 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="container-auditor" containerID="cri-o://a0947e58e27e62d7256e48c3a5ba6d36f58462add34b2f1281e8c3da0f4574e1" gracePeriod=30
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.693641 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="object-updater" containerID="cri-o://1a5cccec1988de28bd2809ac4b5b0048b290948debe0553adf7fdb6a721fdf61" gracePeriod=30
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.693656 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="object-auditor" containerID="cri-o://25b01234ff673f68a2a7d9f83db659ac9f58778b1b6460a3b9e17bc11c9e8477" gracePeriod=30
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.693669 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="object-replicator" containerID="cri-o://3b5ce1950c94241c7d8db075f74a9e25d16f22897d67828fc597eed2fd2ba2d4" gracePeriod=30
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.693682 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="object-server" containerID="cri-o://fbc40b5edb4819684be613e55b321d899bc5b2698e897cf6eda8f15eae8281db" gracePeriod=30
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.693694 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="container-updater" containerID="cri-o://bed6744a1fb9636a9fc4ea915948476f2eb984fea2bdb9d698c12e5780346190" gracePeriod=30
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.693710 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="account-reaper" containerID="cri-o://712d42df455f320b81d9b4c5385e08e78c8fffd9af1f0f1a30be961c52606280" gracePeriod=30
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.693723 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="container-replicator" containerID="cri-o://424b40cca785fdb6cef5ca70bab8c7fb8928ab75e5bb80b8b1faf2c2da22fdaf" gracePeriod=30
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.693737 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="container-server" containerID="cri-o://9aa4725cabfa8b52948323edfacbce1db8fbe4349baf7e60df04631c4c07e000" gracePeriod=30
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.693800 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6867c6dbc5-lzgfd" podUID="e6833f8f-2414-42cd-b7c2-4d4a70fd8d46" containerName="neutron-httpd" containerID="cri-o://d70e65be881ec74becf9f1d8a8c457e2fd9c5cbaed1d9869af0f09ff05b4fe7d" gracePeriod=30
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.693812 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6867c6dbc5-lzgfd" podUID="e6833f8f-2414-42cd-b7c2-4d4a70fd8d46" containerName="neutron-api" containerID="cri-o://fc6e871db4cf3ccf1c16a2df0831b957437d80b5ab1f40dfb74553759defd035" gracePeriod=30
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.693833 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="1c94c19c-3ccb-43cc-ab41-92baa3141f73" containerName="cinder-api" containerID="cri-o://c04dc0a2663851eac8a9c1faccfd79cf6c27fbce470c4ad0b7499358caea8a06" gracePeriod=30
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.693854 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="swift-recon-cron" containerID="cri-o://4ca46893f461e4cae0bfdd754a912325d0a4b5274975f49336f5fe227e8b6f7e" gracePeriod=30
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.693887 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-5ffd6fb588-7hftz" podUID="7bd939d8-3b22-4496-acea-ac527f3e5149" containerName="placement-api" containerID="cri-o://6309388e250c5434fd6b39ddce96cacd594c9880dd57d2c9e89074cac30a961b" gracePeriod=30
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.693907 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="account-replicator" containerID="cri-o://d0ccfde3e8badc0e6b92993021ad07fe9ae8e33939c137e6eb3bcf22e04b1ea6" gracePeriod=30
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.693934 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="account-auditor" containerID="cri-o://acad9fd17d268a24643ea62be228693020bbd2da3c63a2bc6d162877b0366898" gracePeriod=30
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.694208 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="rsync" containerID="cri-o://68827d94971fb4946739508c0c2229d08412fbe98f89ce92ce344232eb5179c2" gracePeriod=30
Mar 11 12:22:20 crc kubenswrapper[4816]: E0311 12:22:20.702450 4816 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 11 12:22:20 crc kubenswrapper[4816]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash
Mar 11 12:22:20 crc kubenswrapper[4816]:
Mar 11 12:22:20 crc kubenswrapper[4816]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh
Mar 11 12:22:20 crc kubenswrapper[4816]:
Mar 11 12:22:20 crc kubenswrapper[4816]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."}
Mar 11 12:22:20 crc kubenswrapper[4816]:
Mar 11 12:22:20 crc kubenswrapper[4816]: MYSQL_CMD="mysql -h -u root -P 3306"
Mar 11 12:22:20 crc kubenswrapper[4816]:
Mar 11 12:22:20 crc kubenswrapper[4816]: if [ -n "" ]; then
Mar 11 12:22:20 crc kubenswrapper[4816]: GRANT_DATABASE=""
Mar 11 12:22:20 crc kubenswrapper[4816]: else
Mar 11 12:22:20 crc kubenswrapper[4816]: GRANT_DATABASE="*"
Mar 11 12:22:20 crc kubenswrapper[4816]: fi
Mar 11 12:22:20 crc kubenswrapper[4816]:
Mar 11 12:22:20 crc kubenswrapper[4816]: # going for maximum compatibility here:
Mar 11 12:22:20 crc kubenswrapper[4816]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used
Mar 11 12:22:20 crc kubenswrapper[4816]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not
Mar 11 12:22:20 crc kubenswrapper[4816]: # 3. create user with CREATE but then do all password and TLS with ALTER to
Mar 11 12:22:20 crc kubenswrapper[4816]: # support updates
Mar 11 12:22:20 crc kubenswrapper[4816]:
Mar 11 12:22:20 crc kubenswrapper[4816]: $MYSQL_CMD < logger="UnhandledError"
Mar 11 12:22:20 crc kubenswrapper[4816]: E0311 12:22:20.704540 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack/root-account-create-update-snf5b" podUID="3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb"
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.709220 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-l5lds"]
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.739240 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-l5lds"]
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.749694 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.750016 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7457f2db-7979-4d92-bd90-a1464b8a3878" containerName="glance-log" containerID="cri-o://c020c8caff09b112c5e61167611361a425a1b4a92367fbbd7dbf97390e021cca" gracePeriod=30
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.750231 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7457f2db-7979-4d92-bd90-a1464b8a3878" containerName="glance-httpd" containerID="cri-o://8ba3c9d212f5a9f10887e454eabe42340558258c07c8285eb982b69803aa3749" gracePeriod=30
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.787508 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.796266 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-855897fd55-t7sfb"]
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.796617 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-855897fd55-t7sfb" podUID="b79e89c6-5f56-4439-ad63-a86259d4ed29" containerName="barbican-worker-log" containerID="cri-o://f675def681ebf7bc955ad7437f5bae6532f22f4db4a832aa48a182650e749af2" gracePeriod=30
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.797174 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-855897fd55-t7sfb" podUID="b79e89c6-5f56-4439-ad63-a86259d4ed29" containerName="barbican-worker" containerID="cri-o://4e741a528a024acf7a27b5a7253bef28cff4a22ea41c625ba24158e8c7be76eb" gracePeriod=30
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.802735 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-tnhfq" podUID="edc01aa4-013d-4d10-9f22-e5f319e6c1a3" containerName="ovs-vswitchd" containerID="cri-o://9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005" gracePeriod=29
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.806986 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.807372 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e95ddca0-76d0-4dce-9983-4b07655adc25" containerName="glance-log" containerID="cri-o://c98a4983c1c555c8104fb916b00cb391571c199b1e9301413191c24c4a358d25" gracePeriod=30
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.807959 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e95ddca0-76d0-4dce-9983-4b07655adc25" containerName="glance-httpd" containerID="cri-o://9dfd5d9de37a643541d7d99bf2ad8ffbb190d4d99b4400e1d3e559828813b764" gracePeriod=30
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.824043 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.824515 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7d73d9d0-5632-47a3-93e0-899f64f51011" containerName="nova-metadata-log" containerID="cri-o://494b7c934e67413331c33cbc35a1ab84e1195c496bffebeeb4ea4a3917bff191" gracePeriod=30
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.824977 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7d73d9d0-5632-47a3-93e0-899f64f51011" containerName="nova-metadata-metadata" containerID="cri-o://05538dd985ad20fb55582d69a35b969743ae902043cfc0d0fe6e1bf963056eb2" gracePeriod=30
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.845951 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-64b59f8d4-2vxd9"]
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.846330 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-64b59f8d4-2vxd9" podUID="7795071e-2de0-43cb-b225-cfed54570d94" containerName="barbican-api-log" containerID="cri-o://8dc2306ac32e5d795143d562064f5d8e129c4815490ca1bada6d8509ddcc5240" gracePeriod=30
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.846841 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-64b59f8d4-2vxd9" podUID="7795071e-2de0-43cb-b225-cfed54570d94" containerName="barbican-api" containerID="cri-o://5e19f1840cfd8f7623e64404579f814579ee6602ca765f964613a90342b26cc2" gracePeriod=30
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.857488 4816
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="e16e7d30-3235-44f2-81b4-c0c828071bbb" containerName="ovsdbserver-nb" containerID="cri-o://e36d52352569b57940dd2cebcd565fb31e6c049d444d2da7c54f0fe9d882c7f6" gracePeriod=300 Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.881122 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-4bcf-account-create-update-nv5hk"] Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.911887 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-59b4f4d478-5b797"] Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.912222 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-59b4f4d478-5b797" podUID="ddd535a1-7585-4cb7-94ec-f4b98b10be4a" containerName="barbican-keystone-listener-log" containerID="cri-o://5cfae0145ad988b78f57674ae7aa14b5835657d9dac7b0c977c144c0d4304d85" gracePeriod=30 Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.912794 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-59b4f4d478-5b797" podUID="ddd535a1-7585-4cb7-94ec-f4b98b10be4a" containerName="barbican-keystone-listener" containerID="cri-o://a2fe652a36263402ff94fa1d4ec821be087bc6255f2da08fbe025571394de207" gracePeriod=30 Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.925521 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.933365 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d28745d2-082d-4c99-90f0-b6c4696fb1a2" containerName="nova-api-log" containerID="cri-o://9fc317ca9311d71a32a61a06236255eddc3a32782036027513c2583e902eb2de" gracePeriod=30 Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.934292 4816 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d28745d2-082d-4c99-90f0-b6c4696fb1a2" containerName="nova-api-api" containerID="cri-o://f7560f8d6f98f14204afbbce69a7ff86d5f07a2d1a84e68d20701b7c7e5ce84d" gracePeriod=30 Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.949267 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-cnlpc"] Mar 11 12:22:20 crc kubenswrapper[4816]: E0311 12:22:20.969579 4816 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 11 12:22:20 crc kubenswrapper[4816]: E0311 12:22:20.969651 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-config-data podName:3779c0f5-9084-4c07-83d9-fe2017559f7b nodeName:}" failed. No retries permitted until 2026-03-11 12:22:21.469631796 +0000 UTC m=+1428.060895753 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-config-data") pod "rabbitmq-cell1-server-0" (UID: "3779c0f5-9084-4c07-83d9-fe2017559f7b") : configmap "rabbitmq-cell1-config-data" not found Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.987819 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-cnlpc"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.017738 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-4a8d-account-create-update-2lrkx"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.035369 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-4z7mr"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.083678 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-zv62x"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.102318 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-4z7mr"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.134370 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-zv62x"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.176613 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-tnhfq" podUID="edc01aa4-013d-4d10-9f22-e5f319e6c1a3" containerName="ovsdb-server" containerID="cri-o://e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c" gracePeriod=29 Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.176706 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-txccq"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.199636 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.212872 4816 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-r8xbm_91cdfd54-2ee7-490e-bf3f-563406e59cda/openstack-network-exporter/0.log" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.212964 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-r8xbm" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.219629 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-txccq"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.240239 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.266559 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.266825 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="fd796be0-d1ac-47be-8162-3b1c42febc0a" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://de21378c0051d3ac4940fe242c0e851f880805f3d01edc4d6ef2444f52ded95e" gracePeriod=30 Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.287652 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/91cdfd54-2ee7-490e-bf3f-563406e59cda-metrics-certs-tls-certs\") pod \"91cdfd54-2ee7-490e-bf3f-563406e59cda\" (UID: \"91cdfd54-2ee7-490e-bf3f-563406e59cda\") " Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.287707 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/91cdfd54-2ee7-490e-bf3f-563406e59cda-ovn-rundir\") pod \"91cdfd54-2ee7-490e-bf3f-563406e59cda\" (UID: \"91cdfd54-2ee7-490e-bf3f-563406e59cda\") " Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.287787 4816 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91cdfd54-2ee7-490e-bf3f-563406e59cda-combined-ca-bundle\") pod \"91cdfd54-2ee7-490e-bf3f-563406e59cda\" (UID: \"91cdfd54-2ee7-490e-bf3f-563406e59cda\") " Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.287855 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lq9x\" (UniqueName: \"kubernetes.io/projected/91cdfd54-2ee7-490e-bf3f-563406e59cda-kube-api-access-7lq9x\") pod \"91cdfd54-2ee7-490e-bf3f-563406e59cda\" (UID: \"91cdfd54-2ee7-490e-bf3f-563406e59cda\") " Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.287918 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91cdfd54-2ee7-490e-bf3f-563406e59cda-config\") pod \"91cdfd54-2ee7-490e-bf3f-563406e59cda\" (UID: \"91cdfd54-2ee7-490e-bf3f-563406e59cda\") " Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.288106 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/91cdfd54-2ee7-490e-bf3f-563406e59cda-ovs-rundir\") pod \"91cdfd54-2ee7-490e-bf3f-563406e59cda\" (UID: \"91cdfd54-2ee7-490e-bf3f-563406e59cda\") " Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.289421 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/91cdfd54-2ee7-490e-bf3f-563406e59cda-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "91cdfd54-2ee7-490e-bf3f-563406e59cda" (UID: "91cdfd54-2ee7-490e-bf3f-563406e59cda"). InnerVolumeSpecName "ovs-rundir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.290037 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91cdfd54-2ee7-490e-bf3f-563406e59cda-config" (OuterVolumeSpecName: "config") pod "91cdfd54-2ee7-490e-bf3f-563406e59cda" (UID: "91cdfd54-2ee7-490e-bf3f-563406e59cda"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.290806 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/91cdfd54-2ee7-490e-bf3f-563406e59cda-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "91cdfd54-2ee7-490e-bf3f-563406e59cda" (UID: "91cdfd54-2ee7-490e-bf3f-563406e59cda"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.293997 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91cdfd54-2ee7-490e-bf3f-563406e59cda-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.294021 4816 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/91cdfd54-2ee7-490e-bf3f-563406e59cda-ovs-rundir\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.294034 4816 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/91cdfd54-2ee7-490e-bf3f-563406e59cda-ovn-rundir\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.319807 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-4kpfn"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.336089 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/neutron-963f-account-create-update-9hnkv"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.351984 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-4kpfn"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.364877 4816 generic.go:334] "Generic (PLEG): container finished" podID="7457f2db-7979-4d92-bd90-a1464b8a3878" containerID="c020c8caff09b112c5e61167611361a425a1b4a92367fbbd7dbf97390e021cca" exitCode=143 Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.365059 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7457f2db-7979-4d92-bd90-a1464b8a3878","Type":"ContainerDied","Data":"c020c8caff09b112c5e61167611361a425a1b4a92367fbbd7dbf97390e021cca"} Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.368774 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-85nd9"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.373875 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91cdfd54-2ee7-490e-bf3f-563406e59cda-kube-api-access-7lq9x" (OuterVolumeSpecName: "kube-api-access-7lq9x") pod "91cdfd54-2ee7-490e-bf3f-563406e59cda" (UID: "91cdfd54-2ee7-490e-bf3f-563406e59cda"). InnerVolumeSpecName "kube-api-access-7lq9x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.381574 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-3c3c-account-create-update-2whdq"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.388918 4816 generic.go:334] "Generic (PLEG): container finished" podID="7795071e-2de0-43cb-b225-cfed54570d94" containerID="8dc2306ac32e5d795143d562064f5d8e129c4815490ca1bada6d8509ddcc5240" exitCode=143 Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.389039 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-64b59f8d4-2vxd9" event={"ID":"7795071e-2de0-43cb-b225-cfed54570d94","Type":"ContainerDied","Data":"8dc2306ac32e5d795143d562064f5d8e129c4815490ca1bada6d8509ddcc5240"} Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.390489 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-3c3c-account-create-update-2whdq"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.396776 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lq9x\" (UniqueName: \"kubernetes.io/projected/91cdfd54-2ee7-490e-bf3f-563406e59cda-kube-api-access-7lq9x\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.398872 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-85nd9"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.400448 4816 generic.go:334] "Generic (PLEG): container finished" podID="1c94c19c-3ccb-43cc-ab41-92baa3141f73" containerID="691c4f9d45de04f6bb32f82d9d22154b130edce7e7b8b75479f100df834dbbad" exitCode=143 Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.400542 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1c94c19c-3ccb-43cc-ab41-92baa3141f73","Type":"ContainerDied","Data":"691c4f9d45de04f6bb32f82d9d22154b130edce7e7b8b75479f100df834dbbad"} Mar 11 12:22:21 crc 
kubenswrapper[4816]: I0311 12:22:21.406438 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91cdfd54-2ee7-490e-bf3f-563406e59cda-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "91cdfd54-2ee7-490e-bf3f-563406e59cda" (UID: "91cdfd54-2ee7-490e-bf3f-563406e59cda"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.406716 4816 generic.go:334] "Generic (PLEG): container finished" podID="32a279c7-00a8-4e98-8356-91e219416a22" containerID="373cac1249bba137b237fe973a3b7880bfcca6318c8db162f6ca4526fa918835" exitCode=0 Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.406910 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" event={"ID":"32a279c7-00a8-4e98-8356-91e219416a22","Type":"ContainerDied","Data":"373cac1249bba137b237fe973a3b7880bfcca6318c8db162f6ca4526fa918835"} Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.428435 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="26aea2df-f497-478d-b953-060189ef2569" containerName="rabbitmq" containerID="cri-o://0735cf7e4268f5297289dcfc433ce805028b2098230211ba63ceb121fac25ec7" gracePeriod=604800 Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.428882 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-snf5b"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.450805 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e16e7d30-3235-44f2-81b4-c0c828071bbb/ovsdbserver-nb/0.log" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.450884 4816 generic.go:334] "Generic (PLEG): container finished" podID="e16e7d30-3235-44f2-81b4-c0c828071bbb" containerID="5e227ce28f5de77017097c97e0a28037dfd14090da88c0fa20d1f53e10f8268b" exitCode=2 Mar 11 12:22:21 
crc kubenswrapper[4816]: I0311 12:22:21.450918 4816 generic.go:334] "Generic (PLEG): container finished" podID="e16e7d30-3235-44f2-81b4-c0c828071bbb" containerID="e36d52352569b57940dd2cebcd565fb31e6c049d444d2da7c54f0fe9d882c7f6" exitCode=143 Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.451049 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e16e7d30-3235-44f2-81b4-c0c828071bbb","Type":"ContainerDied","Data":"5e227ce28f5de77017097c97e0a28037dfd14090da88c0fa20d1f53e10f8268b"} Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.451092 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e16e7d30-3235-44f2-81b4-c0c828071bbb","Type":"ContainerDied","Data":"e36d52352569b57940dd2cebcd565fb31e6c049d444d2da7c54f0fe9d882c7f6"} Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.465334 4816 generic.go:334] "Generic (PLEG): container finished" podID="e95ddca0-76d0-4dce-9983-4b07655adc25" containerID="c98a4983c1c555c8104fb916b00cb391571c199b1e9301413191c24c4a358d25" exitCode=143 Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.465486 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e95ddca0-76d0-4dce-9983-4b07655adc25","Type":"ContainerDied","Data":"c98a4983c1c555c8104fb916b00cb391571c199b1e9301413191c24c4a358d25"} Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.538445 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91cdfd54-2ee7-490e-bf3f-563406e59cda-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:21 crc kubenswrapper[4816]: E0311 12:22:21.538609 4816 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 11 12:22:21 crc kubenswrapper[4816]: E0311 12:22:21.538677 4816 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-config-data podName:3779c0f5-9084-4c07-83d9-fe2017559f7b nodeName:}" failed. No retries permitted until 2026-03-11 12:22:22.538656857 +0000 UTC m=+1429.129920824 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-config-data") pod "rabbitmq-cell1-server-0" (UID: "3779c0f5-9084-4c07-83d9-fe2017559f7b") : configmap "rabbitmq-cell1-config-data" not found Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.634281 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-snf5b" event={"ID":"3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb","Type":"ContainerStarted","Data":"37dc46cbca9b814e026266eb10b0888ee0b98d2b5a77de8a934c3e1d5742969a"} Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.652600 4816 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/root-account-create-update-snf5b" secret="" err="secret \"galera-openstack-cell1-dockercfg-n5gxr\" not found" Mar 11 12:22:21 crc kubenswrapper[4816]: E0311 12:22:21.654392 4816 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 11 12:22:21 crc kubenswrapper[4816]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 11 12:22:21 crc kubenswrapper[4816]: Mar 11 12:22:21 crc kubenswrapper[4816]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 11 12:22:21 crc kubenswrapper[4816]: Mar 11 12:22:21 crc kubenswrapper[4816]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 11 12:22:21 crc kubenswrapper[4816]: Mar 11 12:22:21 crc kubenswrapper[4816]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 11 12:22:21 crc kubenswrapper[4816]: Mar 11 12:22:21 crc kubenswrapper[4816]: if [ -n "neutron" ]; then Mar 11 12:22:21 crc kubenswrapper[4816]: GRANT_DATABASE="neutron" Mar 11 12:22:21 crc kubenswrapper[4816]: else Mar 11 12:22:21 crc kubenswrapper[4816]: GRANT_DATABASE="*" Mar 11 12:22:21 crc kubenswrapper[4816]: fi Mar 11 12:22:21 crc kubenswrapper[4816]: Mar 11 12:22:21 crc kubenswrapper[4816]: # going for maximum compatibility here: Mar 11 12:22:21 crc kubenswrapper[4816]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 11 12:22:21 crc kubenswrapper[4816]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 11 12:22:21 crc kubenswrapper[4816]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 11 12:22:21 crc kubenswrapper[4816]: # support updates Mar 11 12:22:21 crc kubenswrapper[4816]: Mar 11 12:22:21 crc kubenswrapper[4816]: $MYSQL_CMD < logger="UnhandledError" Mar 11 12:22:21 crc kubenswrapper[4816]: E0311 12:22:21.656346 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"neutron-db-secret\\\" not found\"" pod="openstack/neutron-963f-account-create-update-9hnkv" podUID="5e637fcd-e45c-479c-856d-086d642af3bb" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.663301 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-snf5b"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.695026 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wdblc"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.706269 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.707175 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-59b4f4d478-5b797" event={"ID":"ddd535a1-7585-4cb7-94ec-f4b98b10be4a","Type":"ContainerDied","Data":"5cfae0145ad988b78f57674ae7aa14b5835657d9dac7b0c977c144c0d4304d85"} Mar 11 12:22:21 crc kubenswrapper[4816]: E0311 12:22:21.706855 4816 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 11 12:22:21 crc kubenswrapper[4816]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 11 12:22:21 crc kubenswrapper[4816]: Mar 11 12:22:21 crc kubenswrapper[4816]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 11 12:22:21 
crc kubenswrapper[4816]: Mar 11 12:22:21 crc kubenswrapper[4816]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 11 12:22:21 crc kubenswrapper[4816]: Mar 11 12:22:21 crc kubenswrapper[4816]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 11 12:22:21 crc kubenswrapper[4816]: Mar 11 12:22:21 crc kubenswrapper[4816]: if [ -n "" ]; then Mar 11 12:22:21 crc kubenswrapper[4816]: GRANT_DATABASE="" Mar 11 12:22:21 crc kubenswrapper[4816]: else Mar 11 12:22:21 crc kubenswrapper[4816]: GRANT_DATABASE="*" Mar 11 12:22:21 crc kubenswrapper[4816]: fi Mar 11 12:22:21 crc kubenswrapper[4816]: Mar 11 12:22:21 crc kubenswrapper[4816]: # going for maximum compatibility here: Mar 11 12:22:21 crc kubenswrapper[4816]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 11 12:22:21 crc kubenswrapper[4816]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 11 12:22:21 crc kubenswrapper[4816]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 11 12:22:21 crc kubenswrapper[4816]: # support updates Mar 11 12:22:21 crc kubenswrapper[4816]: Mar 11 12:22:21 crc kubenswrapper[4816]: $MYSQL_CMD < logger="UnhandledError" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.706663 4816 generic.go:334] "Generic (PLEG): container finished" podID="ddd535a1-7585-4cb7-94ec-f4b98b10be4a" containerID="5cfae0145ad988b78f57674ae7aa14b5835657d9dac7b0c977c144c0d4304d85" exitCode=143 Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.707576 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="63567eba-cc2a-4168-9e81-51c1daed5482" containerName="nova-cell1-conductor-conductor" containerID="cri-o://adc85e912176222f128333dea158980c88ef84553f1cf56cb52f64a7b64c83d6" gracePeriod=30 Mar 11 12:22:21 crc kubenswrapper[4816]: E0311 12:22:21.708321 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack/root-account-create-update-snf5b" podUID="3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.729338 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wdblc"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.730167 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91cdfd54-2ee7-490e-bf3f-563406e59cda-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "91cdfd54-2ee7-490e-bf3f-563406e59cda" (UID: "91cdfd54-2ee7-490e-bf3f-563406e59cda"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.737099 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-r8xbm_91cdfd54-2ee7-490e-bf3f-563406e59cda/openstack-network-exporter/0.log" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.746585 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-r2t5s"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.754238 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-r8xbm" event={"ID":"91cdfd54-2ee7-490e-bf3f-563406e59cda","Type":"ContainerDied","Data":"f248cb3d03b08e499e3214d91a64d19cf5108c7a76d1c30f73bf2b55bdc66e0a"} Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.754328 4816 scope.go:117] "RemoveContainer" containerID="be5c0e05e1987846058e7b0cb0a3139e1568599a10f5067e16f3de74b6995fb8" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.754553 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-r8xbm" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.759643 4816 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/91cdfd54-2ee7-490e-bf3f-563406e59cda-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:21 crc kubenswrapper[4816]: E0311 12:22:21.759782 4816 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Mar 11 12:22:21 crc kubenswrapper[4816]: E0311 12:22:21.761498 4816 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 11 12:22:21 crc kubenswrapper[4816]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 11 12:22:21 crc kubenswrapper[4816]: Mar 11 12:22:21 crc kubenswrapper[4816]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 11 12:22:21 crc kubenswrapper[4816]: Mar 11 12:22:21 crc kubenswrapper[4816]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 11 12:22:21 crc kubenswrapper[4816]: Mar 11 12:22:21 crc kubenswrapper[4816]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 11 12:22:21 crc kubenswrapper[4816]: Mar 11 12:22:21 crc kubenswrapper[4816]: if [ -n "barbican" ]; then Mar 11 12:22:21 crc kubenswrapper[4816]: GRANT_DATABASE="barbican" Mar 11 12:22:21 crc kubenswrapper[4816]: else Mar 11 12:22:21 crc kubenswrapper[4816]: GRANT_DATABASE="*" Mar 11 12:22:21 crc kubenswrapper[4816]: fi Mar 11 12:22:21 crc kubenswrapper[4816]: Mar 11 12:22:21 crc kubenswrapper[4816]: # going for maximum compatibility here: Mar 11 12:22:21 crc kubenswrapper[4816]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 11 12:22:21 crc kubenswrapper[4816]: # 2. 
MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 11 12:22:21 crc kubenswrapper[4816]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 11 12:22:21 crc kubenswrapper[4816]: # support updates Mar 11 12:22:21 crc kubenswrapper[4816]: Mar 11 12:22:21 crc kubenswrapper[4816]: $MYSQL_CMD < logger="UnhandledError" Mar 11 12:22:21 crc kubenswrapper[4816]: E0311 12:22:21.762821 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack/barbican-4a8d-account-create-update-2lrkx" podUID="c47c9b57-0735-415f-a1a1-4b3096e3fbcf" Mar 11 12:22:21 crc kubenswrapper[4816]: E0311 12:22:21.763016 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb-operator-scripts podName:3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb nodeName:}" failed. No retries permitted until 2026-03-11 12:22:22.262983078 +0000 UTC m=+1428.854247055 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb-operator-scripts") pod "root-account-create-update-snf5b" (UID: "3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb") : configmap "openstack-cell1-scripts" not found Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.763497 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.765972 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc" containerName="nova-cell0-conductor-conductor" containerID="cri-o://4e6b0cc9909a80ea9f6820967069b2707e5bf48017858f5840e01461de16f0c2" gracePeriod=30 Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.776335 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-r2t5s"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.788496 4816 generic.go:334] "Generic (PLEG): container finished" podID="7d73d9d0-5632-47a3-93e0-899f64f51011" containerID="494b7c934e67413331c33cbc35a1ab84e1195c496bffebeeb4ea4a3917bff191" exitCode=143 Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.788658 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7d73d9d0-5632-47a3-93e0-899f64f51011","Type":"ContainerDied","Data":"494b7c934e67413331c33cbc35a1ab84e1195c496bffebeeb4ea4a3917bff191"} Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.804644 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.815606 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.815848 4816 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-scheduler-0" podUID="41f4b502-b85f-488c-b55b-27a31479df68" containerName="nova-scheduler-scheduler" containerID="cri-o://60b94a07b73cb13c7f413f3784714ffd08edfbf819bae0acb651dd949e911744" gracePeriod=30 Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.819355 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-963f-account-create-update-9hnkv"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.831494 4816 generic.go:334] "Generic (PLEG): container finished" podID="2de58390-335b-40cc-8461-d931d3b22e41" containerID="b67798b7f6eede8770ea6cbb3808f928e4bdbe9cdbf08abe0db324318159dd17" exitCode=0 Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.831584 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-84rn8" event={"ID":"2de58390-335b-40cc-8461-d931d3b22e41","Type":"ContainerDied","Data":"b67798b7f6eede8770ea6cbb3808f928e4bdbe9cdbf08abe0db324318159dd17"} Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.834567 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-4a8d-account-create-update-2lrkx"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.862005 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_fe419fb1-1901-4fd4-9d9c-8884651e3ad9/ovsdbserver-sb/0.log" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.862065 4816 generic.go:334] "Generic (PLEG): container finished" podID="fe419fb1-1901-4fd4-9d9c-8884651e3ad9" containerID="4c01622c11d3f3812a2eae31ec2decc063cf1fe9d275e29cfb942cdc480ba8db" exitCode=2 Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.862091 4816 generic.go:334] "Generic (PLEG): container finished" podID="fe419fb1-1901-4fd4-9d9c-8884651e3ad9" containerID="ee8f2b910a2d52b32d76649fbccb57d3440b0a1d624504112ddbe71af6ca7889" exitCode=143 Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.862447 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovsdbserver-sb-0" event={"ID":"fe419fb1-1901-4fd4-9d9c-8884651e3ad9","Type":"ContainerDied","Data":"4c01622c11d3f3812a2eae31ec2decc063cf1fe9d275e29cfb942cdc480ba8db"} Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.862548 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"fe419fb1-1901-4fd4-9d9c-8884651e3ad9","Type":"ContainerDied","Data":"ee8f2b910a2d52b32d76649fbccb57d3440b0a1d624504112ddbe71af6ca7889"} Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.876969 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="9a22173f-147b-46ac-bb01-596fe9f12b10" containerName="galera" containerID="cri-o://08358819a244a822957b7c7153f37ef3fa2c0371fe913be221e0cf6e09e89054" gracePeriod=30 Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.878321 4816 generic.go:334] "Generic (PLEG): container finished" podID="b79e89c6-5f56-4439-ad63-a86259d4ed29" containerID="f675def681ebf7bc955ad7437f5bae6532f22f4db4a832aa48a182650e749af2" exitCode=143 Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.878491 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-855897fd55-t7sfb" event={"ID":"b79e89c6-5f56-4439-ad63-a86259d4ed29","Type":"ContainerDied","Data":"f675def681ebf7bc955ad7437f5bae6532f22f4db4a832aa48a182650e749af2"} Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.901860 4816 generic.go:334] "Generic (PLEG): container finished" podID="502b3843-8246-4715-9735-dfc0336caacb" containerID="fd6533a10f6d22b4d1d7a2a73ad8cc4591438b77aefeced48dbf3b4526cf28f0" exitCode=137 Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.905576 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-84rn8" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.910201 4816 generic.go:334] "Generic (PLEG): container finished" podID="7bd939d8-3b22-4496-acea-ac527f3e5149" containerID="3acd68e155620ecc4260fb5ba2dfe8af8d211b5066fc4c67c7f8658e47beb43f" exitCode=143 Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.910332 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5ffd6fb588-7hftz" event={"ID":"7bd939d8-3b22-4496-acea-ac527f3e5149","Type":"ContainerDied","Data":"3acd68e155620ecc4260fb5ba2dfe8af8d211b5066fc4c67c7f8658e47beb43f"} Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.925836 4816 generic.go:334] "Generic (PLEG): container finished" podID="e6833f8f-2414-42cd-b7c2-4d4a70fd8d46" containerID="d70e65be881ec74becf9f1d8a8c457e2fd9c5cbaed1d9869af0f09ff05b4fe7d" exitCode=0 Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.925946 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6867c6dbc5-lzgfd" event={"ID":"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46","Type":"ContainerDied","Data":"d70e65be881ec74becf9f1d8a8c457e2fd9c5cbaed1d9869af0f09ff05b4fe7d"} Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.938943 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.949402 4816 generic.go:334] "Generic (PLEG): container finished" podID="3e6d90d2-e7e3-4245-b3a6-042621e01a67" containerID="ea5c353eabccdde33e08d88c70444e4944a8f2019a7db074ae615e6ef96ee3ff" exitCode=0 Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.949478 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-r8xbm"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.949507 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6c5b6658f-tdgsh" event={"ID":"3e6d90d2-e7e3-4245-b3a6-042621e01a67","Type":"ContainerDied","Data":"ea5c353eabccdde33e08d88c70444e4944a8f2019a7db074ae615e6ef96ee3ff"} Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.950391 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_fe419fb1-1901-4fd4-9d9c-8884651e3ad9/ovsdbserver-sb/0.log" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.950452 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.965260 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2de58390-335b-40cc-8461-d931d3b22e41-var-log-ovn\") pod \"2de58390-335b-40cc-8461-d931d3b22e41\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.965339 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2de58390-335b-40cc-8461-d931d3b22e41-var-run\") pod \"2de58390-335b-40cc-8461-d931d3b22e41\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.965372 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2de58390-335b-40cc-8461-d931d3b22e41-ovn-controller-tls-certs\") pod \"2de58390-335b-40cc-8461-d931d3b22e41\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.965433 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpfnw\" (UniqueName: \"kubernetes.io/projected/2de58390-335b-40cc-8461-d931d3b22e41-kube-api-access-bpfnw\") pod \"2de58390-335b-40cc-8461-d931d3b22e41\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.965552 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2de58390-335b-40cc-8461-d931d3b22e41-var-run-ovn\") pod \"2de58390-335b-40cc-8461-d931d3b22e41\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.965692 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/2de58390-335b-40cc-8461-d931d3b22e41-scripts\") pod \"2de58390-335b-40cc-8461-d931d3b22e41\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.965735 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2de58390-335b-40cc-8461-d931d3b22e41-combined-ca-bundle\") pod \"2de58390-335b-40cc-8461-d931d3b22e41\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.966369 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2de58390-335b-40cc-8461-d931d3b22e41-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "2de58390-335b-40cc-8461-d931d3b22e41" (UID: "2de58390-335b-40cc-8461-d931d3b22e41"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.968539 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2de58390-335b-40cc-8461-d931d3b22e41-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "2de58390-335b-40cc-8461-d931d3b22e41" (UID: "2de58390-335b-40cc-8461-d931d3b22e41"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.968572 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.968715 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2de58390-335b-40cc-8461-d931d3b22e41-var-run" (OuterVolumeSpecName: "var-run") pod "2de58390-335b-40cc-8461-d931d3b22e41" (UID: "2de58390-335b-40cc-8461-d931d3b22e41"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.969379 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2de58390-335b-40cc-8461-d931d3b22e41-scripts" (OuterVolumeSpecName: "scripts") pod "2de58390-335b-40cc-8461-d931d3b22e41" (UID: "2de58390-335b-40cc-8461-d931d3b22e41"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.972535 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2de58390-335b-40cc-8461-d931d3b22e41-kube-api-access-bpfnw" (OuterVolumeSpecName: "kube-api-access-bpfnw") pod "2de58390-335b-40cc-8461-d931d3b22e41" (UID: "2de58390-335b-40cc-8461-d931d3b22e41"). InnerVolumeSpecName "kube-api-access-bpfnw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.990899 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-r8xbm"] Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.002498 4816 generic.go:334] "Generic (PLEG): container finished" podID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerID="cfd9d9bff16dc1b372451554525cfb877c302a88a1df111a3dec64d0abe2d5dd" exitCode=0 Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.002602 4816 generic.go:334] "Generic (PLEG): container finished" podID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerID="1a5cccec1988de28bd2809ac4b5b0048b290948debe0553adf7fdb6a721fdf61" exitCode=0 Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.002617 4816 generic.go:334] "Generic (PLEG): container finished" podID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerID="25b01234ff673f68a2a7d9f83db659ac9f58778b1b6460a3b9e17bc11c9e8477" exitCode=0 Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.002628 4816 generic.go:334] "Generic (PLEG): container finished" 
podID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerID="3b5ce1950c94241c7d8db075f74a9e25d16f22897d67828fc597eed2fd2ba2d4" exitCode=0 Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.002641 4816 generic.go:334] "Generic (PLEG): container finished" podID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerID="bed6744a1fb9636a9fc4ea915948476f2eb984fea2bdb9d698c12e5780346190" exitCode=0 Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.002653 4816 generic.go:334] "Generic (PLEG): container finished" podID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerID="a0947e58e27e62d7256e48c3a5ba6d36f58462add34b2f1281e8c3da0f4574e1" exitCode=0 Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.002662 4816 generic.go:334] "Generic (PLEG): container finished" podID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerID="424b40cca785fdb6cef5ca70bab8c7fb8928ab75e5bb80b8b1faf2c2da22fdaf" exitCode=0 Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.002671 4816 generic.go:334] "Generic (PLEG): container finished" podID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerID="712d42df455f320b81d9b4c5385e08e78c8fffd9af1f0f1a30be961c52606280" exitCode=0 Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.002680 4816 generic.go:334] "Generic (PLEG): container finished" podID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerID="acad9fd17d268a24643ea62be228693020bbd2da3c63a2bc6d162877b0366898" exitCode=0 Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.002689 4816 generic.go:334] "Generic (PLEG): container finished" podID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerID="d0ccfde3e8badc0e6b92993021ad07fe9ae8e33939c137e6eb3bcf22e04b1ea6" exitCode=0 Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.002698 4816 generic.go:334] "Generic (PLEG): container finished" podID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerID="e84af5bcfa14831e3963b52fe73c49f1f89ea652b5b69cd65dfb4008756c4c2d" exitCode=0 Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.002810 4816 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerDied","Data":"cfd9d9bff16dc1b372451554525cfb877c302a88a1df111a3dec64d0abe2d5dd"} Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.002855 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerDied","Data":"1a5cccec1988de28bd2809ac4b5b0048b290948debe0553adf7fdb6a721fdf61"} Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.002873 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerDied","Data":"25b01234ff673f68a2a7d9f83db659ac9f58778b1b6460a3b9e17bc11c9e8477"} Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.002898 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerDied","Data":"3b5ce1950c94241c7d8db075f74a9e25d16f22897d67828fc597eed2fd2ba2d4"} Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.002913 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerDied","Data":"bed6744a1fb9636a9fc4ea915948476f2eb984fea2bdb9d698c12e5780346190"} Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.002928 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerDied","Data":"a0947e58e27e62d7256e48c3a5ba6d36f58462add34b2f1281e8c3da0f4574e1"} Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.002956 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerDied","Data":"424b40cca785fdb6cef5ca70bab8c7fb8928ab75e5bb80b8b1faf2c2da22fdaf"} Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.002968 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerDied","Data":"712d42df455f320b81d9b4c5385e08e78c8fffd9af1f0f1a30be961c52606280"} Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.002980 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerDied","Data":"acad9fd17d268a24643ea62be228693020bbd2da3c63a2bc6d162877b0366898"} Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.002993 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerDied","Data":"d0ccfde3e8badc0e6b92993021ad07fe9ae8e33939c137e6eb3bcf22e04b1ea6"} Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.003008 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerDied","Data":"e84af5bcfa14831e3963b52fe73c49f1f89ea652b5b69cd65dfb4008756c4c2d"} Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.007474 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2de58390-335b-40cc-8461-d931d3b22e41-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2de58390-335b-40cc-8461-d931d3b22e41" (UID: "2de58390-335b-40cc-8461-d931d3b22e41"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.013311 4816 generic.go:334] "Generic (PLEG): container finished" podID="d28745d2-082d-4c99-90f0-b6c4696fb1a2" containerID="9fc317ca9311d71a32a61a06236255eddc3a32782036027513c2583e902eb2de" exitCode=143 Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.015036 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d28745d2-082d-4c99-90f0-b6c4696fb1a2","Type":"ContainerDied","Data":"9fc317ca9311d71a32a61a06236255eddc3a32782036027513c2583e902eb2de"} Mar 11 12:22:22 crc kubenswrapper[4816]: E0311 12:22:22.026374 4816 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 11 12:22:22 crc kubenswrapper[4816]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 11 12:22:22 crc kubenswrapper[4816]: Mar 11 12:22:22 crc kubenswrapper[4816]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 11 12:22:22 crc kubenswrapper[4816]: Mar 11 12:22:22 crc kubenswrapper[4816]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 11 12:22:22 crc kubenswrapper[4816]: Mar 11 12:22:22 crc kubenswrapper[4816]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 11 12:22:22 crc kubenswrapper[4816]: Mar 11 12:22:22 crc kubenswrapper[4816]: if [ -n "cinder" ]; then Mar 11 12:22:22 crc kubenswrapper[4816]: GRANT_DATABASE="cinder" Mar 11 12:22:22 crc kubenswrapper[4816]: else Mar 11 12:22:22 crc kubenswrapper[4816]: GRANT_DATABASE="*" Mar 11 12:22:22 crc kubenswrapper[4816]: fi Mar 11 12:22:22 crc kubenswrapper[4816]: Mar 11 12:22:22 crc kubenswrapper[4816]: # going for maximum compatibility here: Mar 11 12:22:22 crc kubenswrapper[4816]: # 1. 
MySQL 8 no longer allows implicit create user when GRANT is used Mar 11 12:22:22 crc kubenswrapper[4816]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 11 12:22:22 crc kubenswrapper[4816]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 11 12:22:22 crc kubenswrapper[4816]: # support updates Mar 11 12:22:22 crc kubenswrapper[4816]: Mar 11 12:22:22 crc kubenswrapper[4816]: $MYSQL_CMD < logger="UnhandledError" Mar 11 12:22:22 crc kubenswrapper[4816]: E0311 12:22:22.027582 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"cinder-db-secret\\\" not found\"" pod="openstack/cinder-4bcf-account-create-update-nv5hk" podUID="b1dd25da-51d6-45f0-b70c-f1baa17d2da3" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.059446 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2de58390-335b-40cc-8461-d931d3b22e41-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "2de58390-335b-40cc-8461-d931d3b22e41" (UID: "2de58390-335b-40cc-8461-d931d3b22e41"). InnerVolumeSpecName "ovn-controller-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.067188 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-ovsdbserver-nb\") pod \"32a279c7-00a8-4e98-8356-91e219416a22\" (UID: \"32a279c7-00a8-4e98-8356-91e219416a22\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.067244 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-config\") pod \"32a279c7-00a8-4e98-8356-91e219416a22\" (UID: \"32a279c7-00a8-4e98-8356-91e219416a22\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.067581 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-combined-ca-bundle\") pod \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.067626 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-metrics-certs-tls-certs\") pod \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.067693 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-config\") pod \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.067787 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-scripts\") pod \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.067818 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-ovsdbserver-sb-tls-certs\") pod \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.067874 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-ovsdbserver-sb\") pod \"32a279c7-00a8-4e98-8356-91e219416a22\" (UID: \"32a279c7-00a8-4e98-8356-91e219416a22\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.067927 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/502b3843-8246-4715-9735-dfc0336caacb-combined-ca-bundle\") pod \"502b3843-8246-4715-9735-dfc0336caacb\" (UID: \"502b3843-8246-4715-9735-dfc0336caacb\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.067995 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhzf2\" (UniqueName: \"kubernetes.io/projected/502b3843-8246-4715-9735-dfc0336caacb-kube-api-access-xhzf2\") pod \"502b3843-8246-4715-9735-dfc0336caacb\" (UID: \"502b3843-8246-4715-9735-dfc0336caacb\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.068047 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/502b3843-8246-4715-9735-dfc0336caacb-openstack-config-secret\") pod \"502b3843-8246-4715-9735-dfc0336caacb\" (UID: \"502b3843-8246-4715-9735-dfc0336caacb\") " Mar 11 
12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.068078 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/502b3843-8246-4715-9735-dfc0336caacb-openstack-config\") pod \"502b3843-8246-4715-9735-dfc0336caacb\" (UID: \"502b3843-8246-4715-9735-dfc0336caacb\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.068105 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-dns-svc\") pod \"32a279c7-00a8-4e98-8356-91e219416a22\" (UID: \"32a279c7-00a8-4e98-8356-91e219416a22\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.068167 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wmq6\" (UniqueName: \"kubernetes.io/projected/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-kube-api-access-6wmq6\") pod \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.068245 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-ovsdb-rundir\") pod \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.068314 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-dns-swift-storage-0\") pod \"32a279c7-00a8-4e98-8356-91e219416a22\" (UID: \"32a279c7-00a8-4e98-8356-91e219416a22\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.068390 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfg5z\" (UniqueName: 
\"kubernetes.io/projected/32a279c7-00a8-4e98-8356-91e219416a22-kube-api-access-jfg5z\") pod \"32a279c7-00a8-4e98-8356-91e219416a22\" (UID: \"32a279c7-00a8-4e98-8356-91e219416a22\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.068437 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.068812 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-config" (OuterVolumeSpecName: "config") pod "fe419fb1-1901-4fd4-9d9c-8884651e3ad9" (UID: "fe419fb1-1901-4fd4-9d9c-8884651e3ad9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.069083 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2de58390-335b-40cc-8461-d931d3b22e41-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.069161 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2de58390-335b-40cc-8461-d931d3b22e41-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.069220 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.069399 4816 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2de58390-335b-40cc-8461-d931d3b22e41-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:22 crc 
kubenswrapper[4816]: I0311 12:22:22.069562 4816 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2de58390-335b-40cc-8461-d931d3b22e41-var-run\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.069622 4816 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2de58390-335b-40cc-8461-d931d3b22e41-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.069688 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpfnw\" (UniqueName: \"kubernetes.io/projected/2de58390-335b-40cc-8461-d931d3b22e41-kube-api-access-bpfnw\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.069744 4816 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2de58390-335b-40cc-8461-d931d3b22e41-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.069925 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "fe419fb1-1901-4fd4-9d9c-8884651e3ad9" (UID: "fe419fb1-1901-4fd4-9d9c-8884651e3ad9"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.074471 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-scripts" (OuterVolumeSpecName: "scripts") pod "fe419fb1-1901-4fd4-9d9c-8884651e3ad9" (UID: "fe419fb1-1901-4fd4-9d9c-8884651e3ad9"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.086240 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "fe419fb1-1901-4fd4-9d9c-8884651e3ad9" (UID: "fe419fb1-1901-4fd4-9d9c-8884651e3ad9"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.086335 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/502b3843-8246-4715-9735-dfc0336caacb-kube-api-access-xhzf2" (OuterVolumeSpecName: "kube-api-access-xhzf2") pod "502b3843-8246-4715-9735-dfc0336caacb" (UID: "502b3843-8246-4715-9735-dfc0336caacb"). InnerVolumeSpecName "kube-api-access-xhzf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.086411 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-kube-api-access-6wmq6" (OuterVolumeSpecName: "kube-api-access-6wmq6") pod "fe419fb1-1901-4fd4-9d9c-8884651e3ad9" (UID: "fe419fb1-1901-4fd4-9d9c-8884651e3ad9"). InnerVolumeSpecName "kube-api-access-6wmq6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.091010 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="3779c0f5-9084-4c07-83d9-fe2017559f7b" containerName="rabbitmq" containerID="cri-o://18565c99cadc85b3c1924a92e447c85ed3ed29fe96a7b6c6961caaecc2e1cf9f" gracePeriod=604800 Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.091433 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32a279c7-00a8-4e98-8356-91e219416a22-kube-api-access-jfg5z" (OuterVolumeSpecName: "kube-api-access-jfg5z") pod "32a279c7-00a8-4e98-8356-91e219416a22" (UID: "32a279c7-00a8-4e98-8356-91e219416a22"). InnerVolumeSpecName "kube-api-access-jfg5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.124446 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "32a279c7-00a8-4e98-8356-91e219416a22" (UID: "32a279c7-00a8-4e98-8356-91e219416a22"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.146814 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ec4faaf-e219-4b01-b3b9-0d6757a38154" path="/var/lib/kubelet/pods/1ec4faaf-e219-4b01-b3b9-0d6757a38154/volumes" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.147902 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36fadc66-c846-46c0-a002-efeb7656f2b8" path="/var/lib/kubelet/pods/36fadc66-c846-46c0-a002-efeb7656f2b8/volumes" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.148932 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f519dc2-e88b-4e4b-9637-c3e172b81bfa" path="/var/lib/kubelet/pods/3f519dc2-e88b-4e4b-9637-c3e172b81bfa/volumes" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.149910 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="403fec7f-c194-4bdd-a620-34aefb5d677c" path="/var/lib/kubelet/pods/403fec7f-c194-4bdd-a620-34aefb5d677c/volumes" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.153563 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6268fe92-5c93-43c7-95bc-f30befda5d65" path="/var/lib/kubelet/pods/6268fe92-5c93-43c7-95bc-f30befda5d65/volumes" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.154639 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="632c5d32-5370-401a-8202-58e0ec70f357" path="/var/lib/kubelet/pods/632c5d32-5370-401a-8202-58e0ec70f357/volumes" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.155518 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66951176-170f-4d49-9a92-aeeb66f4a79c" path="/var/lib/kubelet/pods/66951176-170f-4d49-9a92-aeeb66f4a79c/volumes" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.157490 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c9952da-6281-45f2-8b45-30caa27b8d39" 
path="/var/lib/kubelet/pods/7c9952da-6281-45f2-8b45-30caa27b8d39/volumes" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.158322 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fca72cd-9caa-4029-8c20-1623a315702d" path="/var/lib/kubelet/pods/7fca72cd-9caa-4029-8c20-1623a315702d/volumes" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.159503 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d3e7fa1-3f66-495b-be44-cf97eec043c1" path="/var/lib/kubelet/pods/8d3e7fa1-3f66-495b-be44-cf97eec043c1/volumes" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.160979 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91cdfd54-2ee7-490e-bf3f-563406e59cda" path="/var/lib/kubelet/pods/91cdfd54-2ee7-490e-bf3f-563406e59cda/volumes" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.161859 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fd32333-bdaa-461b-ac10-324291d1e5d3" path="/var/lib/kubelet/pods/9fd32333-bdaa-461b-ac10-324291d1e5d3/volumes" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.162652 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0e0ff63-3d12-4174-9341-ceb21109e000" path="/var/lib/kubelet/pods/a0e0ff63-3d12-4174-9341-ceb21109e000/volumes" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.163990 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a25abdc0-8516-4747-a589-78db9bc64ca3" path="/var/lib/kubelet/pods/a25abdc0-8516-4747-a589-78db9bc64ca3/volumes" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.164791 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac" path="/var/lib/kubelet/pods/b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac/volumes" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.167611 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6745bae-b403-4a86-9148-8baecc00f8b1" 
path="/var/lib/kubelet/pods/b6745bae-b403-4a86-9148-8baecc00f8b1/volumes" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.168742 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac" path="/var/lib/kubelet/pods/fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac/volumes" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.171458 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.171501 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhzf2\" (UniqueName: \"kubernetes.io/projected/502b3843-8246-4715-9735-dfc0336caacb-kube-api-access-xhzf2\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.171512 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wmq6\" (UniqueName: \"kubernetes.io/projected/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-kube-api-access-6wmq6\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.171521 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.171532 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfg5z\" (UniqueName: \"kubernetes.io/projected/32a279c7-00a8-4e98-8356-91e219416a22-kube-api-access-jfg5z\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.171564 4816 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 
12:22:22.171574 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.192348 4816 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.274343 4816 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:22 crc kubenswrapper[4816]: E0311 12:22:22.274463 4816 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Mar 11 12:22:22 crc kubenswrapper[4816]: E0311 12:22:22.274728 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb-operator-scripts podName:3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb nodeName:}" failed. No retries permitted until 2026-03-11 12:22:23.274706514 +0000 UTC m=+1429.865970481 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb-operator-scripts") pod "root-account-create-update-snf5b" (UID: "3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb") : configmap "openstack-cell1-scripts" not found Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.385687 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/502b3843-8246-4715-9735-dfc0336caacb-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "502b3843-8246-4715-9735-dfc0336caacb" (UID: "502b3843-8246-4715-9735-dfc0336caacb"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.391602 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/502b3843-8246-4715-9735-dfc0336caacb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "502b3843-8246-4715-9735-dfc0336caacb" (UID: "502b3843-8246-4715-9735-dfc0336caacb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.444703 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-config" (OuterVolumeSpecName: "config") pod "32a279c7-00a8-4e98-8356-91e219416a22" (UID: "32a279c7-00a8-4e98-8356-91e219416a22"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.475805 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe419fb1-1901-4fd4-9d9c-8884651e3ad9" (UID: "fe419fb1-1901-4fd4-9d9c-8884651e3ad9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.478958 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "32a279c7-00a8-4e98-8356-91e219416a22" (UID: "32a279c7-00a8-4e98-8356-91e219416a22"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.488258 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.488294 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.488309 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/502b3843-8246-4715-9735-dfc0336caacb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.488320 4816 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/502b3843-8246-4715-9735-dfc0336caacb-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:22 crc kubenswrapper[4816]: E0311 12:22:22.491372 4816 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 11 12:22:22 crc kubenswrapper[4816]: E0311 12:22:22.495182 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-config-data podName:26aea2df-f497-478d-b953-060189ef2569 nodeName:}" failed. No retries permitted until 2026-03-11 12:22:26.495135074 +0000 UTC m=+1433.086399041 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-config-data") pod "rabbitmq-server-0" (UID: "26aea2df-f497-478d-b953-060189ef2569") : configmap "rabbitmq-config-data" not found Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.495215 4816 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.552939 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "32a279c7-00a8-4e98-8356-91e219416a22" (UID: "32a279c7-00a8-4e98-8356-91e219416a22"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.599473 4816 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:22 crc kubenswrapper[4816]: E0311 12:22:22.599558 4816 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 11 12:22:22 crc kubenswrapper[4816]: E0311 12:22:22.599612 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-config-data podName:3779c0f5-9084-4c07-83d9-fe2017559f7b nodeName:}" failed. No retries permitted until 2026-03-11 12:22:24.599594314 +0000 UTC m=+1431.190858281 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-config-data") pod "rabbitmq-cell1-server-0" (UID: "3779c0f5-9084-4c07-83d9-fe2017559f7b") : configmap "rabbitmq-cell1-config-data" not found Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.626657 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "32a279c7-00a8-4e98-8356-91e219416a22" (UID: "32a279c7-00a8-4e98-8356-91e219416a22"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.695160 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/502b3843-8246-4715-9735-dfc0336caacb-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "502b3843-8246-4715-9735-dfc0336caacb" (UID: "502b3843-8246-4715-9735-dfc0336caacb"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.704153 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.704189 4816 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/502b3843-8246-4715-9735-dfc0336caacb-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.711752 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "fe419fb1-1901-4fd4-9d9c-8884651e3ad9" (UID: "fe419fb1-1901-4fd4-9d9c-8884651e3ad9"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: E0311 12:22:22.779578 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="60b94a07b73cb13c7f413f3784714ffd08edfbf819bae0acb651dd949e911744" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 11 12:22:22 crc kubenswrapper[4816]: E0311 12:22:22.788214 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="60b94a07b73cb13c7f413f3784714ffd08edfbf819bae0acb651dd949e911744" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.788585 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "fe419fb1-1901-4fd4-9d9c-8884651e3ad9" (UID: "fe419fb1-1901-4fd4-9d9c-8884651e3ad9"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: E0311 12:22:22.800019 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="60b94a07b73cb13c7f413f3784714ffd08edfbf819bae0acb651dd949e911744" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 11 12:22:22 crc kubenswrapper[4816]: E0311 12:22:22.800162 4816 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="41f4b502-b85f-488c-b55b-27a31479df68" containerName="nova-scheduler-scheduler" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.812848 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e16e7d30-3235-44f2-81b4-c0c828071bbb/ovsdbserver-nb/0.log" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.812942 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.818688 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.823463 4816 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.823518 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.925492 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3e6d90d2-e7e3-4245-b3a6-042621e01a67-etc-swift\") pod \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.925557 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e6d90d2-e7e3-4245-b3a6-042621e01a67-combined-ca-bundle\") pod \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.925636 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e16e7d30-3235-44f2-81b4-c0c828071bbb-combined-ca-bundle\") pod \"e16e7d30-3235-44f2-81b4-c0c828071bbb\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.925740 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e16e7d30-3235-44f2-81b4-c0c828071bbb-scripts\") pod \"e16e7d30-3235-44f2-81b4-c0c828071bbb\" (UID: 
\"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.925832 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e6d90d2-e7e3-4245-b3a6-042621e01a67-log-httpd\") pod \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.925890 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e16e7d30-3235-44f2-81b4-c0c828071bbb-ovsdbserver-nb-tls-certs\") pod \"e16e7d30-3235-44f2-81b4-c0c828071bbb\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.925918 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e16e7d30-3235-44f2-81b4-c0c828071bbb-metrics-certs-tls-certs\") pod \"e16e7d30-3235-44f2-81b4-c0c828071bbb\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.925977 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e6d90d2-e7e3-4245-b3a6-042621e01a67-config-data\") pod \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.926002 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddtnh\" (UniqueName: \"kubernetes.io/projected/3e6d90d2-e7e3-4245-b3a6-042621e01a67-kube-api-access-ddtnh\") pod \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.926130 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"e16e7d30-3235-44f2-81b4-c0c828071bbb\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.926178 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e16e7d30-3235-44f2-81b4-c0c828071bbb-ovsdb-rundir\") pod \"e16e7d30-3235-44f2-81b4-c0c828071bbb\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.926205 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e6d90d2-e7e3-4245-b3a6-042621e01a67-public-tls-certs\") pod \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.926260 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e6d90d2-e7e3-4245-b3a6-042621e01a67-internal-tls-certs\") pod \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.926346 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e6d90d2-e7e3-4245-b3a6-042621e01a67-run-httpd\") pod \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.926384 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4nnr\" (UniqueName: \"kubernetes.io/projected/e16e7d30-3235-44f2-81b4-c0c828071bbb-kube-api-access-r4nnr\") pod \"e16e7d30-3235-44f2-81b4-c0c828071bbb\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " Mar 11 12:22:22 crc 
kubenswrapper[4816]: I0311 12:22:22.926412 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e16e7d30-3235-44f2-81b4-c0c828071bbb-config\") pod \"e16e7d30-3235-44f2-81b4-c0c828071bbb\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.928612 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e16e7d30-3235-44f2-81b4-c0c828071bbb-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "e16e7d30-3235-44f2-81b4-c0c828071bbb" (UID: "e16e7d30-3235-44f2-81b4-c0c828071bbb"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.928776 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e16e7d30-3235-44f2-81b4-c0c828071bbb-config" (OuterVolumeSpecName: "config") pod "e16e7d30-3235-44f2-81b4-c0c828071bbb" (UID: "e16e7d30-3235-44f2-81b4-c0c828071bbb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.929770 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e16e7d30-3235-44f2-81b4-c0c828071bbb-scripts" (OuterVolumeSpecName: "scripts") pod "e16e7d30-3235-44f2-81b4-c0c828071bbb" (UID: "e16e7d30-3235-44f2-81b4-c0c828071bbb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.933182 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e6d90d2-e7e3-4245-b3a6-042621e01a67-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3e6d90d2-e7e3-4245-b3a6-042621e01a67" (UID: "3e6d90d2-e7e3-4245-b3a6-042621e01a67"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.933782 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e6d90d2-e7e3-4245-b3a6-042621e01a67-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3e6d90d2-e7e3-4245-b3a6-042621e01a67" (UID: "3e6d90d2-e7e3-4245-b3a6-042621e01a67"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.934513 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "e16e7d30-3235-44f2-81b4-c0c828071bbb" (UID: "e16e7d30-3235-44f2-81b4-c0c828071bbb"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.937596 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e6d90d2-e7e3-4245-b3a6-042621e01a67-kube-api-access-ddtnh" (OuterVolumeSpecName: "kube-api-access-ddtnh") pod "3e6d90d2-e7e3-4245-b3a6-042621e01a67" (UID: "3e6d90d2-e7e3-4245-b3a6-042621e01a67"). InnerVolumeSpecName "kube-api-access-ddtnh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.945673 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e16e7d30-3235-44f2-81b4-c0c828071bbb-kube-api-access-r4nnr" (OuterVolumeSpecName: "kube-api-access-r4nnr") pod "e16e7d30-3235-44f2-81b4-c0c828071bbb" (UID: "e16e7d30-3235-44f2-81b4-c0c828071bbb"). InnerVolumeSpecName "kube-api-access-r4nnr". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.952636 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e6d90d2-e7e3-4245-b3a6-042621e01a67-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "3e6d90d2-e7e3-4245-b3a6-042621e01a67" (UID: "3e6d90d2-e7e3-4245-b3a6-042621e01a67"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.980368 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.984905 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e16e7d30-3235-44f2-81b4-c0c828071bbb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e16e7d30-3235-44f2-81b4-c0c828071bbb" (UID: "e16e7d30-3235-44f2-81b4-c0c828071bbb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.030464 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e16e7d30-3235-44f2-81b4-c0c828071bbb-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.030501 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e16e7d30-3235-44f2-81b4-c0c828071bbb-scripts\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.030512 4816 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e6d90d2-e7e3-4245-b3a6-042621e01a67-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.030520 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddtnh\" (UniqueName: \"kubernetes.io/projected/3e6d90d2-e7e3-4245-b3a6-042621e01a67-kube-api-access-ddtnh\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.030553 4816 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" "
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.030562 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e16e7d30-3235-44f2-81b4-c0c828071bbb-ovsdb-rundir\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.030572 4816 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e6d90d2-e7e3-4245-b3a6-042621e01a67-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.030579 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4nnr\" (UniqueName: \"kubernetes.io/projected/e16e7d30-3235-44f2-81b4-c0c828071bbb-kube-api-access-r4nnr\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.030587 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e16e7d30-3235-44f2-81b4-c0c828071bbb-config\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.030597 4816 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3e6d90d2-e7e3-4245-b3a6-042621e01a67-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.032618 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e6d90d2-e7e3-4245-b3a6-042621e01a67-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3e6d90d2-e7e3-4245-b3a6-042621e01a67" (UID: "3e6d90d2-e7e3-4245-b3a6-042621e01a67"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.061522 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e6d90d2-e7e3-4245-b3a6-042621e01a67-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e6d90d2-e7e3-4245-b3a6-042621e01a67" (UID: "3e6d90d2-e7e3-4245-b3a6-042621e01a67"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.080567 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e16e7d30-3235-44f2-81b4-c0c828071bbb-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "e16e7d30-3235-44f2-81b4-c0c828071bbb" (UID: "e16e7d30-3235-44f2-81b4-c0c828071bbb"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.082980 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-bvvkj"]
Mar 11 12:22:23 crc kubenswrapper[4816]: E0311 12:22:23.084020 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e6d90d2-e7e3-4245-b3a6-042621e01a67" containerName="proxy-httpd"
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.084041 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e6d90d2-e7e3-4245-b3a6-042621e01a67" containerName="proxy-httpd"
Mar 11 12:22:23 crc kubenswrapper[4816]: E0311 12:22:23.084064 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91cdfd54-2ee7-490e-bf3f-563406e59cda" containerName="openstack-network-exporter"
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.084071 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="91cdfd54-2ee7-490e-bf3f-563406e59cda" containerName="openstack-network-exporter"
Mar 11 12:22:23 crc kubenswrapper[4816]: E0311 12:22:23.084081 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e16e7d30-3235-44f2-81b4-c0c828071bbb" containerName="openstack-network-exporter"
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.084089 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="e16e7d30-3235-44f2-81b4-c0c828071bbb" containerName="openstack-network-exporter"
Mar 11 12:22:23 crc kubenswrapper[4816]: E0311 12:22:23.084101 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd796be0-d1ac-47be-8162-3b1c42febc0a" containerName="nova-cell1-novncproxy-novncproxy"
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.084108 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd796be0-d1ac-47be-8162-3b1c42febc0a" containerName="nova-cell1-novncproxy-novncproxy"
Mar 11 12:22:23 crc kubenswrapper[4816]: E0311 12:22:23.084123 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32a279c7-00a8-4e98-8356-91e219416a22" containerName="init"
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.084129 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="32a279c7-00a8-4e98-8356-91e219416a22" containerName="init"
Mar 11 12:22:23 crc kubenswrapper[4816]: E0311 12:22:23.084149 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2de58390-335b-40cc-8461-d931d3b22e41" containerName="ovn-controller"
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.084160 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2de58390-335b-40cc-8461-d931d3b22e41" containerName="ovn-controller"
Mar 11 12:22:23 crc kubenswrapper[4816]: E0311 12:22:23.084175 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32a279c7-00a8-4e98-8356-91e219416a22" containerName="dnsmasq-dns"
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.084183 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="32a279c7-00a8-4e98-8356-91e219416a22" containerName="dnsmasq-dns"
Mar 11 12:22:23 crc kubenswrapper[4816]: E0311 12:22:23.084198 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e16e7d30-3235-44f2-81b4-c0c828071bbb" containerName="ovsdbserver-nb"
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.084204 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="e16e7d30-3235-44f2-81b4-c0c828071bbb" containerName="ovsdbserver-nb"
Mar 11 12:22:23 crc kubenswrapper[4816]: E0311 12:22:23.084212 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe419fb1-1901-4fd4-9d9c-8884651e3ad9" containerName="openstack-network-exporter"
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.084219 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe419fb1-1901-4fd4-9d9c-8884651e3ad9" containerName="openstack-network-exporter"
Mar 11 12:22:23 crc kubenswrapper[4816]: E0311 12:22:23.084232 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e6d90d2-e7e3-4245-b3a6-042621e01a67" containerName="proxy-server"
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.084238 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e6d90d2-e7e3-4245-b3a6-042621e01a67" containerName="proxy-server"
Mar 11 12:22:23 crc kubenswrapper[4816]: E0311 12:22:23.084271 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe419fb1-1901-4fd4-9d9c-8884651e3ad9" containerName="ovsdbserver-sb"
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.084278 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe419fb1-1901-4fd4-9d9c-8884651e3ad9" containerName="ovsdbserver-sb"
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.085296 4816 generic.go:334] "Generic (PLEG): container finished" podID="3e6d90d2-e7e3-4245-b3a6-042621e01a67" containerID="526e39d56a3ef06aabde599a52928183d785fb0defd865027d97973b83934000" exitCode=0
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.085456 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6c5b6658f-tdgsh"
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.088569 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe419fb1-1901-4fd4-9d9c-8884651e3ad9" containerName="openstack-network-exporter"
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.088887 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd796be0-d1ac-47be-8162-3b1c42febc0a" containerName="nova-cell1-novncproxy-novncproxy"
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.088909 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="2de58390-335b-40cc-8461-d931d3b22e41" containerName="ovn-controller"
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.088931 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="e16e7d30-3235-44f2-81b4-c0c828071bbb" containerName="openstack-network-exporter"
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.088945 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e6d90d2-e7e3-4245-b3a6-042621e01a67" containerName="proxy-server"
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.088955 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e6d90d2-e7e3-4245-b3a6-042621e01a67" containerName="proxy-httpd"
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.088964 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe419fb1-1901-4fd4-9d9c-8884651e3ad9" containerName="ovsdbserver-sb"
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.088981 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="e16e7d30-3235-44f2-81b4-c0c828071bbb" containerName="ovsdbserver-nb"
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.088996 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="91cdfd54-2ee7-490e-bf3f-563406e59cda" containerName="openstack-network-exporter"
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.089012 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="32a279c7-00a8-4e98-8356-91e219416a22" containerName="dnsmasq-dns"
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.089212 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-84rn8"
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.090531 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6c5b6658f-tdgsh" event={"ID":"3e6d90d2-e7e3-4245-b3a6-042621e01a67","Type":"ContainerDied","Data":"526e39d56a3ef06aabde599a52928183d785fb0defd865027d97973b83934000"}
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.090565 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6c5b6658f-tdgsh" event={"ID":"3e6d90d2-e7e3-4245-b3a6-042621e01a67","Type":"ContainerDied","Data":"78085f7a145fe8f236757523ada7ae443e7b3ab85638d5063fc54c7855365882"}
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.090580 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-84rn8" event={"ID":"2de58390-335b-40cc-8461-d931d3b22e41","Type":"ContainerDied","Data":"90ffa1dacc5321713c5d44a9d616add617a25ab1efffcadfb14af28f07cc7bbd"}
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.090602 4816 scope.go:117] "RemoveContainer" containerID="526e39d56a3ef06aabde599a52928183d785fb0defd865027d97973b83934000"
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.090692 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-bvvkj"
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.093022 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.109964 4816 generic.go:334] "Generic (PLEG): container finished" podID="fd796be0-d1ac-47be-8162-3b1c42febc0a" containerID="de21378c0051d3ac4940fe242c0e851f880805f3d01edc4d6ef2444f52ded95e" exitCode=0
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.111074 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.114104 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fd796be0-d1ac-47be-8162-3b1c42febc0a","Type":"ContainerDied","Data":"de21378c0051d3ac4940fe242c0e851f880805f3d01edc4d6ef2444f52ded95e"}
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.114220 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fd796be0-d1ac-47be-8162-3b1c42febc0a","Type":"ContainerDied","Data":"73799c30d5d3ab5fe26ad3cf5939299dea4d34493e455f7bcdac484f34941957"}
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.122913 4816 scope.go:117] "RemoveContainer" containerID="ea5c353eabccdde33e08d88c70444e4944a8f2019a7db074ae615e6ef96ee3ff"
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.123734 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-4a8d-account-create-update-2lrkx" event={"ID":"c47c9b57-0735-415f-a1a1-4b3096e3fbcf","Type":"ContainerStarted","Data":"907e6d1395bfe6aa206c07c8f0ecff9b2205c70baa8db02eecf5138662995725"}
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.127359 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-bvvkj"]
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.137324 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd796be0-d1ac-47be-8162-3b1c42febc0a-nova-novncproxy-tls-certs\") pod \"fd796be0-d1ac-47be-8162-3b1c42febc0a\" (UID: \"fd796be0-d1ac-47be-8162-3b1c42febc0a\") "
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.138298 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd796be0-d1ac-47be-8162-3b1c42febc0a-combined-ca-bundle\") pod \"fd796be0-d1ac-47be-8162-3b1c42febc0a\" (UID: \"fd796be0-d1ac-47be-8162-3b1c42febc0a\") "
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.138356 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkd68\" (UniqueName: \"kubernetes.io/projected/fd796be0-d1ac-47be-8162-3b1c42febc0a-kube-api-access-jkd68\") pod \"fd796be0-d1ac-47be-8162-3b1c42febc0a\" (UID: \"fd796be0-d1ac-47be-8162-3b1c42febc0a\") "
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.138499 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd796be0-d1ac-47be-8162-3b1c42febc0a-config-data\") pod \"fd796be0-d1ac-47be-8162-3b1c42febc0a\" (UID: \"fd796be0-d1ac-47be-8162-3b1c42febc0a\") "
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.138552 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd796be0-d1ac-47be-8162-3b1c42febc0a-vencrypt-tls-certs\") pod \"fd796be0-d1ac-47be-8162-3b1c42febc0a\" (UID: \"fd796be0-d1ac-47be-8162-3b1c42febc0a\") "
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.139591 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e16e7d30-3235-44f2-81b4-c0c828071bbb-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.139634 4816 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e6d90d2-e7e3-4245-b3a6-042621e01a67-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.139644 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e6d90d2-e7e3-4245-b3a6-042621e01a67-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.140695 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-963f-account-create-update-9hnkv" event={"ID":"5e637fcd-e45c-479c-856d-086d642af3bb","Type":"ContainerStarted","Data":"4f023b2d8c3517ea66e0705887fb61a09310d77cc0b2edae0368635152923da3"}
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.166504 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd796be0-d1ac-47be-8162-3b1c42febc0a-kube-api-access-jkd68" (OuterVolumeSpecName: "kube-api-access-jkd68") pod "fd796be0-d1ac-47be-8162-3b1c42febc0a" (UID: "fd796be0-d1ac-47be-8162-3b1c42febc0a"). InnerVolumeSpecName "kube-api-access-jkd68". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.173971 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-84rn8"]
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.191259 4816 generic.go:334] "Generic (PLEG): container finished" podID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerID="68827d94971fb4946739508c0c2229d08412fbe98f89ce92ce344232eb5179c2" exitCode=0
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.191304 4816 generic.go:334] "Generic (PLEG): container finished" podID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerID="fbc40b5edb4819684be613e55b321d899bc5b2698e897cf6eda8f15eae8281db" exitCode=0
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.191318 4816 generic.go:334] "Generic (PLEG): container finished" podID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerID="9aa4725cabfa8b52948323edfacbce1db8fbe4349baf7e60df04631c4c07e000" exitCode=0
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.191428 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerDied","Data":"68827d94971fb4946739508c0c2229d08412fbe98f89ce92ce344232eb5179c2"}
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.191476 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerDied","Data":"fbc40b5edb4819684be613e55b321d899bc5b2698e897cf6eda8f15eae8281db"}
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.191504 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerDied","Data":"9aa4725cabfa8b52948323edfacbce1db8fbe4349baf7e60df04631c4c07e000"}
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.194820 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e6d90d2-e7e3-4245-b3a6-042621e01a67-config-data" (OuterVolumeSpecName: "config-data") pod "3e6d90d2-e7e3-4245-b3a6-042621e01a67" (UID: "3e6d90d2-e7e3-4245-b3a6-042621e01a67"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.211414 4816 scope.go:117] "RemoveContainer" containerID="526e39d56a3ef06aabde599a52928183d785fb0defd865027d97973b83934000"
Mar 11 12:22:23 crc kubenswrapper[4816]: E0311 12:22:23.213065 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"526e39d56a3ef06aabde599a52928183d785fb0defd865027d97973b83934000\": container with ID starting with 526e39d56a3ef06aabde599a52928183d785fb0defd865027d97973b83934000 not found: ID does not exist" containerID="526e39d56a3ef06aabde599a52928183d785fb0defd865027d97973b83934000"
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.213106 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"526e39d56a3ef06aabde599a52928183d785fb0defd865027d97973b83934000"} err="failed to get container status \"526e39d56a3ef06aabde599a52928183d785fb0defd865027d97973b83934000\": rpc error: code = NotFound desc = could not find container \"526e39d56a3ef06aabde599a52928183d785fb0defd865027d97973b83934000\": container with ID starting with 526e39d56a3ef06aabde599a52928183d785fb0defd865027d97973b83934000 not found: ID does not exist"
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.213125 4816 scope.go:117] "RemoveContainer" containerID="ea5c353eabccdde33e08d88c70444e4944a8f2019a7db074ae615e6ef96ee3ff"
Mar 11 12:22:23 crc kubenswrapper[4816]: E0311 12:22:23.220355 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea5c353eabccdde33e08d88c70444e4944a8f2019a7db074ae615e6ef96ee3ff\": container with ID starting with ea5c353eabccdde33e08d88c70444e4944a8f2019a7db074ae615e6ef96ee3ff not found: ID does not exist" containerID="ea5c353eabccdde33e08d88c70444e4944a8f2019a7db074ae615e6ef96ee3ff"
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.220387 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea5c353eabccdde33e08d88c70444e4944a8f2019a7db074ae615e6ef96ee3ff"} err="failed to get container status \"ea5c353eabccdde33e08d88c70444e4944a8f2019a7db074ae615e6ef96ee3ff\": rpc error: code = NotFound desc = could not find container \"ea5c353eabccdde33e08d88c70444e4944a8f2019a7db074ae615e6ef96ee3ff\": container with ID starting with ea5c353eabccdde33e08d88c70444e4944a8f2019a7db074ae615e6ef96ee3ff not found: ID does not exist"
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.220411 4816 scope.go:117] "RemoveContainer" containerID="b67798b7f6eede8770ea6cbb3808f928e4bdbe9cdbf08abe0db324318159dd17"
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.224558 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8"
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.224665 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" event={"ID":"32a279c7-00a8-4e98-8356-91e219416a22","Type":"ContainerDied","Data":"88f0e5edf59a2c15eb9814f01d499e770f690a88f8bf62d0decdbb14e939c9e6"}
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.241474 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e6d90d2-e7e3-4245-b3a6-042621e01a67-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3e6d90d2-e7e3-4245-b3a6-042621e01a67" (UID: "3e6d90d2-e7e3-4245-b3a6-042621e01a67"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.246813 4816 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc"
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.247190 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd796be0-d1ac-47be-8162-3b1c42febc0a-config-data" (OuterVolumeSpecName: "config-data") pod "fd796be0-d1ac-47be-8162-3b1c42febc0a" (UID: "fd796be0-d1ac-47be-8162-3b1c42febc0a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.250023 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd796be0-d1ac-47be-8162-3b1c42febc0a-config-data\") pod \"fd796be0-d1ac-47be-8162-3b1c42febc0a\" (UID: \"fd796be0-d1ac-47be-8162-3b1c42febc0a\") "
Mar 11 12:22:23 crc kubenswrapper[4816]: W0311 12:22:23.250176 4816 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/fd796be0-d1ac-47be-8162-3b1c42febc0a/volumes/kubernetes.io~secret/config-data
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.250201 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd796be0-d1ac-47be-8162-3b1c42febc0a-config-data" (OuterVolumeSpecName: "config-data") pod "fd796be0-d1ac-47be-8162-3b1c42febc0a" (UID: "fd796be0-d1ac-47be-8162-3b1c42febc0a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.250556 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e6d90d2-e7e3-4245-b3a6-042621e01a67-internal-tls-certs\") pod \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") "
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.252134 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klbjm\" (UniqueName: \"kubernetes.io/projected/2d60557e-d939-46bf-8a60-641016b4d68d-kube-api-access-klbjm\") pod \"root-account-create-update-bvvkj\" (UID: \"2d60557e-d939-46bf-8a60-641016b4d68d\") " pod="openstack/root-account-create-update-bvvkj"
Mar 11 12:22:23 crc kubenswrapper[4816]: W0311 12:22:23.252287 4816 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/3e6d90d2-e7e3-4245-b3a6-042621e01a67/volumes/kubernetes.io~secret/internal-tls-certs
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.252307 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e6d90d2-e7e3-4245-b3a6-042621e01a67-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3e6d90d2-e7e3-4245-b3a6-042621e01a67" (UID: "3e6d90d2-e7e3-4245-b3a6-042621e01a67"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.269430 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d60557e-d939-46bf-8a60-641016b4d68d-operator-scripts\") pod \"root-account-create-update-bvvkj\" (UID: \"2d60557e-d939-46bf-8a60-641016b4d68d\") " pod="openstack/root-account-create-update-bvvkj"
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.269933 4816 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e6d90d2-e7e3-4245-b3a6-042621e01a67-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.270794 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkd68\" (UniqueName: \"kubernetes.io/projected/fd796be0-d1ac-47be-8162-3b1c42febc0a-kube-api-access-jkd68\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.270894 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd796be0-d1ac-47be-8162-3b1c42febc0a-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.270972 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e6d90d2-e7e3-4245-b3a6-042621e01a67-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.271047 4816 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.274951 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd796be0-d1ac-47be-8162-3b1c42febc0a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fd796be0-d1ac-47be-8162-3b1c42febc0a" (UID: "fd796be0-d1ac-47be-8162-3b1c42febc0a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.290587 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-84rn8"]
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.301169 4816 scope.go:117] "RemoveContainer" containerID="de21378c0051d3ac4940fe242c0e851f880805f3d01edc4d6ef2444f52ded95e"
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.301500 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd796be0-d1ac-47be-8162-3b1c42febc0a-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "fd796be0-d1ac-47be-8162-3b1c42febc0a" (UID: "fd796be0-d1ac-47be-8162-3b1c42febc0a"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.304511 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd796be0-d1ac-47be-8162-3b1c42febc0a-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "fd796be0-d1ac-47be-8162-3b1c42febc0a" (UID: "fd796be0-d1ac-47be-8162-3b1c42febc0a"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.331787 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e16e7d30-3235-44f2-81b4-c0c828071bbb/ovsdbserver-nb/0.log"
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.331872 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e16e7d30-3235-44f2-81b4-c0c828071bbb","Type":"ContainerDied","Data":"33dcb516fa17b7c432ef1e2b1650ba4d2e9f946dd76257f934af302a386a7dbf"}
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.331994 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.345856 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_fe419fb1-1901-4fd4-9d9c-8884651e3ad9/ovsdbserver-sb/0.log"
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.345942 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"fe419fb1-1901-4fd4-9d9c-8884651e3ad9","Type":"ContainerDied","Data":"856ecaff8a78617160b7f62ce0d1169e3c52ef425eb093d777cccb4f585957a7"}
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.346056 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.361387 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e16e7d30-3235-44f2-81b4-c0c828071bbb-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "e16e7d30-3235-44f2-81b4-c0c828071bbb" (UID: "e16e7d30-3235-44f2-81b4-c0c828071bbb"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.365374 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fdb8f6449-7h7r8"]
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.370115 4816 scope.go:117] "RemoveContainer" containerID="de21378c0051d3ac4940fe242c0e851f880805f3d01edc4d6ef2444f52ded95e"
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.370374 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-fdb8f6449-7h7r8"]
Mar 11 12:22:23 crc kubenswrapper[4816]: E0311 12:22:23.370562 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de21378c0051d3ac4940fe242c0e851f880805f3d01edc4d6ef2444f52ded95e\": container with ID starting with de21378c0051d3ac4940fe242c0e851f880805f3d01edc4d6ef2444f52ded95e not found: ID does not exist" containerID="de21378c0051d3ac4940fe242c0e851f880805f3d01edc4d6ef2444f52ded95e"
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.370604 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de21378c0051d3ac4940fe242c0e851f880805f3d01edc4d6ef2444f52ded95e"} err="failed to get container status \"de21378c0051d3ac4940fe242c0e851f880805f3d01edc4d6ef2444f52ded95e\": rpc error: code = NotFound desc = could not find container \"de21378c0051d3ac4940fe242c0e851f880805f3d01edc4d6ef2444f52ded95e\": container with ID starting with de21378c0051d3ac4940fe242c0e851f880805f3d01edc4d6ef2444f52ded95e not found: ID does not exist"
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.370637 4816 scope.go:117] "RemoveContainer" containerID="373cac1249bba137b237fe973a3b7880bfcca6318c8db162f6ca4526fa918835"
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.373215 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klbjm\" (UniqueName: \"kubernetes.io/projected/2d60557e-d939-46bf-8a60-641016b4d68d-kube-api-access-klbjm\") pod \"root-account-create-update-bvvkj\" (UID: \"2d60557e-d939-46bf-8a60-641016b4d68d\") " pod="openstack/root-account-create-update-bvvkj"
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.373285 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d60557e-d939-46bf-8a60-641016b4d68d-operator-scripts\") pod \"root-account-create-update-bvvkj\" (UID: \"2d60557e-d939-46bf-8a60-641016b4d68d\") " pod="openstack/root-account-create-update-bvvkj"
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.373404 4816 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd796be0-d1ac-47be-8162-3b1c42febc0a-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.373418 4816 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e16e7d30-3235-44f2-81b4-c0c828071bbb-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.373429 4816 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd796be0-d1ac-47be-8162-3b1c42febc0a-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.373439 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd796be0-d1ac-47be-8162-3b1c42febc0a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.374508 4816 generic.go:334] "Generic (PLEG): container finished" podID="edc01aa4-013d-4d10-9f22-e5f319e6c1a3" containerID="e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c" exitCode=0
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.374608 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tnhfq" event={"ID":"edc01aa4-013d-4d10-9f22-e5f319e6c1a3","Type":"ContainerDied","Data":"e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c"}
Mar 11 12:22:23 crc kubenswrapper[4816]: E0311 12:22:23.374852 4816 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found
Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.374881 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d60557e-d939-46bf-8a60-641016b4d68d-operator-scripts\") pod \"root-account-create-update-bvvkj\" (UID: \"2d60557e-d939-46bf-8a60-641016b4d68d\") " pod="openstack/root-account-create-update-bvvkj"
Mar 11 12:22:23 crc kubenswrapper[4816]: E0311 12:22:23.374901 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb-operator-scripts podName:3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb nodeName:}" failed. No retries permitted until 2026-03-11 12:22:25.374887158 +0000 UTC m=+1431.966151125 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb-operator-scripts") pod "root-account-create-update-snf5b" (UID: "3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb") : configmap "openstack-cell1-scripts" not found Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.400362 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klbjm\" (UniqueName: \"kubernetes.io/projected/2d60557e-d939-46bf-8a60-641016b4d68d-kube-api-access-klbjm\") pod \"root-account-create-update-bvvkj\" (UID: \"2d60557e-d939-46bf-8a60-641016b4d68d\") " pod="openstack/root-account-create-update-bvvkj" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.401943 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.412454 4816 generic.go:334] "Generic (PLEG): container finished" podID="594ad696-b727-4153-979f-d32ccdc1fe83" containerID="4e77bdf5f0e95052069948c26d832a542a6227e380d1cfa3a0483957659bccc8" exitCode=0 Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.412843 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"594ad696-b727-4153-979f-d32ccdc1fe83","Type":"ContainerDied","Data":"4e77bdf5f0e95052069948c26d832a542a6227e380d1cfa3a0483957659bccc8"} Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.420267 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.434837 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-bvvkj" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.442314 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.456602 4816 scope.go:117] "RemoveContainer" containerID="1876def9a0f72b0ad981ff600f29fd745c0daa03affcc6a0a2083718b834badc" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.456787 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-6c5b6658f-tdgsh"] Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.467473 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-6c5b6658f-tdgsh"] Mar 11 12:22:23 crc kubenswrapper[4816]: E0311 12:22:23.673656 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8a2953b83fad75911a9aa3b9b53086764c650fc4022cbafe1b2e60fde2fe5be7" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Mar 11 12:22:23 crc kubenswrapper[4816]: E0311 12:22:23.675344 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8a2953b83fad75911a9aa3b9b53086764c650fc4022cbafe1b2e60fde2fe5be7" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Mar 11 12:22:23 crc kubenswrapper[4816]: E0311 12:22:23.681579 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8a2953b83fad75911a9aa3b9b53086764c650fc4022cbafe1b2e60fde2fe5be7" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Mar 11 12:22:23 crc kubenswrapper[4816]: E0311 12:22:23.681666 4816 
prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="c71feeeb-a44d-42ec-a4c7-ddbf9a76f825" containerName="ovn-northd" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.838601 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-4a8d-account-create-update-2lrkx" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.871963 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.874778 4816 scope.go:117] "RemoveContainer" containerID="5e227ce28f5de77017097c97e0a28037dfd14090da88c0fa20d1f53e10f8268b" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.908753 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.951409 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.961001 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-963f-account-create-update-9hnkv" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.965231 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.004630 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c47c9b57-0735-415f-a1a1-4b3096e3fbcf-operator-scripts\") pod \"c47c9b57-0735-415f-a1a1-4b3096e3fbcf\" (UID: \"c47c9b57-0735-415f-a1a1-4b3096e3fbcf\") " Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.004915 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8msfc\" (UniqueName: \"kubernetes.io/projected/c47c9b57-0735-415f-a1a1-4b3096e3fbcf-kube-api-access-8msfc\") pod \"c47c9b57-0735-415f-a1a1-4b3096e3fbcf\" (UID: \"c47c9b57-0735-415f-a1a1-4b3096e3fbcf\") " Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.006174 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c47c9b57-0735-415f-a1a1-4b3096e3fbcf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c47c9b57-0735-415f-a1a1-4b3096e3fbcf" (UID: "c47c9b57-0735-415f-a1a1-4b3096e3fbcf"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.008471 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c47c9b57-0735-415f-a1a1-4b3096e3fbcf-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.019559 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c47c9b57-0735-415f-a1a1-4b3096e3fbcf-kube-api-access-8msfc" (OuterVolumeSpecName: "kube-api-access-8msfc") pod "c47c9b57-0735-415f-a1a1-4b3096e3fbcf" (UID: "c47c9b57-0735-415f-a1a1-4b3096e3fbcf"). InnerVolumeSpecName "kube-api-access-8msfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.058672 4816 scope.go:117] "RemoveContainer" containerID="e36d52352569b57940dd2cebcd565fb31e6c049d444d2da7c54f0fe9d882c7f6" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.110384 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e637fcd-e45c-479c-856d-086d642af3bb-operator-scripts\") pod \"5e637fcd-e45c-479c-856d-086d642af3bb\" (UID: \"5e637fcd-e45c-479c-856d-086d642af3bb\") " Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.110754 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hn92x\" (UniqueName: \"kubernetes.io/projected/5e637fcd-e45c-479c-856d-086d642af3bb-kube-api-access-hn92x\") pod \"5e637fcd-e45c-479c-856d-086d642af3bb\" (UID: \"5e637fcd-e45c-479c-856d-086d642af3bb\") " Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.111145 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e637fcd-e45c-479c-856d-086d642af3bb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"5e637fcd-e45c-479c-856d-086d642af3bb" (UID: "5e637fcd-e45c-479c-856d-086d642af3bb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.113185 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8msfc\" (UniqueName: \"kubernetes.io/projected/c47c9b57-0735-415f-a1a1-4b3096e3fbcf-kube-api-access-8msfc\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.113713 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e637fcd-e45c-479c-856d-086d642af3bb-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.125504 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e637fcd-e45c-479c-856d-086d642af3bb-kube-api-access-hn92x" (OuterVolumeSpecName: "kube-api-access-hn92x") pod "5e637fcd-e45c-479c-856d-086d642af3bb" (UID: "5e637fcd-e45c-479c-856d-086d642af3bb"). InnerVolumeSpecName "kube-api-access-hn92x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.144121 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2de58390-335b-40cc-8461-d931d3b22e41" path="/var/lib/kubelet/pods/2de58390-335b-40cc-8461-d931d3b22e41/volumes" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.144963 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32a279c7-00a8-4e98-8356-91e219416a22" path="/var/lib/kubelet/pods/32a279c7-00a8-4e98-8356-91e219416a22/volumes" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.145796 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e6d90d2-e7e3-4245-b3a6-042621e01a67" path="/var/lib/kubelet/pods/3e6d90d2-e7e3-4245-b3a6-042621e01a67/volumes" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.146685 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="502b3843-8246-4715-9735-dfc0336caacb" path="/var/lib/kubelet/pods/502b3843-8246-4715-9735-dfc0336caacb/volumes" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.147368 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e16e7d30-3235-44f2-81b4-c0c828071bbb" path="/var/lib/kubelet/pods/e16e7d30-3235-44f2-81b4-c0c828071bbb/volumes" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.148542 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd796be0-d1ac-47be-8162-3b1c42febc0a" path="/var/lib/kubelet/pods/fd796be0-d1ac-47be-8162-3b1c42febc0a/volumes" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.149802 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe419fb1-1901-4fd4-9d9c-8884651e3ad9" path="/var/lib/kubelet/pods/fe419fb1-1901-4fd4-9d9c-8884651e3ad9/volumes" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.191890 4816 scope.go:117] "RemoveContainer" containerID="4c01622c11d3f3812a2eae31ec2decc063cf1fe9d275e29cfb942cdc480ba8db" Mar 11 
12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.210520 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-59b4f4d478-5b797" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.217860 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-config-data-custom\") pod \"ddd535a1-7585-4cb7-94ec-f4b98b10be4a\" (UID: \"ddd535a1-7585-4cb7-94ec-f4b98b10be4a\") " Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.217913 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-combined-ca-bundle\") pod \"ddd535a1-7585-4cb7-94ec-f4b98b10be4a\" (UID: \"ddd535a1-7585-4cb7-94ec-f4b98b10be4a\") " Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.217934 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tl4t9\" (UniqueName: \"kubernetes.io/projected/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-kube-api-access-tl4t9\") pod \"ddd535a1-7585-4cb7-94ec-f4b98b10be4a\" (UID: \"ddd535a1-7585-4cb7-94ec-f4b98b10be4a\") " Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.217963 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-config-data\") pod \"ddd535a1-7585-4cb7-94ec-f4b98b10be4a\" (UID: \"ddd535a1-7585-4cb7-94ec-f4b98b10be4a\") " Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.218017 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-logs\") pod \"ddd535a1-7585-4cb7-94ec-f4b98b10be4a\" (UID: \"ddd535a1-7585-4cb7-94ec-f4b98b10be4a\") " Mar 11 12:22:24 crc 
kubenswrapper[4816]: I0311 12:22:24.218363 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hn92x\" (UniqueName: \"kubernetes.io/projected/5e637fcd-e45c-479c-856d-086d642af3bb-kube-api-access-hn92x\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.227731 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="1c94c19c-3ccb-43cc-ab41-92baa3141f73" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.172:8776/healthcheck\": read tcp 10.217.0.2:41490->10.217.0.172:8776: read: connection reset by peer" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.241561 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.241966 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bedb612d-0e22-4025-9151-d0cf7bc4ee42" containerName="ceilometer-central-agent" containerID="cri-o://84c64e2c11b5a33088d3e50d684b62246b9937fb898429fa525cc6fb739d9015" gracePeriod=30 Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.242212 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bedb612d-0e22-4025-9151-d0cf7bc4ee42" containerName="proxy-httpd" containerID="cri-o://a80048ca909856187d3fa5dac7b542ba5ca3c8dbcb582537e0f884c753db4809" gracePeriod=30 Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.242304 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bedb612d-0e22-4025-9151-d0cf7bc4ee42" containerName="sg-core" containerID="cri-o://824fd644293ef663ba362cace1b788aa52143866b3de49d3b2f15202714957b5" gracePeriod=30 Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.242348 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="bedb612d-0e22-4025-9151-d0cf7bc4ee42" containerName="ceilometer-notification-agent" containerID="cri-o://a8b3b1241d87a2bc94cda4c45011262eeb879b9fb212362f754599d92ce27242" gracePeriod=30 Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.264017 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-logs" (OuterVolumeSpecName: "logs") pod "ddd535a1-7585-4cb7-94ec-f4b98b10be4a" (UID: "ddd535a1-7585-4cb7-94ec-f4b98b10be4a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.316533 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.317002 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="32dcc96b-186a-444d-bef3-4c5f117ee652" containerName="kube-state-metrics" containerID="cri-o://87aa2b1e91cb5a822ed7cf28348c0737eb6cfc59a0a44ee9905ee11d4719f35c" gracePeriod=30 Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.317523 4816 scope.go:117] "RemoveContainer" containerID="ee8f2b910a2d52b32d76649fbccb57d3440b0a1d624504112ddbe71af6ca7889" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.318482 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-kube-api-access-tl4t9" (OuterVolumeSpecName: "kube-api-access-tl4t9") pod "ddd535a1-7585-4cb7-94ec-f4b98b10be4a" (UID: "ddd535a1-7585-4cb7-94ec-f4b98b10be4a"). InnerVolumeSpecName "kube-api-access-tl4t9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.319772 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tl4t9\" (UniqueName: \"kubernetes.io/projected/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-kube-api-access-tl4t9\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.319791 4816 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-logs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.322567 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ddd535a1-7585-4cb7-94ec-f4b98b10be4a" (UID: "ddd535a1-7585-4cb7-94ec-f4b98b10be4a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:24 crc kubenswrapper[4816]: E0311 12:22:24.348180 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c is running failed: container process not found" containerID="e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 12:22:24 crc kubenswrapper[4816]: E0311 12:22:24.350125 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c is running failed: container process not found" containerID="e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 
12:22:24 crc kubenswrapper[4816]: E0311 12:22:24.350810 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c is running failed: container process not found" containerID="e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 12:22:24 crc kubenswrapper[4816]: E0311 12:22:24.350857 4816 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tnhfq" podUID="edc01aa4-013d-4d10-9f22-e5f319e6c1a3" containerName="ovsdb-server" Mar 11 12:22:24 crc kubenswrapper[4816]: E0311 12:22:24.397666 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.410600 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-config-data" (OuterVolumeSpecName: "config-data") pod "ddd535a1-7585-4cb7-94ec-f4b98b10be4a" (UID: "ddd535a1-7585-4cb7-94ec-f4b98b10be4a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.436908 4816 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.436933 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.454848 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ddd535a1-7585-4cb7-94ec-f4b98b10be4a" (UID: "ddd535a1-7585-4cb7-94ec-f4b98b10be4a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:24 crc kubenswrapper[4816]: E0311 12:22:24.487130 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.497802 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-9b21-account-create-update-r8vgg"] Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.550546 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.563004 4816 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/memcached-0"] Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.563456 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="5030028c-f574-4334-a837-2430761524b4" containerName="memcached" containerID="cri-o://0b4c4c1c298f57878044bac49cc49a719acfc3a0f87a1803c19c539d85446637" gracePeriod=30 Mar 11 12:22:24 crc kubenswrapper[4816]: E0311 12:22:24.600409 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 12:22:24 crc kubenswrapper[4816]: E0311 12:22:24.600896 4816 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-tnhfq" podUID="edc01aa4-013d-4d10-9f22-e5f319e6c1a3" containerName="ovs-vswitchd" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.624643 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-9b21-account-create-update-r8vgg"] Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.635880 4816 generic.go:334] "Generic (PLEG): container finished" podID="1c94c19c-3ccb-43cc-ab41-92baa3141f73" containerID="c04dc0a2663851eac8a9c1faccfd79cf6c27fbce470c4ad0b7499358caea8a06" exitCode=0 Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.635989 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1c94c19c-3ccb-43cc-ab41-92baa3141f73","Type":"ContainerDied","Data":"c04dc0a2663851eac8a9c1faccfd79cf6c27fbce470c4ad0b7499358caea8a06"} Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.639872 4816 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/keystone-9b21-account-create-update-cmcfl"] Mar 11 12:22:24 crc kubenswrapper[4816]: E0311 12:22:24.641019 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddd535a1-7585-4cb7-94ec-f4b98b10be4a" containerName="barbican-keystone-listener-log" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.641039 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddd535a1-7585-4cb7-94ec-f4b98b10be4a" containerName="barbican-keystone-listener-log" Mar 11 12:22:24 crc kubenswrapper[4816]: E0311 12:22:24.641055 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddd535a1-7585-4cb7-94ec-f4b98b10be4a" containerName="barbican-keystone-listener" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.641062 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddd535a1-7585-4cb7-94ec-f4b98b10be4a" containerName="barbican-keystone-listener" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.641287 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddd535a1-7585-4cb7-94ec-f4b98b10be4a" containerName="barbican-keystone-listener-log" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.641302 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddd535a1-7585-4cb7-94ec-f4b98b10be4a" containerName="barbican-keystone-listener" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.642118 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-9b21-account-create-update-cmcfl" Mar 11 12:22:24 crc kubenswrapper[4816]: E0311 12:22:24.655963 4816 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 11 12:22:24 crc kubenswrapper[4816]: E0311 12:22:24.656033 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-config-data podName:3779c0f5-9084-4c07-83d9-fe2017559f7b nodeName:}" failed. No retries permitted until 2026-03-11 12:22:28.656011939 +0000 UTC m=+1435.247275906 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-config-data") pod "rabbitmq-cell1-server-0" (UID: "3779c0f5-9084-4c07-83d9-fe2017559f7b") : configmap "rabbitmq-cell1-config-data" not found Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.665589 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.666533 4816 generic.go:334] "Generic (PLEG): container finished" podID="ddd535a1-7585-4cb7-94ec-f4b98b10be4a" containerID="a2fe652a36263402ff94fa1d4ec821be087bc6255f2da08fbe025571394de207" exitCode=0 Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.666646 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-59b4f4d478-5b797" event={"ID":"ddd535a1-7585-4cb7-94ec-f4b98b10be4a","Type":"ContainerDied","Data":"a2fe652a36263402ff94fa1d4ec821be087bc6255f2da08fbe025571394de207"} Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.666686 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-59b4f4d478-5b797" 
event={"ID":"ddd535a1-7585-4cb7-94ec-f4b98b10be4a","Type":"ContainerDied","Data":"84c045541bc73afd53de86393645863c006080b89347feb36d269d40b0b6ac28"} Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.666763 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-59b4f4d478-5b797" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.667842 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-w8rqc"] Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.690783 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-9b21-account-create-update-cmcfl"] Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.706427 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-kbmsk"] Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.722592 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-w8rqc"] Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.741483 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-4a8d-account-create-update-2lrkx" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.741493 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-4a8d-account-create-update-2lrkx" event={"ID":"c47c9b57-0735-415f-a1a1-4b3096e3fbcf","Type":"ContainerDied","Data":"907e6d1395bfe6aa206c07c8f0ecff9b2205c70baa8db02eecf5138662995725"} Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.759374 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1da70aee-e1eb-4ad5-b0de-1e2f988dd729-operator-scripts\") pod \"keystone-9b21-account-create-update-cmcfl\" (UID: \"1da70aee-e1eb-4ad5-b0de-1e2f988dd729\") " pod="openstack/keystone-9b21-account-create-update-cmcfl" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.759469 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24z58\" (UniqueName: \"kubernetes.io/projected/1da70aee-e1eb-4ad5-b0de-1e2f988dd729-kube-api-access-24z58\") pod \"keystone-9b21-account-create-update-cmcfl\" (UID: \"1da70aee-e1eb-4ad5-b0de-1e2f988dd729\") " pod="openstack/keystone-9b21-account-create-update-cmcfl" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.761510 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-kbmsk"] Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.782575 4816 generic.go:334] "Generic (PLEG): container finished" podID="7bd939d8-3b22-4496-acea-ac527f3e5149" containerID="6309388e250c5434fd6b39ddce96cacd594c9880dd57d2c9e89074cac30a961b" exitCode=0 Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.782765 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5ffd6fb588-7hftz" 
event={"ID":"7bd939d8-3b22-4496-acea-ac527f3e5149","Type":"ContainerDied","Data":"6309388e250c5434fd6b39ddce96cacd594c9880dd57d2c9e89074cac30a961b"} Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.793120 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5d6ddcd789-qjf9c"] Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.808661 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-5d6ddcd789-qjf9c" podUID="9c180505-72c6-498d-bfa5-05f689692bd2" containerName="keystone-api" containerID="cri-o://40d439392c989a322c37ef2903e2b84825cbedf2d8b6499b35bfc3bb665a65b8" gracePeriod=30 Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.819759 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.841041 4816 generic.go:334] "Generic (PLEG): container finished" podID="9a22173f-147b-46ac-bb01-596fe9f12b10" containerID="08358819a244a822957b7c7153f37ef3fa2c0371fe913be221e0cf6e09e89054" exitCode=0 Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.841156 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9a22173f-147b-46ac-bb01-596fe9f12b10","Type":"ContainerDied","Data":"08358819a244a822957b7c7153f37ef3fa2c0371fe913be221e0cf6e09e89054"} Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.844326 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-963f-account-create-update-9hnkv" event={"ID":"5e637fcd-e45c-479c-856d-086d642af3bb","Type":"ContainerDied","Data":"4f023b2d8c3517ea66e0705887fb61a09310d77cc0b2edae0368635152923da3"} Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.844372 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-963f-account-create-update-9hnkv" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.861751 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-9b21-account-create-update-cmcfl"] Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.866602 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1da70aee-e1eb-4ad5-b0de-1e2f988dd729-operator-scripts\") pod \"keystone-9b21-account-create-update-cmcfl\" (UID: \"1da70aee-e1eb-4ad5-b0de-1e2f988dd729\") " pod="openstack/keystone-9b21-account-create-update-cmcfl" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.866759 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24z58\" (UniqueName: \"kubernetes.io/projected/1da70aee-e1eb-4ad5-b0de-1e2f988dd729-kube-api-access-24z58\") pod \"keystone-9b21-account-create-update-cmcfl\" (UID: \"1da70aee-e1eb-4ad5-b0de-1e2f988dd729\") " pod="openstack/keystone-9b21-account-create-update-cmcfl" Mar 11 12:22:24 crc kubenswrapper[4816]: E0311 12:22:24.867536 4816 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 11 12:22:24 crc kubenswrapper[4816]: E0311 12:22:24.867602 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1da70aee-e1eb-4ad5-b0de-1e2f988dd729-operator-scripts podName:1da70aee-e1eb-4ad5-b0de-1e2f988dd729 nodeName:}" failed. No retries permitted until 2026-03-11 12:22:25.367577625 +0000 UTC m=+1431.958841592 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/1da70aee-e1eb-4ad5-b0de-1e2f988dd729-operator-scripts") pod "keystone-9b21-account-create-update-cmcfl" (UID: "1da70aee-e1eb-4ad5-b0de-1e2f988dd729") : configmap "openstack-scripts" not found Mar 11 12:22:24 crc kubenswrapper[4816]: E0311 12:22:24.913824 4816 projected.go:194] Error preparing data for projected volume kube-api-access-24z58 for pod openstack/keystone-9b21-account-create-update-cmcfl: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 11 12:22:24 crc kubenswrapper[4816]: E0311 12:22:24.913938 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1da70aee-e1eb-4ad5-b0de-1e2f988dd729-kube-api-access-24z58 podName:1da70aee-e1eb-4ad5-b0de-1e2f988dd729 nodeName:}" failed. No retries permitted until 2026-03-11 12:22:25.413912605 +0000 UTC m=+1432.005176572 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-24z58" (UniqueName: "kubernetes.io/projected/1da70aee-e1eb-4ad5-b0de-1e2f988dd729-kube-api-access-24z58") pod "keystone-9b21-account-create-update-cmcfl" (UID: "1da70aee-e1eb-4ad5-b0de-1e2f988dd729") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.928155 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-rmcqp"] Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.951261 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-rmcqp"] Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.983384 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-bvvkj"] Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.993575 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-bvvkj"] Mar 11 12:22:25 crc kubenswrapper[4816]: E0311 
12:22:25.049547 4816 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 11 12:22:25 crc kubenswrapper[4816]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 11 12:22:25 crc kubenswrapper[4816]: Mar 11 12:22:25 crc kubenswrapper[4816]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 11 12:22:25 crc kubenswrapper[4816]: Mar 11 12:22:25 crc kubenswrapper[4816]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 11 12:22:25 crc kubenswrapper[4816]: Mar 11 12:22:25 crc kubenswrapper[4816]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 11 12:22:25 crc kubenswrapper[4816]: Mar 11 12:22:25 crc kubenswrapper[4816]: if [ -n "" ]; then Mar 11 12:22:25 crc kubenswrapper[4816]: GRANT_DATABASE="" Mar 11 12:22:25 crc kubenswrapper[4816]: else Mar 11 12:22:25 crc kubenswrapper[4816]: GRANT_DATABASE="*" Mar 11 12:22:25 crc kubenswrapper[4816]: fi Mar 11 12:22:25 crc kubenswrapper[4816]: Mar 11 12:22:25 crc kubenswrapper[4816]: # going for maximum compatibility here: Mar 11 12:22:25 crc kubenswrapper[4816]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 11 12:22:25 crc kubenswrapper[4816]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 11 12:22:25 crc kubenswrapper[4816]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 11 12:22:25 crc kubenswrapper[4816]: # support updates Mar 11 12:22:25 crc kubenswrapper[4816]: Mar 11 12:22:25 crc kubenswrapper[4816]: $MYSQL_CMD < logger="UnhandledError" Mar 11 12:22:25 crc kubenswrapper[4816]: E0311 12:22:25.053185 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-mariadb-root-db-secret\\\" not found\"" pod="openstack/root-account-create-update-bvvkj" podUID="2d60557e-d939-46bf-8a60-641016b4d68d" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.133539 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="7d73d9d0-5632-47a3-93e0-899f64f51011" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.210:8775/\": dial tcp 10.217.0.210:8775: connect: connection refused" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.133550 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="7d73d9d0-5632-47a3-93e0-899f64f51011" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.210:8775/\": dial tcp 10.217.0.210:8775: connect: connection refused" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.224453 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="da177cde-6332-4562-809a-d4bee453cebf" containerName="galera" containerID="cri-o://c63ed4d8962eaade5fdd56e19833812eb68982f5e9c4239e8a03e5077a42a492" gracePeriod=30 Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.382760 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1da70aee-e1eb-4ad5-b0de-1e2f988dd729-operator-scripts\") pod 
\"keystone-9b21-account-create-update-cmcfl\" (UID: \"1da70aee-e1eb-4ad5-b0de-1e2f988dd729\") " pod="openstack/keystone-9b21-account-create-update-cmcfl" Mar 11 12:22:25 crc kubenswrapper[4816]: E0311 12:22:25.382996 4816 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Mar 11 12:22:25 crc kubenswrapper[4816]: E0311 12:22:25.383079 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb-operator-scripts podName:3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb nodeName:}" failed. No retries permitted until 2026-03-11 12:22:29.383057307 +0000 UTC m=+1435.974321274 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb-operator-scripts") pod "root-account-create-update-snf5b" (UID: "3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb") : configmap "openstack-cell1-scripts" not found Mar 11 12:22:25 crc kubenswrapper[4816]: E0311 12:22:25.383486 4816 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 11 12:22:25 crc kubenswrapper[4816]: E0311 12:22:25.383516 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1da70aee-e1eb-4ad5-b0de-1e2f988dd729-operator-scripts podName:1da70aee-e1eb-4ad5-b0de-1e2f988dd729 nodeName:}" failed. No retries permitted until 2026-03-11 12:22:26.38350569 +0000 UTC m=+1432.974769657 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/1da70aee-e1eb-4ad5-b0de-1e2f988dd729-operator-scripts") pod "keystone-9b21-account-create-update-cmcfl" (UID: "1da70aee-e1eb-4ad5-b0de-1e2f988dd729") : configmap "openstack-scripts" not found Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.487920 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24z58\" (UniqueName: \"kubernetes.io/projected/1da70aee-e1eb-4ad5-b0de-1e2f988dd729-kube-api-access-24z58\") pod \"keystone-9b21-account-create-update-cmcfl\" (UID: \"1da70aee-e1eb-4ad5-b0de-1e2f988dd729\") " pod="openstack/keystone-9b21-account-create-update-cmcfl" Mar 11 12:22:25 crc kubenswrapper[4816]: E0311 12:22:25.492576 4816 projected.go:194] Error preparing data for projected volume kube-api-access-24z58 for pod openstack/keystone-9b21-account-create-update-cmcfl: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 11 12:22:25 crc kubenswrapper[4816]: E0311 12:22:25.492881 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1da70aee-e1eb-4ad5-b0de-1e2f988dd729-kube-api-access-24z58 podName:1da70aee-e1eb-4ad5-b0de-1e2f988dd729 nodeName:}" failed. No retries permitted until 2026-03-11 12:22:26.49285803 +0000 UTC m=+1433.084121997 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-24z58" (UniqueName: "kubernetes.io/projected/1da70aee-e1eb-4ad5-b0de-1e2f988dd729-kube-api-access-24z58") pod "keystone-9b21-account-create-update-cmcfl" (UID: "1da70aee-e1eb-4ad5-b0de-1e2f988dd729") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 11 12:22:25 crc kubenswrapper[4816]: E0311 12:22:25.611838 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-24z58 operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/keystone-9b21-account-create-update-cmcfl" podUID="1da70aee-e1eb-4ad5-b0de-1e2f988dd729" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.627935 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-59b4f4d478-5b797"] Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.637652 4816 scope.go:117] "RemoveContainer" containerID="fd6533a10f6d22b4d1d7a2a73ad8cc4591438b77aefeced48dbf3b4526cf28f0" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.643874 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-snf5b" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.671539 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-59b4f4d478-5b797"] Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.682585 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-4bcf-account-create-update-nv5hk" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.685677 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.691664 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb-operator-scripts\") pod \"3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb\" (UID: \"3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.691882 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vf9zw\" (UniqueName: \"kubernetes.io/projected/3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb-kube-api-access-vf9zw\") pod \"3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb\" (UID: \"3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.695868 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb" (UID: "3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.696446 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.706918 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.717831 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb-kube-api-access-vf9zw" (OuterVolumeSpecName: "kube-api-access-vf9zw") pod "3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb" (UID: "3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb"). 
InnerVolumeSpecName "kube-api-access-vf9zw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.721349 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-963f-account-create-update-9hnkv"] Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.722085 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.727997 4816 scope.go:117] "RemoveContainer" containerID="a2fe652a36263402ff94fa1d4ec821be087bc6255f2da08fbe025571394de207" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.734691 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-963f-account-create-update-9hnkv"] Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.764107 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-4a8d-account-create-update-2lrkx"] Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.780195 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-4a8d-account-create-update-2lrkx"] Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.793973 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e95ddca0-76d0-4dce-9983-4b07655adc25-scripts\") pod \"e95ddca0-76d0-4dce-9983-4b07655adc25\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794008 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a22173f-147b-46ac-bb01-596fe9f12b10-operator-scripts\") pod \"9a22173f-147b-46ac-bb01-596fe9f12b10\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794062 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-config-data\") pod \"7bd939d8-3b22-4496-acea-ac527f3e5149\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794106 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e95ddca0-76d0-4dce-9983-4b07655adc25-config-data\") pod \"e95ddca0-76d0-4dce-9983-4b07655adc25\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794142 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e95ddca0-76d0-4dce-9983-4b07655adc25-logs\") pod \"e95ddca0-76d0-4dce-9983-4b07655adc25\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794209 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9a22173f-147b-46ac-bb01-596fe9f12b10-config-data-default\") pod \"9a22173f-147b-46ac-bb01-596fe9f12b10\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794240 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8b85\" (UniqueName: \"kubernetes.io/projected/e95ddca0-76d0-4dce-9983-4b07655adc25-kube-api-access-c8b85\") pod \"e95ddca0-76d0-4dce-9983-4b07655adc25\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794281 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dlr4\" (UniqueName: \"kubernetes.io/projected/32dcc96b-186a-444d-bef3-4c5f117ee652-kube-api-access-6dlr4\") pod \"32dcc96b-186a-444d-bef3-4c5f117ee652\" (UID: \"32dcc96b-186a-444d-bef3-4c5f117ee652\") " 
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794312 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9a22173f-147b-46ac-bb01-596fe9f12b10-kolla-config\") pod \"9a22173f-147b-46ac-bb01-596fe9f12b10\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794333 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/32dcc96b-186a-444d-bef3-4c5f117ee652-kube-state-metrics-tls-config\") pod \"32dcc96b-186a-444d-bef3-4c5f117ee652\" (UID: \"32dcc96b-186a-444d-bef3-4c5f117ee652\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794365 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e95ddca0-76d0-4dce-9983-4b07655adc25-combined-ca-bundle\") pod \"e95ddca0-76d0-4dce-9983-4b07655adc25\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794382 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e95ddca0-76d0-4dce-9983-4b07655adc25-internal-tls-certs\") pod \"e95ddca0-76d0-4dce-9983-4b07655adc25\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794423 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a22173f-147b-46ac-bb01-596fe9f12b10-combined-ca-bundle\") pod \"9a22173f-147b-46ac-bb01-596fe9f12b10\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794444 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/7bd939d8-3b22-4496-acea-ac527f3e5149-logs\") pod \"7bd939d8-3b22-4496-acea-ac527f3e5149\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794472 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnzpb\" (UniqueName: \"kubernetes.io/projected/9a22173f-147b-46ac-bb01-596fe9f12b10-kube-api-access-cnzpb\") pod \"9a22173f-147b-46ac-bb01-596fe9f12b10\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794489 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-public-tls-certs\") pod \"7bd939d8-3b22-4496-acea-ac527f3e5149\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794527 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/32dcc96b-186a-444d-bef3-4c5f117ee652-kube-state-metrics-tls-certs\") pod \"32dcc96b-186a-444d-bef3-4c5f117ee652\" (UID: \"32dcc96b-186a-444d-bef3-4c5f117ee652\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794548 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbrs4\" (UniqueName: \"kubernetes.io/projected/7bd939d8-3b22-4496-acea-ac527f3e5149-kube-api-access-xbrs4\") pod \"7bd939d8-3b22-4496-acea-ac527f3e5149\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794571 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32dcc96b-186a-444d-bef3-4c5f117ee652-combined-ca-bundle\") pod \"32dcc96b-186a-444d-bef3-4c5f117ee652\" (UID: \"32dcc96b-186a-444d-bef3-4c5f117ee652\") " Mar 
11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794601 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e95ddca0-76d0-4dce-9983-4b07655adc25-httpd-run\") pod \"e95ddca0-76d0-4dce-9983-4b07655adc25\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794624 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9a22173f-147b-46ac-bb01-596fe9f12b10-config-data-generated\") pod \"9a22173f-147b-46ac-bb01-596fe9f12b10\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794652 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-combined-ca-bundle\") pod \"7bd939d8-3b22-4496-acea-ac527f3e5149\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794680 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"e95ddca0-76d0-4dce-9983-4b07655adc25\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794703 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-scripts\") pod \"7bd939d8-3b22-4496-acea-ac527f3e5149\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794725 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grm8x\" (UniqueName: \"kubernetes.io/projected/b1dd25da-51d6-45f0-b70c-f1baa17d2da3-kube-api-access-grm8x\") pod 
\"b1dd25da-51d6-45f0-b70c-f1baa17d2da3\" (UID: \"b1dd25da-51d6-45f0-b70c-f1baa17d2da3\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794761 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"9a22173f-147b-46ac-bb01-596fe9f12b10\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794778 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1dd25da-51d6-45f0-b70c-f1baa17d2da3-operator-scripts\") pod \"b1dd25da-51d6-45f0-b70c-f1baa17d2da3\" (UID: \"b1dd25da-51d6-45f0-b70c-f1baa17d2da3\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794803 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a22173f-147b-46ac-bb01-596fe9f12b10-galera-tls-certs\") pod \"9a22173f-147b-46ac-bb01-596fe9f12b10\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794841 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-internal-tls-certs\") pod \"7bd939d8-3b22-4496-acea-ac527f3e5149\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.796218 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a22173f-147b-46ac-bb01-596fe9f12b10-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "9a22173f-147b-46ac-bb01-596fe9f12b10" (UID: "9a22173f-147b-46ac-bb01-596fe9f12b10"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.796559 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e95ddca0-76d0-4dce-9983-4b07655adc25-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e95ddca0-76d0-4dce-9983-4b07655adc25" (UID: "e95ddca0-76d0-4dce-9983-4b07655adc25"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.796960 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.796982 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vf9zw\" (UniqueName: \"kubernetes.io/projected/3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb-kube-api-access-vf9zw\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.797001 4816 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e95ddca0-76d0-4dce-9983-4b07655adc25-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.797011 4816 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9a22173f-147b-46ac-bb01-596fe9f12b10-config-data-generated\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.797095 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1dd25da-51d6-45f0-b70c-f1baa17d2da3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b1dd25da-51d6-45f0-b70c-f1baa17d2da3" (UID: "b1dd25da-51d6-45f0-b70c-f1baa17d2da3"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.797535 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bd939d8-3b22-4496-acea-ac527f3e5149-logs" (OuterVolumeSpecName: "logs") pod "7bd939d8-3b22-4496-acea-ac527f3e5149" (UID: "7bd939d8-3b22-4496-acea-ac527f3e5149"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.798193 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e95ddca0-76d0-4dce-9983-4b07655adc25-logs" (OuterVolumeSpecName: "logs") pod "e95ddca0-76d0-4dce-9983-4b07655adc25" (UID: "e95ddca0-76d0-4dce-9983-4b07655adc25"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.805270 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e95ddca0-76d0-4dce-9983-4b07655adc25-scripts" (OuterVolumeSpecName: "scripts") pod "e95ddca0-76d0-4dce-9983-4b07655adc25" (UID: "e95ddca0-76d0-4dce-9983-4b07655adc25"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.805289 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-scripts" (OuterVolumeSpecName: "scripts") pod "7bd939d8-3b22-4496-acea-ac527f3e5149" (UID: "7bd939d8-3b22-4496-acea-ac527f3e5149"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.805327 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "e95ddca0-76d0-4dce-9983-4b07655adc25" (UID: "e95ddca0-76d0-4dce-9983-4b07655adc25"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.807897 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a22173f-147b-46ac-bb01-596fe9f12b10-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "9a22173f-147b-46ac-bb01-596fe9f12b10" (UID: "9a22173f-147b-46ac-bb01-596fe9f12b10"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.808176 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.808521 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e95ddca0-76d0-4dce-9983-4b07655adc25-kube-api-access-c8b85" (OuterVolumeSpecName: "kube-api-access-c8b85") pod "e95ddca0-76d0-4dce-9983-4b07655adc25" (UID: "e95ddca0-76d0-4dce-9983-4b07655adc25"). InnerVolumeSpecName "kube-api-access-c8b85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.810014 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a22173f-147b-46ac-bb01-596fe9f12b10-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9a22173f-147b-46ac-bb01-596fe9f12b10" (UID: "9a22173f-147b-46ac-bb01-596fe9f12b10"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.811681 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1dd25da-51d6-45f0-b70c-f1baa17d2da3-kube-api-access-grm8x" (OuterVolumeSpecName: "kube-api-access-grm8x") pod "b1dd25da-51d6-45f0-b70c-f1baa17d2da3" (UID: "b1dd25da-51d6-45f0-b70c-f1baa17d2da3"). InnerVolumeSpecName "kube-api-access-grm8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.813043 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a22173f-147b-46ac-bb01-596fe9f12b10-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "9a22173f-147b-46ac-bb01-596fe9f12b10" (UID: "9a22173f-147b-46ac-bb01-596fe9f12b10"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.824099 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="3779c0f5-9084-4c07-83d9-fe2017559f7b" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.104:5671: connect: connection refused" Mar 11 12:22:25 crc kubenswrapper[4816]: E0311 12:22:25.824493 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="adc85e912176222f128333dea158980c88ef84553f1cf56cb52f64a7b64c83d6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.834951 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32dcc96b-186a-444d-bef3-4c5f117ee652-kube-api-access-6dlr4" (OuterVolumeSpecName: "kube-api-access-6dlr4") pod "32dcc96b-186a-444d-bef3-4c5f117ee652" (UID: 
"32dcc96b-186a-444d-bef3-4c5f117ee652"). InnerVolumeSpecName "kube-api-access-6dlr4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.837891 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a22173f-147b-46ac-bb01-596fe9f12b10-kube-api-access-cnzpb" (OuterVolumeSpecName: "kube-api-access-cnzpb") pod "9a22173f-147b-46ac-bb01-596fe9f12b10" (UID: "9a22173f-147b-46ac-bb01-596fe9f12b10"). InnerVolumeSpecName "kube-api-access-cnzpb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.846105 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bd939d8-3b22-4496-acea-ac527f3e5149-kube-api-access-xbrs4" (OuterVolumeSpecName: "kube-api-access-xbrs4") pod "7bd939d8-3b22-4496-acea-ac527f3e5149" (UID: "7bd939d8-3b22-4496-acea-ac527f3e5149"). InnerVolumeSpecName "kube-api-access-xbrs4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.850426 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32dcc96b-186a-444d-bef3-4c5f117ee652-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32dcc96b-186a-444d-bef3-4c5f117ee652" (UID: "32dcc96b-186a-444d-bef3-4c5f117ee652"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:25 crc kubenswrapper[4816]: E0311 12:22:25.854452 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="adc85e912176222f128333dea158980c88ef84553f1cf56cb52f64a7b64c83d6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.895860 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "mysql-db") pod "9a22173f-147b-46ac-bb01-596fe9f12b10" (UID: "9a22173f-147b-46ac-bb01-596fe9f12b10"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.898598 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d73d9d0-5632-47a3-93e0-899f64f51011-combined-ca-bundle\") pod \"7d73d9d0-5632-47a3-93e0-899f64f51011\" (UID: \"7d73d9d0-5632-47a3-93e0-899f64f51011\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.898788 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d73d9d0-5632-47a3-93e0-899f64f51011-config-data\") pod \"7d73d9d0-5632-47a3-93e0-899f64f51011\" (UID: \"7d73d9d0-5632-47a3-93e0-899f64f51011\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.898833 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d73d9d0-5632-47a3-93e0-899f64f51011-logs\") pod \"7d73d9d0-5632-47a3-93e0-899f64f51011\" (UID: \"7d73d9d0-5632-47a3-93e0-899f64f51011\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.898878 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d73d9d0-5632-47a3-93e0-899f64f51011-nova-metadata-tls-certs\") pod \"7d73d9d0-5632-47a3-93e0-899f64f51011\" (UID: \"7d73d9d0-5632-47a3-93e0-899f64f51011\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.898937 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nb22f\" (UniqueName: \"kubernetes.io/projected/7d73d9d0-5632-47a3-93e0-899f64f51011-kube-api-access-nb22f\") pod \"7d73d9d0-5632-47a3-93e0-899f64f51011\" (UID: \"7d73d9d0-5632-47a3-93e0-899f64f51011\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.899485 4816 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.899505 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.899517 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grm8x\" (UniqueName: \"kubernetes.io/projected/b1dd25da-51d6-45f0-b70c-f1baa17d2da3-kube-api-access-grm8x\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.899535 4816 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.899545 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1dd25da-51d6-45f0-b70c-f1baa17d2da3-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:25 crc kubenswrapper[4816]: 
I0311 12:22:25.899554 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e95ddca0-76d0-4dce-9983-4b07655adc25-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.899564 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a22173f-147b-46ac-bb01-596fe9f12b10-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.899572 4816 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e95ddca0-76d0-4dce-9983-4b07655adc25-logs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.899582 4816 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9a22173f-147b-46ac-bb01-596fe9f12b10-config-data-default\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.899593 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8b85\" (UniqueName: \"kubernetes.io/projected/e95ddca0-76d0-4dce-9983-4b07655adc25-kube-api-access-c8b85\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.899603 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dlr4\" (UniqueName: \"kubernetes.io/projected/32dcc96b-186a-444d-bef3-4c5f117ee652-kube-api-access-6dlr4\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.899612 4816 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9a22173f-147b-46ac-bb01-596fe9f12b10-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.899622 4816 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/7bd939d8-3b22-4496-acea-ac527f3e5149-logs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.899632 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnzpb\" (UniqueName: \"kubernetes.io/projected/9a22173f-147b-46ac-bb01-596fe9f12b10-kube-api-access-cnzpb\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.899641 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbrs4\" (UniqueName: \"kubernetes.io/projected/7bd939d8-3b22-4496-acea-ac527f3e5149-kube-api-access-xbrs4\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.899650 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32dcc96b-186a-444d-bef3-4c5f117ee652-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:25 crc kubenswrapper[4816]: E0311 12:22:25.906978 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="adc85e912176222f128333dea158980c88ef84553f1cf56cb52f64a7b64c83d6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 11 12:22:25 crc kubenswrapper[4816]: E0311 12:22:25.907072 4816 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="63567eba-cc2a-4168-9e81-51c1daed5482" containerName="nova-cell1-conductor-conductor" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.913569 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d73d9d0-5632-47a3-93e0-899f64f51011-logs" (OuterVolumeSpecName: "logs") pod 
"7d73d9d0-5632-47a3-93e0-899f64f51011" (UID: "7d73d9d0-5632-47a3-93e0-899f64f51011"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.958736 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d73d9d0-5632-47a3-93e0-899f64f51011-kube-api-access-nb22f" (OuterVolumeSpecName: "kube-api-access-nb22f") pod "7d73d9d0-5632-47a3-93e0-899f64f51011" (UID: "7d73d9d0-5632-47a3-93e0-899f64f51011"). InnerVolumeSpecName "kube-api-access-nb22f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.963200 4816 generic.go:334] "Generic (PLEG): container finished" podID="7457f2db-7979-4d92-bd90-a1464b8a3878" containerID="8ba3c9d212f5a9f10887e454eabe42340558258c07c8285eb982b69803aa3749" exitCode=0 Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.963460 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7457f2db-7979-4d92-bd90-a1464b8a3878","Type":"ContainerDied","Data":"8ba3c9d212f5a9f10887e454eabe42340558258c07c8285eb982b69803aa3749"} Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.965558 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4bcf-account-create-update-nv5hk" event={"ID":"b1dd25da-51d6-45f0-b70c-f1baa17d2da3","Type":"ContainerDied","Data":"1d267594122bbe4cf05c9b26645399ea847bc2099ad10ee8bb693c8e2675f8e5"} Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.965633 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-4bcf-account-create-update-nv5hk" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.982854 4816 generic.go:334] "Generic (PLEG): container finished" podID="e95ddca0-76d0-4dce-9983-4b07655adc25" containerID="9dfd5d9de37a643541d7d99bf2ad8ffbb190d4d99b4400e1d3e559828813b764" exitCode=0 Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.982969 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e95ddca0-76d0-4dce-9983-4b07655adc25","Type":"ContainerDied","Data":"9dfd5d9de37a643541d7d99bf2ad8ffbb190d4d99b4400e1d3e559828813b764"} Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.983003 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e95ddca0-76d0-4dce-9983-4b07655adc25","Type":"ContainerDied","Data":"f13badcbc5010cfb4035a99958d3aebf412aeabedcc2f776bca112d761fa63de"} Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.983098 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.984433 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-snf5b" event={"ID":"3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb","Type":"ContainerDied","Data":"37dc46cbca9b814e026266eb10b0888ee0b98d2b5a77de8a934c3e1d5742969a"} Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.984510 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-snf5b" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.987651 4816 generic.go:334] "Generic (PLEG): container finished" podID="7d73d9d0-5632-47a3-93e0-899f64f51011" containerID="05538dd985ad20fb55582d69a35b969743ae902043cfc0d0fe6e1bf963056eb2" exitCode=0 Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.987851 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.988440 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7d73d9d0-5632-47a3-93e0-899f64f51011","Type":"ContainerDied","Data":"05538dd985ad20fb55582d69a35b969743ae902043cfc0d0fe6e1bf963056eb2"} Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.988472 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7d73d9d0-5632-47a3-93e0-899f64f51011","Type":"ContainerDied","Data":"c81825bf2b4be781ea36bdb64201016c8530a7353fa6d58d50264ccf72608bde"} Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.992517 4816 generic.go:334] "Generic (PLEG): container finished" podID="32dcc96b-186a-444d-bef3-4c5f117ee652" containerID="87aa2b1e91cb5a822ed7cf28348c0737eb6cfc59a0a44ee9905ee11d4719f35c" exitCode=2 Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.992611 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"32dcc96b-186a-444d-bef3-4c5f117ee652","Type":"ContainerDied","Data":"87aa2b1e91cb5a822ed7cf28348c0737eb6cfc59a0a44ee9905ee11d4719f35c"} Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.992652 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"32dcc96b-186a-444d-bef3-4c5f117ee652","Type":"ContainerDied","Data":"1e343e65b4d8cc4645e88fc1c1a55d93ec648ea21d55e4018feab7481fc909e7"} Mar 11 12:22:25 crc 
kubenswrapper[4816]: I0311 12:22:25.992743 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.995605 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bvvkj" event={"ID":"2d60557e-d939-46bf-8a60-641016b4d68d","Type":"ContainerStarted","Data":"8ea9826afd6446a559af78c72ed8d7f368b8a030b60ae6f7af907a7806773c5c"} Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.004581 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nb22f\" (UniqueName: \"kubernetes.io/projected/7d73d9d0-5632-47a3-93e0-899f64f51011-kube-api-access-nb22f\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.004677 4816 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d73d9d0-5632-47a3-93e0-899f64f51011-logs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.025083 4816 generic.go:334] "Generic (PLEG): container finished" podID="bedb612d-0e22-4025-9151-d0cf7bc4ee42" containerID="a80048ca909856187d3fa5dac7b542ba5ca3c8dbcb582537e0f884c753db4809" exitCode=0 Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.025138 4816 generic.go:334] "Generic (PLEG): container finished" podID="bedb612d-0e22-4025-9151-d0cf7bc4ee42" containerID="824fd644293ef663ba362cace1b788aa52143866b3de49d3b2f15202714957b5" exitCode=2 Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.025147 4816 generic.go:334] "Generic (PLEG): container finished" podID="bedb612d-0e22-4025-9151-d0cf7bc4ee42" containerID="a8b3b1241d87a2bc94cda4c45011262eeb879b9fb212362f754599d92ce27242" exitCode=0 Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.025156 4816 generic.go:334] "Generic (PLEG): container finished" podID="bedb612d-0e22-4025-9151-d0cf7bc4ee42" 
containerID="84c64e2c11b5a33088d3e50d684b62246b9937fb898429fa525cc6fb739d9015" exitCode=0 Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.025224 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bedb612d-0e22-4025-9151-d0cf7bc4ee42","Type":"ContainerDied","Data":"a80048ca909856187d3fa5dac7b542ba5ca3c8dbcb582537e0f884c753db4809"} Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.025277 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bedb612d-0e22-4025-9151-d0cf7bc4ee42","Type":"ContainerDied","Data":"824fd644293ef663ba362cace1b788aa52143866b3de49d3b2f15202714957b5"} Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.025291 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bedb612d-0e22-4025-9151-d0cf7bc4ee42","Type":"ContainerDied","Data":"a8b3b1241d87a2bc94cda4c45011262eeb879b9fb212362f754599d92ce27242"} Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.025301 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bedb612d-0e22-4025-9151-d0cf7bc4ee42","Type":"ContainerDied","Data":"84c64e2c11b5a33088d3e50d684b62246b9937fb898429fa525cc6fb739d9015"} Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.027851 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5ffd6fb588-7hftz" event={"ID":"7bd939d8-3b22-4496-acea-ac527f3e5149","Type":"ContainerDied","Data":"584cd4107522305bdba692719070a92eec3324ee2da427663b64c0c877cbea0c"} Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.027965 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.031849 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9a22173f-147b-46ac-bb01-596fe9f12b10","Type":"ContainerDied","Data":"ef8afb38cbe161f1b81f860d56715a732c9c137776bc40df909c84b5acbd4154"} Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.031959 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.035272 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1c94c19c-3ccb-43cc-ab41-92baa3141f73","Type":"ContainerDied","Data":"601d8bfb1ac6479d4e58832dfee18035d25eae3e88360d11ef1513118c0bd2f3"} Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.035305 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="601d8bfb1ac6479d4e58832dfee18035d25eae3e88360d11ef1513118c0bd2f3" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.039519 4816 generic.go:334] "Generic (PLEG): container finished" podID="d28745d2-082d-4c99-90f0-b6c4696fb1a2" containerID="f7560f8d6f98f14204afbbce69a7ff86d5f07a2d1a84e68d20701b7c7e5ce84d" exitCode=0 Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.039605 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d28745d2-082d-4c99-90f0-b6c4696fb1a2","Type":"ContainerDied","Data":"f7560f8d6f98f14204afbbce69a7ff86d5f07a2d1a84e68d20701b7c7e5ce84d"} Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.039641 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d28745d2-082d-4c99-90f0-b6c4696fb1a2","Type":"ContainerDied","Data":"8a5fef237ae36daf657628ae1e951a8f33300f04ba146b0b7c82c1251a514014"} Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.039655 4816 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a5fef237ae36daf657628ae1e951a8f33300f04ba146b0b7c82c1251a514014" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.051270 4816 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.052438 4816 generic.go:334] "Generic (PLEG): container finished" podID="b79e89c6-5f56-4439-ad63-a86259d4ed29" containerID="4e741a528a024acf7a27b5a7253bef28cff4a22ea41c625ba24158e8c7be76eb" exitCode=0 Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.053410 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-855897fd55-t7sfb" event={"ID":"b79e89c6-5f56-4439-ad63-a86259d4ed29","Type":"ContainerDied","Data":"4e741a528a024acf7a27b5a7253bef28cff4a22ea41c625ba24158e8c7be76eb"} Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.057746 4816 generic.go:334] "Generic (PLEG): container finished" podID="7795071e-2de0-43cb-b225-cfed54570d94" containerID="5e19f1840cfd8f7623e64404579f814579ee6602ca765f964613a90342b26cc2" exitCode=0 Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.057828 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-9b21-account-create-update-cmcfl" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.058530 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-64b59f8d4-2vxd9" event={"ID":"7795071e-2de0-43cb-b225-cfed54570d94","Type":"ContainerDied","Data":"5e19f1840cfd8f7623e64404579f814579ee6602ca765f964613a90342b26cc2"} Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.058568 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-64b59f8d4-2vxd9" event={"ID":"7795071e-2de0-43cb-b225-cfed54570d94","Type":"ContainerDied","Data":"9de7e47c0f14568909f59552b05e938af6254c4c9840ec07004683a8c3fa16e2"} Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.058585 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9de7e47c0f14568909f59552b05e938af6254c4c9840ec07004683a8c3fa16e2" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.068492 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32dcc96b-186a-444d-bef3-4c5f117ee652-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "32dcc96b-186a-444d-bef3-4c5f117ee652" (UID: "32dcc96b-186a-444d-bef3-4c5f117ee652"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.069093 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a22173f-147b-46ac-bb01-596fe9f12b10-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9a22173f-147b-46ac-bb01-596fe9f12b10" (UID: "9a22173f-147b-46ac-bb01-596fe9f12b10"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.095082 4816 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.100548 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32dcc96b-186a-444d-bef3-4c5f117ee652-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "32dcc96b-186a-444d-bef3-4c5f117ee652" (UID: "32dcc96b-186a-444d-bef3-4c5f117ee652"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.114336 4816 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.114363 4816 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/32dcc96b-186a-444d-bef3-4c5f117ee652-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.114374 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a22173f-147b-46ac-bb01-596fe9f12b10-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.114384 4816 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/32dcc96b-186a-444d-bef3-4c5f117ee652-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.114395 4816 reconciler_common.go:293] "Volume detached for volume 
\"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.137500 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d73d9d0-5632-47a3-93e0-899f64f51011-config-data" (OuterVolumeSpecName: "config-data") pod "7d73d9d0-5632-47a3-93e0-899f64f51011" (UID: "7d73d9d0-5632-47a3-93e0-899f64f51011"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.151972 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d73d9d0-5632-47a3-93e0-899f64f51011-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7d73d9d0-5632-47a3-93e0-899f64f51011" (UID: "7d73d9d0-5632-47a3-93e0-899f64f51011"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.169655 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e95ddca0-76d0-4dce-9983-4b07655adc25-config-data" (OuterVolumeSpecName: "config-data") pod "e95ddca0-76d0-4dce-9983-4b07655adc25" (UID: "e95ddca0-76d0-4dce-9983-4b07655adc25"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.170685 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e95ddca0-76d0-4dce-9983-4b07655adc25-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e95ddca0-76d0-4dce-9983-4b07655adc25" (UID: "e95ddca0-76d0-4dce-9983-4b07655adc25"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.178722 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d73d9d0-5632-47a3-93e0-899f64f51011-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "7d73d9d0-5632-47a3-93e0-899f64f51011" (UID: "7d73d9d0-5632-47a3-93e0-899f64f51011"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.179697 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ce1ef6-fcd0-4182-afca-22c5892b48e2" path="/var/lib/kubelet/pods/09ce1ef6-fcd0-4182-afca-22c5892b48e2/volumes" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.180611 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56" path="/var/lib/kubelet/pods/45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56/volumes" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.181387 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e637fcd-e45c-479c-856d-086d642af3bb" path="/var/lib/kubelet/pods/5e637fcd-e45c-479c-856d-086d642af3bb/volumes" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.183548 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="625b367b-084e-4cf8-8c30-5d4df9c696f9" path="/var/lib/kubelet/pods/625b367b-084e-4cf8-8c30-5d4df9c696f9/volumes" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.184454 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="742cfc03-0365-4df8-a7f6-e6eac11ba045" path="/var/lib/kubelet/pods/742cfc03-0365-4df8-a7f6-e6eac11ba045/volumes" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.185135 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c47c9b57-0735-415f-a1a1-4b3096e3fbcf" 
path="/var/lib/kubelet/pods/c47c9b57-0735-415f-a1a1-4b3096e3fbcf/volumes" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.185816 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddd535a1-7585-4cb7-94ec-f4b98b10be4a" path="/var/lib/kubelet/pods/ddd535a1-7585-4cb7-94ec-f4b98b10be4a/volumes" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.194727 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-config-data" (OuterVolumeSpecName: "config-data") pod "7bd939d8-3b22-4496-acea-ac527f3e5149" (UID: "7bd939d8-3b22-4496-acea-ac527f3e5149"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.207797 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a22173f-147b-46ac-bb01-596fe9f12b10-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "9a22173f-147b-46ac-bb01-596fe9f12b10" (UID: "9a22173f-147b-46ac-bb01-596fe9f12b10"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.218287 4816 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d73d9d0-5632-47a3-93e0-899f64f51011-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.218328 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e95ddca0-76d0-4dce-9983-4b07655adc25-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.218337 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d73d9d0-5632-47a3-93e0-899f64f51011-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.218350 4816 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a22173f-147b-46ac-bb01-596fe9f12b10-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.218360 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.218370 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e95ddca0-76d0-4dce-9983-4b07655adc25-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.218378 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d73d9d0-5632-47a3-93e0-899f64f51011-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.234419 4816 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e95ddca0-76d0-4dce-9983-4b07655adc25-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e95ddca0-76d0-4dce-9983-4b07655adc25" (UID: "e95ddca0-76d0-4dce-9983-4b07655adc25"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.310205 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7bd939d8-3b22-4496-acea-ac527f3e5149" (UID: "7bd939d8-3b22-4496-acea-ac527f3e5149"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.308158 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="26aea2df-f497-478d-b953-060189ef2569" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.105:5671: connect: connection refused" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.327330 4816 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e95ddca0-76d0-4dce-9983-4b07655adc25-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.329066 4816 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.331700 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.363069 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7bd939d8-3b22-4496-acea-ac527f3e5149" (UID: "7bd939d8-3b22-4496-acea-ac527f3e5149"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.363346 4816 scope.go:117] "RemoveContainer" containerID="5cfae0145ad988b78f57674ae7aa14b5835657d9dac7b0c977c144c0d4304d85" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.379020 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-64b59f8d4-2vxd9" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.400561 4816 scope.go:117] "RemoveContainer" containerID="a2fe652a36263402ff94fa1d4ec821be087bc6255f2da08fbe025571394de207" Mar 11 12:22:26 crc kubenswrapper[4816]: E0311 12:22:26.416734 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2fe652a36263402ff94fa1d4ec821be087bc6255f2da08fbe025571394de207\": container with ID starting with a2fe652a36263402ff94fa1d4ec821be087bc6255f2da08fbe025571394de207 not found: ID does not exist" containerID="a2fe652a36263402ff94fa1d4ec821be087bc6255f2da08fbe025571394de207" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.416819 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2fe652a36263402ff94fa1d4ec821be087bc6255f2da08fbe025571394de207"} err="failed to get container status \"a2fe652a36263402ff94fa1d4ec821be087bc6255f2da08fbe025571394de207\": rpc error: code = NotFound desc = could not find container \"a2fe652a36263402ff94fa1d4ec821be087bc6255f2da08fbe025571394de207\": container with ID 
starting with a2fe652a36263402ff94fa1d4ec821be087bc6255f2da08fbe025571394de207 not found: ID does not exist" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.416851 4816 scope.go:117] "RemoveContainer" containerID="5cfae0145ad988b78f57674ae7aa14b5835657d9dac7b0c977c144c0d4304d85" Mar 11 12:22:26 crc kubenswrapper[4816]: E0311 12:22:26.417943 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cfae0145ad988b78f57674ae7aa14b5835657d9dac7b0c977c144c0d4304d85\": container with ID starting with 5cfae0145ad988b78f57674ae7aa14b5835657d9dac7b0c977c144c0d4304d85 not found: ID does not exist" containerID="5cfae0145ad988b78f57674ae7aa14b5835657d9dac7b0c977c144c0d4304d85" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.418034 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cfae0145ad988b78f57674ae7aa14b5835657d9dac7b0c977c144c0d4304d85"} err="failed to get container status \"5cfae0145ad988b78f57674ae7aa14b5835657d9dac7b0c977c144c0d4304d85\": rpc error: code = NotFound desc = could not find container \"5cfae0145ad988b78f57674ae7aa14b5835657d9dac7b0c977c144c0d4304d85\": container with ID starting with 5cfae0145ad988b78f57674ae7aa14b5835657d9dac7b0c977c144c0d4304d85 not found: ID does not exist" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.418067 4816 scope.go:117] "RemoveContainer" containerID="9dfd5d9de37a643541d7d99bf2ad8ffbb190d4d99b4400e1d3e559828813b764" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.422710 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.432775 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-9b21-account-create-update-cmcfl" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.432877 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c94c19c-3ccb-43cc-ab41-92baa3141f73-logs\") pod \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.432931 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-config-data\") pod \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.432961 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-config-data-custom\") pod \"7795071e-2de0-43cb-b225-cfed54570d94\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.433105 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1c94c19c-3ccb-43cc-ab41-92baa3141f73-etc-machine-id\") pod \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.433199 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-public-tls-certs\") pod \"7795071e-2de0-43cb-b225-cfed54570d94\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.433232 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-config-data-custom\") pod \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.434347 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c94c19c-3ccb-43cc-ab41-92baa3141f73-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1c94c19c-3ccb-43cc-ab41-92baa3141f73" (UID: "1c94c19c-3ccb-43cc-ab41-92baa3141f73"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.435332 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c94c19c-3ccb-43cc-ab41-92baa3141f73-logs" (OuterVolumeSpecName: "logs") pod "1c94c19c-3ccb-43cc-ab41-92baa3141f73" (UID: "1c94c19c-3ccb-43cc-ab41-92baa3141f73"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.433328 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-scripts\") pod \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.444003 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7795071e-2de0-43cb-b225-cfed54570d94-logs\") pod \"7795071e-2de0-43cb-b225-cfed54570d94\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.454586 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5svs\" (UniqueName: \"kubernetes.io/projected/1c94c19c-3ccb-43cc-ab41-92baa3141f73-kube-api-access-g5svs\") pod 
\"1c94c19c-3ccb-43cc-ab41-92baa3141f73\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.454685 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-config-data\") pod \"7795071e-2de0-43cb-b225-cfed54570d94\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.454713 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-public-tls-certs\") pod \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.454751 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgm8j\" (UniqueName: \"kubernetes.io/projected/7795071e-2de0-43cb-b225-cfed54570d94-kube-api-access-mgm8j\") pod \"7795071e-2de0-43cb-b225-cfed54570d94\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.454775 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-combined-ca-bundle\") pod \"7795071e-2de0-43cb-b225-cfed54570d94\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.454841 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-internal-tls-certs\") pod \"7795071e-2de0-43cb-b225-cfed54570d94\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.454872 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-combined-ca-bundle\") pod \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.454929 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-internal-tls-certs\") pod \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.455664 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1da70aee-e1eb-4ad5-b0de-1e2f988dd729-operator-scripts\") pod \"keystone-9b21-account-create-update-cmcfl\" (UID: \"1da70aee-e1eb-4ad5-b0de-1e2f988dd729\") " pod="openstack/keystone-9b21-account-create-update-cmcfl" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.455771 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7795071e-2de0-43cb-b225-cfed54570d94-logs" (OuterVolumeSpecName: "logs") pod "7795071e-2de0-43cb-b225-cfed54570d94" (UID: "7795071e-2de0-43cb-b225-cfed54570d94"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.455844 4816 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.455989 4816 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c94c19c-3ccb-43cc-ab41-92baa3141f73-logs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.456055 4816 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1c94c19c-3ccb-43cc-ab41-92baa3141f73-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: E0311 12:22:26.455947 4816 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 11 12:22:26 crc kubenswrapper[4816]: E0311 12:22:26.456238 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1da70aee-e1eb-4ad5-b0de-1e2f988dd729-operator-scripts podName:1da70aee-e1eb-4ad5-b0de-1e2f988dd729 nodeName:}" failed. No retries permitted until 2026-03-11 12:22:28.456213725 +0000 UTC m=+1435.047477692 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/1da70aee-e1eb-4ad5-b0de-1e2f988dd729-operator-scripts") pod "keystone-9b21-account-create-update-cmcfl" (UID: "1da70aee-e1eb-4ad5-b0de-1e2f988dd729") : configmap "openstack-scripts" not found Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.456668 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7bd939d8-3b22-4496-acea-ac527f3e5149" (UID: "7bd939d8-3b22-4496-acea-ac527f3e5149"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.459197 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1c94c19c-3ccb-43cc-ab41-92baa3141f73" (UID: "1c94c19c-3ccb-43cc-ab41-92baa3141f73"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.460613 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7795071e-2de0-43cb-b225-cfed54570d94" (UID: "7795071e-2de0-43cb-b225-cfed54570d94"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.471035 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.474018 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.474575 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-4bcf-account-create-update-nv5hk"] Mar 11 12:22:26 crc kubenswrapper[4816]: E0311 12:22:26.478554 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4e6b0cc9909a80ea9f6820967069b2707e5bf48017858f5840e01461de16f0c2" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 11 12:22:26 crc kubenswrapper[4816]: E0311 12:22:26.485319 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4e6b0cc9909a80ea9f6820967069b2707e5bf48017858f5840e01461de16f0c2" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.488092 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-855897fd55-t7sfb" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.490194 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-bvvkj" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.490424 4816 scope.go:117] "RemoveContainer" containerID="c98a4983c1c555c8104fb916b00cb391571c199b1e9301413191c24c4a358d25" Mar 11 12:22:26 crc kubenswrapper[4816]: E0311 12:22:26.491583 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4e6b0cc9909a80ea9f6820967069b2707e5bf48017858f5840e01461de16f0c2" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 11 12:22:26 crc kubenswrapper[4816]: E0311 12:22:26.491687 4816 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc" containerName="nova-cell0-conductor-conductor" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.499560 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-scripts" (OuterVolumeSpecName: "scripts") pod "1c94c19c-3ccb-43cc-ab41-92baa3141f73" (UID: "1c94c19c-3ccb-43cc-ab41-92baa3141f73"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.500138 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c94c19c-3ccb-43cc-ab41-92baa3141f73-kube-api-access-g5svs" (OuterVolumeSpecName: "kube-api-access-g5svs") pod "1c94c19c-3ccb-43cc-ab41-92baa3141f73" (UID: "1c94c19c-3ccb-43cc-ab41-92baa3141f73"). InnerVolumeSpecName "kube-api-access-g5svs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.501670 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-4bcf-account-create-update-nv5hk"] Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.514000 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.519274 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7795071e-2de0-43cb-b225-cfed54570d94-kube-api-access-mgm8j" (OuterVolumeSpecName: "kube-api-access-mgm8j") pod "7795071e-2de0-43cb-b225-cfed54570d94" (UID: "7795071e-2de0-43cb-b225-cfed54570d94"). InnerVolumeSpecName "kube-api-access-mgm8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.526070 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.545596 4816 scope.go:117] "RemoveContainer" containerID="9dfd5d9de37a643541d7d99bf2ad8ffbb190d4d99b4400e1d3e559828813b764" Mar 11 12:22:26 crc kubenswrapper[4816]: E0311 12:22:26.558979 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dfd5d9de37a643541d7d99bf2ad8ffbb190d4d99b4400e1d3e559828813b764\": container with ID starting with 9dfd5d9de37a643541d7d99bf2ad8ffbb190d4d99b4400e1d3e559828813b764 not found: ID does not exist" containerID="9dfd5d9de37a643541d7d99bf2ad8ffbb190d4d99b4400e1d3e559828813b764" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.559063 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dfd5d9de37a643541d7d99bf2ad8ffbb190d4d99b4400e1d3e559828813b764"} err="failed to get container status \"9dfd5d9de37a643541d7d99bf2ad8ffbb190d4d99b4400e1d3e559828813b764\": rpc error: code = 
NotFound desc = could not find container \"9dfd5d9de37a643541d7d99bf2ad8ffbb190d4d99b4400e1d3e559828813b764\": container with ID starting with 9dfd5d9de37a643541d7d99bf2ad8ffbb190d4d99b4400e1d3e559828813b764 not found: ID does not exist" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.559113 4816 scope.go:117] "RemoveContainer" containerID="c98a4983c1c555c8104fb916b00cb391571c199b1e9301413191c24c4a358d25" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.559774 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-sg-core-conf-yaml\") pod \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.559847 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b79e89c6-5f56-4439-ad63-a86259d4ed29-logs\") pod \"b79e89c6-5f56-4439-ad63-a86259d4ed29\" (UID: \"b79e89c6-5f56-4439-ad63-a86259d4ed29\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.559950 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7457f2db-7979-4d92-bd90-a1464b8a3878-scripts\") pod \"7457f2db-7979-4d92-bd90-a1464b8a3878\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.560012 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bedb612d-0e22-4025-9151-d0cf7bc4ee42-run-httpd\") pod \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.560070 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rr9hs\" (UniqueName: 
\"kubernetes.io/projected/7457f2db-7979-4d92-bd90-a1464b8a3878-kube-api-access-rr9hs\") pod \"7457f2db-7979-4d92-bd90-a1464b8a3878\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.560103 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d28745d2-082d-4c99-90f0-b6c4696fb1a2-public-tls-certs\") pod \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\" (UID: \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.560142 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d28745d2-082d-4c99-90f0-b6c4696fb1a2-combined-ca-bundle\") pod \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\" (UID: \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.560177 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b79e89c6-5f56-4439-ad63-a86259d4ed29-config-data-custom\") pod \"b79e89c6-5f56-4439-ad63-a86259d4ed29\" (UID: \"b79e89c6-5f56-4439-ad63-a86259d4ed29\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.560266 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-ceilometer-tls-certs\") pod \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.560297 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqhs8\" (UniqueName: \"kubernetes.io/projected/d28745d2-082d-4c99-90f0-b6c4696fb1a2-kube-api-access-cqhs8\") pod \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\" (UID: \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\") " Mar 11 
12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.560323 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d28745d2-082d-4c99-90f0-b6c4696fb1a2-internal-tls-certs\") pod \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\" (UID: \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\") "
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.560362 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7457f2db-7979-4d92-bd90-a1464b8a3878-combined-ca-bundle\") pod \"7457f2db-7979-4d92-bd90-a1464b8a3878\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") "
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.560390 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b79e89c6-5f56-4439-ad63-a86259d4ed29-combined-ca-bundle\") pod \"b79e89c6-5f56-4439-ad63-a86259d4ed29\" (UID: \"b79e89c6-5f56-4439-ad63-a86259d4ed29\") "
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.560420 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bedb612d-0e22-4025-9151-d0cf7bc4ee42-log-httpd\") pod \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") "
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.560474 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-scripts\") pod \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") "
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.560521 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klbjm\" (UniqueName: \"kubernetes.io/projected/2d60557e-d939-46bf-8a60-641016b4d68d-kube-api-access-klbjm\") pod \"2d60557e-d939-46bf-8a60-641016b4d68d\" (UID: \"2d60557e-d939-46bf-8a60-641016b4d68d\") "
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.560562 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d60557e-d939-46bf-8a60-641016b4d68d-operator-scripts\") pod \"2d60557e-d939-46bf-8a60-641016b4d68d\" (UID: \"2d60557e-d939-46bf-8a60-641016b4d68d\") "
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.560657 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7457f2db-7979-4d92-bd90-a1464b8a3878-config-data\") pod \"7457f2db-7979-4d92-bd90-a1464b8a3878\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") "
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.560709 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-combined-ca-bundle\") pod \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") "
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.560743 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d28745d2-082d-4c99-90f0-b6c4696fb1a2-config-data\") pod \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\" (UID: \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\") "
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.560780 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-config-data\") pod \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") "
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.560812 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5r6l\" (UniqueName: \"kubernetes.io/projected/bedb612d-0e22-4025-9151-d0cf7bc4ee42-kube-api-access-g5r6l\") pod \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") "
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.560838 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7457f2db-7979-4d92-bd90-a1464b8a3878-httpd-run\") pod \"7457f2db-7979-4d92-bd90-a1464b8a3878\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") "
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.560866 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b79e89c6-5f56-4439-ad63-a86259d4ed29-config-data\") pod \"b79e89c6-5f56-4439-ad63-a86259d4ed29\" (UID: \"b79e89c6-5f56-4439-ad63-a86259d4ed29\") "
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.560925 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d28745d2-082d-4c99-90f0-b6c4696fb1a2-logs\") pod \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\" (UID: \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\") "
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.561001 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"7457f2db-7979-4d92-bd90-a1464b8a3878\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") "
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.561044 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7457f2db-7979-4d92-bd90-a1464b8a3878-logs\") pod \"7457f2db-7979-4d92-bd90-a1464b8a3878\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") "
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.561134 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4nrz\" (UniqueName: \"kubernetes.io/projected/b79e89c6-5f56-4439-ad63-a86259d4ed29-kube-api-access-v4nrz\") pod \"b79e89c6-5f56-4439-ad63-a86259d4ed29\" (UID: \"b79e89c6-5f56-4439-ad63-a86259d4ed29\") "
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.561188 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7457f2db-7979-4d92-bd90-a1464b8a3878-public-tls-certs\") pod \"7457f2db-7979-4d92-bd90-a1464b8a3878\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") "
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.561529 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24z58\" (UniqueName: \"kubernetes.io/projected/1da70aee-e1eb-4ad5-b0de-1e2f988dd729-kube-api-access-24z58\") pod \"keystone-9b21-account-create-update-cmcfl\" (UID: \"1da70aee-e1eb-4ad5-b0de-1e2f988dd729\") " pod="openstack/keystone-9b21-account-create-update-cmcfl"
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.561810 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgm8j\" (UniqueName: \"kubernetes.io/projected/7795071e-2de0-43cb-b225-cfed54570d94-kube-api-access-mgm8j\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.561840 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.561855 4816 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.561868 4816 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.561880 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-scripts\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.561934 4816 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7795071e-2de0-43cb-b225-cfed54570d94-logs\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.561950 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5svs\" (UniqueName: \"kubernetes.io/projected/1c94c19c-3ccb-43cc-ab41-92baa3141f73-kube-api-access-g5svs\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:26 crc kubenswrapper[4816]: E0311 12:22:26.562171 4816 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.562211 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bedb612d-0e22-4025-9151-d0cf7bc4ee42-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bedb612d-0e22-4025-9151-d0cf7bc4ee42" (UID: "bedb612d-0e22-4025-9151-d0cf7bc4ee42"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 12:22:26 crc kubenswrapper[4816]: E0311 12:22:26.562330 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-config-data podName:26aea2df-f497-478d-b953-060189ef2569 nodeName:}" failed. No retries permitted until 2026-03-11 12:22:34.56223497 +0000 UTC m=+1441.153498937 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-config-data") pod "rabbitmq-server-0" (UID: "26aea2df-f497-478d-b953-060189ef2569") : configmap "rabbitmq-config-data" not found
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.567328 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d60557e-d939-46bf-8a60-641016b4d68d-kube-api-access-klbjm" (OuterVolumeSpecName: "kube-api-access-klbjm") pod "2d60557e-d939-46bf-8a60-641016b4d68d" (UID: "2d60557e-d939-46bf-8a60-641016b4d68d"). InnerVolumeSpecName "kube-api-access-klbjm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:22:26 crc kubenswrapper[4816]: E0311 12:22:26.560008 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c98a4983c1c555c8104fb916b00cb391571c199b1e9301413191c24c4a358d25\": container with ID starting with c98a4983c1c555c8104fb916b00cb391571c199b1e9301413191c24c4a358d25 not found: ID does not exist" containerID="c98a4983c1c555c8104fb916b00cb391571c199b1e9301413191c24c4a358d25"
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.567401 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c98a4983c1c555c8104fb916b00cb391571c199b1e9301413191c24c4a358d25"} err="failed to get container status \"c98a4983c1c555c8104fb916b00cb391571c199b1e9301413191c24c4a358d25\": rpc error: code = NotFound desc = could not find container \"c98a4983c1c555c8104fb916b00cb391571c199b1e9301413191c24c4a358d25\": container with ID starting with c98a4983c1c555c8104fb916b00cb391571c199b1e9301413191c24c4a358d25 not found: ID does not exist"
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.567437 4816 scope.go:117] "RemoveContainer" containerID="05538dd985ad20fb55582d69a35b969743ae902043cfc0d0fe6e1bf963056eb2"
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.569086 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d60557e-d939-46bf-8a60-641016b4d68d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2d60557e-d939-46bf-8a60-641016b4d68d" (UID: "2d60557e-d939-46bf-8a60-641016b4d68d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.576111 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b79e89c6-5f56-4439-ad63-a86259d4ed29-logs" (OuterVolumeSpecName: "logs") pod "b79e89c6-5f56-4439-ad63-a86259d4ed29" (UID: "b79e89c6-5f56-4439-ad63-a86259d4ed29"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.585795 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bedb612d-0e22-4025-9151-d0cf7bc4ee42-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bedb612d-0e22-4025-9151-d0cf7bc4ee42" (UID: "bedb612d-0e22-4025-9151-d0cf7bc4ee42"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.585890 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d28745d2-082d-4c99-90f0-b6c4696fb1a2-logs" (OuterVolumeSpecName: "logs") pod "d28745d2-082d-4c99-90f0-b6c4696fb1a2" (UID: "d28745d2-082d-4c99-90f0-b6c4696fb1a2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.586028 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-snf5b"]
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.591235 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7457f2db-7979-4d92-bd90-a1464b8a3878-logs" (OuterVolumeSpecName: "logs") pod "7457f2db-7979-4d92-bd90-a1464b8a3878" (UID: "7457f2db-7979-4d92-bd90-a1464b8a3878"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.592908 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-scripts" (OuterVolumeSpecName: "scripts") pod "bedb612d-0e22-4025-9151-d0cf7bc4ee42" (UID: "bedb612d-0e22-4025-9151-d0cf7bc4ee42"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.595013 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-snf5b"]
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.595037 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7457f2db-7979-4d92-bd90-a1464b8a3878-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7457f2db-7979-4d92-bd90-a1464b8a3878" (UID: "7457f2db-7979-4d92-bd90-a1464b8a3878"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 12:22:26 crc kubenswrapper[4816]: E0311 12:22:26.600956 4816 projected.go:194] Error preparing data for projected volume kube-api-access-24z58 for pod openstack/keystone-9b21-account-create-update-cmcfl: failed to fetch token: serviceaccounts "galera-openstack" not found
Mar 11 12:22:26 crc kubenswrapper[4816]: E0311 12:22:26.601048 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1da70aee-e1eb-4ad5-b0de-1e2f988dd729-kube-api-access-24z58 podName:1da70aee-e1eb-4ad5-b0de-1e2f988dd729 nodeName:}" failed. No retries permitted until 2026-03-11 12:22:28.601021324 +0000 UTC m=+1435.192285351 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-24z58" (UniqueName: "kubernetes.io/projected/1da70aee-e1eb-4ad5-b0de-1e2f988dd729-kube-api-access-24z58") pod "keystone-9b21-account-create-update-cmcfl" (UID: "1da70aee-e1eb-4ad5-b0de-1e2f988dd729") : failed to fetch token: serviceaccounts "galera-openstack" not found
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.611691 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.618021 4816 scope.go:117] "RemoveContainer" containerID="494b7c934e67413331c33cbc35a1ab84e1195c496bffebeeb4ea4a3917bff191"
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.622448 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.630514 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7457f2db-7979-4d92-bd90-a1464b8a3878-scripts" (OuterVolumeSpecName: "scripts") pod "7457f2db-7979-4d92-bd90-a1464b8a3878" (UID: "7457f2db-7979-4d92-bd90-a1464b8a3878"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.630784 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d28745d2-082d-4c99-90f0-b6c4696fb1a2-kube-api-access-cqhs8" (OuterVolumeSpecName: "kube-api-access-cqhs8") pod "d28745d2-082d-4c99-90f0-b6c4696fb1a2" (UID: "d28745d2-082d-4c99-90f0-b6c4696fb1a2"). InnerVolumeSpecName "kube-api-access-cqhs8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.636539 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b79e89c6-5f56-4439-ad63-a86259d4ed29-kube-api-access-v4nrz" (OuterVolumeSpecName: "kube-api-access-v4nrz") pod "b79e89c6-5f56-4439-ad63-a86259d4ed29" (UID: "b79e89c6-5f56-4439-ad63-a86259d4ed29"). InnerVolumeSpecName "kube-api-access-v4nrz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.637293 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "7457f2db-7979-4d92-bd90-a1464b8a3878" (UID: "7457f2db-7979-4d92-bd90-a1464b8a3878"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.638631 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bedb612d-0e22-4025-9151-d0cf7bc4ee42-kube-api-access-g5r6l" (OuterVolumeSpecName: "kube-api-access-g5r6l") pod "bedb612d-0e22-4025-9151-d0cf7bc4ee42" (UID: "bedb612d-0e22-4025-9151-d0cf7bc4ee42"). InnerVolumeSpecName "kube-api-access-g5r6l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.640620 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.643452 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b79e89c6-5f56-4439-ad63-a86259d4ed29-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b79e89c6-5f56-4439-ad63-a86259d4ed29" (UID: "b79e89c6-5f56-4439-ad63-a86259d4ed29"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.648470 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.664322 4816 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b79e89c6-5f56-4439-ad63-a86259d4ed29-logs\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.664365 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7457f2db-7979-4d92-bd90-a1464b8a3878-scripts\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.664468 4816 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bedb612d-0e22-4025-9151-d0cf7bc4ee42-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.664577 4816 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b79e89c6-5f56-4439-ad63-a86259d4ed29-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.664593 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqhs8\" (UniqueName: \"kubernetes.io/projected/d28745d2-082d-4c99-90f0-b6c4696fb1a2-kube-api-access-cqhs8\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.664607 4816 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bedb612d-0e22-4025-9151-d0cf7bc4ee42-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.664620 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-scripts\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.664632 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klbjm\" (UniqueName: \"kubernetes.io/projected/2d60557e-d939-46bf-8a60-641016b4d68d-kube-api-access-klbjm\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.664643 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d60557e-d939-46bf-8a60-641016b4d68d-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.664655 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5r6l\" (UniqueName: \"kubernetes.io/projected/bedb612d-0e22-4025-9151-d0cf7bc4ee42-kube-api-access-g5r6l\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.664666 4816 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7457f2db-7979-4d92-bd90-a1464b8a3878-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.664677 4816 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d28745d2-082d-4c99-90f0-b6c4696fb1a2-logs\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.664968 4816 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" "
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.664990 4816 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7457f2db-7979-4d92-bd90-a1464b8a3878-logs\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.665003 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4nrz\" (UniqueName: \"kubernetes.io/projected/b79e89c6-5f56-4439-ad63-a86259d4ed29-kube-api-access-v4nrz\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.669908 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7457f2db-7979-4d92-bd90-a1464b8a3878-kube-api-access-rr9hs" (OuterVolumeSpecName: "kube-api-access-rr9hs") pod "7457f2db-7979-4d92-bd90-a1464b8a3878" (UID: "7457f2db-7979-4d92-bd90-a1464b8a3878"). InnerVolumeSpecName "kube-api-access-rr9hs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.692088 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7795071e-2de0-43cb-b225-cfed54570d94" (UID: "7795071e-2de0-43cb-b225-cfed54570d94"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.693043 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1c94c19c-3ccb-43cc-ab41-92baa3141f73" (UID: "1c94c19c-3ccb-43cc-ab41-92baa3141f73"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.693264 4816 scope.go:117] "RemoveContainer" containerID="05538dd985ad20fb55582d69a35b969743ae902043cfc0d0fe6e1bf963056eb2"
Mar 11 12:22:26 crc kubenswrapper[4816]: E0311 12:22:26.695924 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05538dd985ad20fb55582d69a35b969743ae902043cfc0d0fe6e1bf963056eb2\": container with ID starting with 05538dd985ad20fb55582d69a35b969743ae902043cfc0d0fe6e1bf963056eb2 not found: ID does not exist" containerID="05538dd985ad20fb55582d69a35b969743ae902043cfc0d0fe6e1bf963056eb2"
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.695963 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05538dd985ad20fb55582d69a35b969743ae902043cfc0d0fe6e1bf963056eb2"} err="failed to get container status \"05538dd985ad20fb55582d69a35b969743ae902043cfc0d0fe6e1bf963056eb2\": rpc error: code = NotFound desc = could not find container \"05538dd985ad20fb55582d69a35b969743ae902043cfc0d0fe6e1bf963056eb2\": container with ID starting with 05538dd985ad20fb55582d69a35b969743ae902043cfc0d0fe6e1bf963056eb2 not found: ID does not exist"
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.695987 4816 scope.go:117] "RemoveContainer" containerID="494b7c934e67413331c33cbc35a1ab84e1195c496bffebeeb4ea4a3917bff191"
Mar 11 12:22:26 crc kubenswrapper[4816]: E0311 12:22:26.699683 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"494b7c934e67413331c33cbc35a1ab84e1195c496bffebeeb4ea4a3917bff191\": container with ID starting with 494b7c934e67413331c33cbc35a1ab84e1195c496bffebeeb4ea4a3917bff191 not found: ID does not exist" containerID="494b7c934e67413331c33cbc35a1ab84e1195c496bffebeeb4ea4a3917bff191"
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.699927 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"494b7c934e67413331c33cbc35a1ab84e1195c496bffebeeb4ea4a3917bff191"} err="failed to get container status \"494b7c934e67413331c33cbc35a1ab84e1195c496bffebeeb4ea4a3917bff191\": rpc error: code = NotFound desc = could not find container \"494b7c934e67413331c33cbc35a1ab84e1195c496bffebeeb4ea4a3917bff191\": container with ID starting with 494b7c934e67413331c33cbc35a1ab84e1195c496bffebeeb4ea4a3917bff191 not found: ID does not exist"
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.699975 4816 scope.go:117] "RemoveContainer" containerID="87aa2b1e91cb5a822ed7cf28348c0737eb6cfc59a0a44ee9905ee11d4719f35c"
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.705487 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.718153 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.737612 4816 scope.go:117] "RemoveContainer" containerID="87aa2b1e91cb5a822ed7cf28348c0737eb6cfc59a0a44ee9905ee11d4719f35c"
Mar 11 12:22:26 crc kubenswrapper[4816]: E0311 12:22:26.738205 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87aa2b1e91cb5a822ed7cf28348c0737eb6cfc59a0a44ee9905ee11d4719f35c\": container with ID starting with 87aa2b1e91cb5a822ed7cf28348c0737eb6cfc59a0a44ee9905ee11d4719f35c not found: ID does not exist" containerID="87aa2b1e91cb5a822ed7cf28348c0737eb6cfc59a0a44ee9905ee11d4719f35c"
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.738342 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87aa2b1e91cb5a822ed7cf28348c0737eb6cfc59a0a44ee9905ee11d4719f35c"} err="failed to get container status \"87aa2b1e91cb5a822ed7cf28348c0737eb6cfc59a0a44ee9905ee11d4719f35c\": rpc error: code = NotFound desc = could not find container \"87aa2b1e91cb5a822ed7cf28348c0737eb6cfc59a0a44ee9905ee11d4719f35c\": container with ID starting with 87aa2b1e91cb5a822ed7cf28348c0737eb6cfc59a0a44ee9905ee11d4719f35c not found: ID does not exist"
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.738392 4816 scope.go:117] "RemoveContainer" containerID="6309388e250c5434fd6b39ddce96cacd594c9880dd57d2c9e89074cac30a961b"
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.767234 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rr9hs\" (UniqueName: \"kubernetes.io/projected/7457f2db-7979-4d92-bd90-a1464b8a3878-kube-api-access-rr9hs\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.767301 4816 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.767313 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.779073 4816 scope.go:117] "RemoveContainer" containerID="3acd68e155620ecc4260fb5ba2dfe8af8d211b5066fc4c67c7f8658e47beb43f"
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.805558 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d28745d2-082d-4c99-90f0-b6c4696fb1a2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d28745d2-082d-4c99-90f0-b6c4696fb1a2" (UID: "d28745d2-082d-4c99-90f0-b6c4696fb1a2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.816434 4816 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc"
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.834406 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7795071e-2de0-43cb-b225-cfed54570d94" (UID: "7795071e-2de0-43cb-b225-cfed54570d94"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.868321 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5ffd6fb588-7hftz"]
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.869240 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d28745d2-082d-4c99-90f0-b6c4696fb1a2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.871845 4816 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.871865 4816 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.871773 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7457f2db-7979-4d92-bd90-a1464b8a3878-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7457f2db-7979-4d92-bd90-a1464b8a3878" (UID: "7457f2db-7979-4d92-bd90-a1464b8a3878"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.879841 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-5ffd6fb588-7hftz"]
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.890096 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d28745d2-082d-4c99-90f0-b6c4696fb1a2-config-data" (OuterVolumeSpecName: "config-data") pod "d28745d2-082d-4c99-90f0-b6c4696fb1a2" (UID: "d28745d2-082d-4c99-90f0-b6c4696fb1a2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:22:26 crc kubenswrapper[4816]: E0311 12:22:26.919726 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c63ed4d8962eaade5fdd56e19833812eb68982f5e9c4239e8a03e5077a42a492" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"]
Mar 11 12:22:26 crc kubenswrapper[4816]: E0311 12:22:26.924540 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c63ed4d8962eaade5fdd56e19833812eb68982f5e9c4239e8a03e5077a42a492" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"]
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.925646 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1c94c19c-3ccb-43cc-ab41-92baa3141f73" (UID: "1c94c19c-3ccb-43cc-ab41-92baa3141f73"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:22:26 crc kubenswrapper[4816]: E0311 12:22:26.927153 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c63ed4d8962eaade5fdd56e19833812eb68982f5e9c4239e8a03e5077a42a492" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"]
Mar 11 12:22:26 crc kubenswrapper[4816]: E0311 12:22:26.927246 4816 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="da177cde-6332-4562-809a-d4bee453cebf" containerName="galera"
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.929270 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7795071e-2de0-43cb-b225-cfed54570d94" (UID: "7795071e-2de0-43cb-b225-cfed54570d94"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.972349 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7457f2db-7979-4d92-bd90-a1464b8a3878-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7457f2db-7979-4d92-bd90-a1464b8a3878" (UID: "7457f2db-7979-4d92-bd90-a1464b8a3878"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.974441 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7457f2db-7979-4d92-bd90-a1464b8a3878-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.974471 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d28745d2-082d-4c99-90f0-b6c4696fb1a2-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.974484 4816 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.974494 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.974507 4816 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7457f2db-7979-4d92-bd90-a1464b8a3878-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.003868 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "bedb612d-0e22-4025-9151-d0cf7bc4ee42" (UID: "bedb612d-0e22-4025-9151-d0cf7bc4ee42"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.005762 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b79e89c6-5f56-4439-ad63-a86259d4ed29-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b79e89c6-5f56-4439-ad63-a86259d4ed29" (UID: "b79e89c6-5f56-4439-ad63-a86259d4ed29"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.013093 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-config-data" (OuterVolumeSpecName: "config-data") pod "7795071e-2de0-43cb-b225-cfed54570d94" (UID: "7795071e-2de0-43cb-b225-cfed54570d94"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.017272 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d28745d2-082d-4c99-90f0-b6c4696fb1a2-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d28745d2-082d-4c99-90f0-b6c4696fb1a2" (UID: "d28745d2-082d-4c99-90f0-b6c4696fb1a2"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.023896 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bedb612d-0e22-4025-9151-d0cf7bc4ee42" (UID: "bedb612d-0e22-4025-9151-d0cf7bc4ee42"). InnerVolumeSpecName "combined-ca-bundle".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.024321 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bedb612d-0e22-4025-9151-d0cf7bc4ee42" (UID: "bedb612d-0e22-4025-9151-d0cf7bc4ee42"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.032571 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7457f2db-7979-4d92-bd90-a1464b8a3878-config-data" (OuterVolumeSpecName: "config-data") pod "7457f2db-7979-4d92-bd90-a1464b8a3878" (UID: "7457f2db-7979-4d92-bd90-a1464b8a3878"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.036898 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1c94c19c-3ccb-43cc-ab41-92baa3141f73" (UID: "1c94c19c-3ccb-43cc-ab41-92baa3141f73"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.039162 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-config-data" (OuterVolumeSpecName: "config-data") pod "1c94c19c-3ccb-43cc-ab41-92baa3141f73" (UID: "1c94c19c-3ccb-43cc-ab41-92baa3141f73"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.053776 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d28745d2-082d-4c99-90f0-b6c4696fb1a2-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d28745d2-082d-4c99-90f0-b6c4696fb1a2" (UID: "d28745d2-082d-4c99-90f0-b6c4696fb1a2"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.076231 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.076276 4816 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d28745d2-082d-4c99-90f0-b6c4696fb1a2-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.076288 4816 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.076297 4816 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d28745d2-082d-4c99-90f0-b6c4696fb1a2-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.076306 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b79e89c6-5f56-4439-ad63-a86259d4ed29-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.076316 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7457f2db-7979-4d92-bd90-a1464b8a3878-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.076324 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.076332 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.076341 4816 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.076349 4816 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.084260 4816 generic.go:334] "Generic (PLEG): container finished" podID="63567eba-cc2a-4168-9e81-51c1daed5482" containerID="adc85e912176222f128333dea158980c88ef84553f1cf56cb52f64a7b64c83d6" exitCode=0 Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.084508 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"63567eba-cc2a-4168-9e81-51c1daed5482","Type":"ContainerDied","Data":"adc85e912176222f128333dea158980c88ef84553f1cf56cb52f64a7b64c83d6"} Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.084786 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"63567eba-cc2a-4168-9e81-51c1daed5482","Type":"ContainerDied","Data":"188caecd38c19e3561f318ca76a8032bcaad31be23f5090529c90fb8dfd7f7e7"} Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.084865 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="188caecd38c19e3561f318ca76a8032bcaad31be23f5090529c90fb8dfd7f7e7" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.085448 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b79e89c6-5f56-4439-ad63-a86259d4ed29-config-data" (OuterVolumeSpecName: "config-data") pod "b79e89c6-5f56-4439-ad63-a86259d4ed29" (UID: "b79e89c6-5f56-4439-ad63-a86259d4ed29"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.095620 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.095645 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bedb612d-0e22-4025-9151-d0cf7bc4ee42","Type":"ContainerDied","Data":"c1f12afb3ed2335d5b28ac089b50b4a7d4f0e38f3d3c1e7e1f537108eabd58b9"} Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.098289 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-config-data" (OuterVolumeSpecName: "config-data") pod "bedb612d-0e22-4025-9151-d0cf7bc4ee42" (UID: "bedb612d-0e22-4025-9151-d0cf7bc4ee42"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.109846 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bvvkj" event={"ID":"2d60557e-d939-46bf-8a60-641016b4d68d","Type":"ContainerDied","Data":"8ea9826afd6446a559af78c72ed8d7f368b8a030b60ae6f7af907a7806773c5c"} Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.109906 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-bvvkj" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.112496 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7457f2db-7979-4d92-bd90-a1464b8a3878","Type":"ContainerDied","Data":"722d37999c6fc7f3ffe4d8bb991503dcb67968fd32e0b13507be34c65c4fb635"} Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.112696 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.135445 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-855897fd55-t7sfb" event={"ID":"b79e89c6-5f56-4439-ad63-a86259d4ed29","Type":"ContainerDied","Data":"65e8dd7e6335c0228a44e94f23c28e5cede1dd965bd20e6b4cf61bc69bb5386a"} Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.135514 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-855897fd55-t7sfb" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.150934 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"5030028c-f574-4334-a837-2430761524b4","Type":"ContainerDied","Data":"0b4c4c1c298f57878044bac49cc49a719acfc3a0f87a1803c19c539d85446637"} Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.150958 4816 generic.go:334] "Generic (PLEG): container finished" podID="5030028c-f574-4334-a837-2430761524b4" containerID="0b4c4c1c298f57878044bac49cc49a719acfc3a0f87a1803c19c539d85446637" exitCode=0 Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.151139 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-9b21-account-create-update-cmcfl" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.151129 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"5030028c-f574-4334-a837-2430761524b4","Type":"ContainerDied","Data":"6419a001ec72fffb18fae89ec5268f12610ae0c656da26d1ec1980d99bf8c731"} Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.151191 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.151198 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-64b59f8d4-2vxd9" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.151199 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6419a001ec72fffb18fae89ec5268f12610ae0c656da26d1ec1980d99bf8c731" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.151925 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.179805 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.179830 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b79e89c6-5f56-4439-ad63-a86259d4ed29-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.231797 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.260417 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.309565 4816 scope.go:117] "RemoveContainer" containerID="08358819a244a822957b7c7153f37ef3fa2c0371fe913be221e0cf6e09e89054" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.331009 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.337179 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.370534 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-bvvkj"] Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.371251 4816 scope.go:117] "RemoveContainer" containerID="90224f5e31cd4408489a5dec30ffa77147f611b179c23e40a3d0104504542a1b" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.380938 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-bvvkj"] Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.382148 4816 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7gcd\" (UniqueName: \"kubernetes.io/projected/5030028c-f574-4334-a837-2430761524b4-kube-api-access-d7gcd\") pod \"5030028c-f574-4334-a837-2430761524b4\" (UID: \"5030028c-f574-4334-a837-2430761524b4\") " Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.382179 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5030028c-f574-4334-a837-2430761524b4-kolla-config\") pod \"5030028c-f574-4334-a837-2430761524b4\" (UID: \"5030028c-f574-4334-a837-2430761524b4\") " Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.382250 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5030028c-f574-4334-a837-2430761524b4-combined-ca-bundle\") pod \"5030028c-f574-4334-a837-2430761524b4\" (UID: \"5030028c-f574-4334-a837-2430761524b4\") " Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.382409 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63567eba-cc2a-4168-9e81-51c1daed5482-combined-ca-bundle\") pod \"63567eba-cc2a-4168-9e81-51c1daed5482\" (UID: \"63567eba-cc2a-4168-9e81-51c1daed5482\") " Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.382568 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5030028c-f574-4334-a837-2430761524b4-memcached-tls-certs\") pod \"5030028c-f574-4334-a837-2430761524b4\" (UID: \"5030028c-f574-4334-a837-2430761524b4\") " Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.382646 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcvt9\" (UniqueName: 
\"kubernetes.io/projected/63567eba-cc2a-4168-9e81-51c1daed5482-kube-api-access-jcvt9\") pod \"63567eba-cc2a-4168-9e81-51c1daed5482\" (UID: \"63567eba-cc2a-4168-9e81-51c1daed5482\") " Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.382682 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5030028c-f574-4334-a837-2430761524b4-config-data\") pod \"5030028c-f574-4334-a837-2430761524b4\" (UID: \"5030028c-f574-4334-a837-2430761524b4\") " Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.382727 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63567eba-cc2a-4168-9e81-51c1daed5482-config-data\") pod \"63567eba-cc2a-4168-9e81-51c1daed5482\" (UID: \"63567eba-cc2a-4168-9e81-51c1daed5482\") " Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.383194 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5030028c-f574-4334-a837-2430761524b4-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "5030028c-f574-4334-a837-2430761524b4" (UID: "5030028c-f574-4334-a837-2430761524b4"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.384314 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5030028c-f574-4334-a837-2430761524b4-config-data" (OuterVolumeSpecName: "config-data") pod "5030028c-f574-4334-a837-2430761524b4" (UID: "5030028c-f574-4334-a837-2430761524b4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.458283 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5030028c-f574-4334-a837-2430761524b4-kube-api-access-d7gcd" (OuterVolumeSpecName: "kube-api-access-d7gcd") pod "5030028c-f574-4334-a837-2430761524b4" (UID: "5030028c-f574-4334-a837-2430761524b4"). InnerVolumeSpecName "kube-api-access-d7gcd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.460639 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63567eba-cc2a-4168-9e81-51c1daed5482-kube-api-access-jcvt9" (OuterVolumeSpecName: "kube-api-access-jcvt9") pod "63567eba-cc2a-4168-9e81-51c1daed5482" (UID: "63567eba-cc2a-4168-9e81-51c1daed5482"). InnerVolumeSpecName "kube-api-access-jcvt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.473563 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63567eba-cc2a-4168-9e81-51c1daed5482-config-data" (OuterVolumeSpecName: "config-data") pod "63567eba-cc2a-4168-9e81-51c1daed5482" (UID: "63567eba-cc2a-4168-9e81-51c1daed5482"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.487484 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcvt9\" (UniqueName: \"kubernetes.io/projected/63567eba-cc2a-4168-9e81-51c1daed5482-kube-api-access-jcvt9\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.487517 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5030028c-f574-4334-a837-2430761524b4-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.487529 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63567eba-cc2a-4168-9e81-51c1daed5482-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.487541 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7gcd\" (UniqueName: \"kubernetes.io/projected/5030028c-f574-4334-a837-2430761524b4-kube-api-access-d7gcd\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.487552 4816 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5030028c-f574-4334-a837-2430761524b4-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.531519 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-9b21-account-create-update-cmcfl"] Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.553778 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63567eba-cc2a-4168-9e81-51c1daed5482-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "63567eba-cc2a-4168-9e81-51c1daed5482" (UID: "63567eba-cc2a-4168-9e81-51c1daed5482"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.560185 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5030028c-f574-4334-a837-2430761524b4-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "5030028c-f574-4334-a837-2430761524b4" (UID: "5030028c-f574-4334-a837-2430761524b4"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.589730 4816 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5030028c-f574-4334-a837-2430761524b4-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.589767 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63567eba-cc2a-4168-9e81-51c1daed5482-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.615487 4816 scope.go:117] "RemoveContainer" containerID="a80048ca909856187d3fa5dac7b542ba5ca3c8dbcb582537e0f884c753db4809" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.630395 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5030028c-f574-4334-a837-2430761524b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5030028c-f574-4334-a837-2430761524b4" (UID: "5030028c-f574-4334-a837-2430761524b4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.679087 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-9b21-account-create-update-cmcfl"] Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.691187 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5030028c-f574-4334-a837-2430761524b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.696972 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.706373 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.710607 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-855897fd55-t7sfb"] Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.719342 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-855897fd55-t7sfb"] Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.728388 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-64b59f8d4-2vxd9"] Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.733149 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-64b59f8d4-2vxd9"] Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.744326 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.751439 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.760877 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:22:27 crc kubenswrapper[4816]: E0311 
12:22:27.765057 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="60b94a07b73cb13c7f413f3784714ffd08edfbf819bae0acb651dd949e911744" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.774925 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.794727 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24z58\" (UniqueName: \"kubernetes.io/projected/1da70aee-e1eb-4ad5-b0de-1e2f988dd729-kube-api-access-24z58\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.794765 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1da70aee-e1eb-4ad5-b0de-1e2f988dd729-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:27 crc kubenswrapper[4816]: E0311 12:22:27.802462 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="60b94a07b73cb13c7f413f3784714ffd08edfbf819bae0acb651dd949e911744" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 11 12:22:27 crc kubenswrapper[4816]: E0311 12:22:27.829058 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="60b94a07b73cb13c7f413f3784714ffd08edfbf819bae0acb651dd949e911744" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 11 12:22:27 crc kubenswrapper[4816]: E0311 12:22:27.829152 4816 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register 
an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="41f4b502-b85f-488c-b55b-27a31479df68" containerName="nova-scheduler-scheduler" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.875382 4816 scope.go:117] "RemoveContainer" containerID="824fd644293ef663ba362cace1b788aa52143866b3de49d3b2f15202714957b5" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.980150 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.023987 4816 scope.go:117] "RemoveContainer" containerID="a8b3b1241d87a2bc94cda4c45011262eeb879b9fb212362f754599d92ce27242" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.098366 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da177cde-6332-4562-809a-d4bee453cebf-combined-ca-bundle\") pod \"da177cde-6332-4562-809a-d4bee453cebf\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.099379 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/da177cde-6332-4562-809a-d4bee453cebf-galera-tls-certs\") pod \"da177cde-6332-4562-809a-d4bee453cebf\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.099529 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da177cde-6332-4562-809a-d4bee453cebf-operator-scripts\") pod \"da177cde-6332-4562-809a-d4bee453cebf\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.099608 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/da177cde-6332-4562-809a-d4bee453cebf-kolla-config\") pod \"da177cde-6332-4562-809a-d4bee453cebf\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.100564 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da177cde-6332-4562-809a-d4bee453cebf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "da177cde-6332-4562-809a-d4bee453cebf" (UID: "da177cde-6332-4562-809a-d4bee453cebf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.104175 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da177cde-6332-4562-809a-d4bee453cebf-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "da177cde-6332-4562-809a-d4bee453cebf" (UID: "da177cde-6332-4562-809a-d4bee453cebf"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.104340 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da177cde-6332-4562-809a-d4bee453cebf-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "da177cde-6332-4562-809a-d4bee453cebf" (UID: "da177cde-6332-4562-809a-d4bee453cebf"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.099639 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/da177cde-6332-4562-809a-d4bee453cebf-config-data-default\") pod \"da177cde-6332-4562-809a-d4bee453cebf\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.104543 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txrmx\" (UniqueName: \"kubernetes.io/projected/da177cde-6332-4562-809a-d4bee453cebf-kube-api-access-txrmx\") pod \"da177cde-6332-4562-809a-d4bee453cebf\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.104624 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/da177cde-6332-4562-809a-d4bee453cebf-config-data-generated\") pod \"da177cde-6332-4562-809a-d4bee453cebf\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.104682 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"da177cde-6332-4562-809a-d4bee453cebf\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.105274 4816 scope.go:117] "RemoveContainer" containerID="84c64e2c11b5a33088d3e50d684b62246b9937fb898429fa525cc6fb739d9015" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.105807 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da177cde-6332-4562-809a-d4bee453cebf-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "da177cde-6332-4562-809a-d4bee453cebf" (UID: 
"da177cde-6332-4562-809a-d4bee453cebf"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.107120 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da177cde-6332-4562-809a-d4bee453cebf-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.107197 4816 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/da177cde-6332-4562-809a-d4bee453cebf-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.107210 4816 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/da177cde-6332-4562-809a-d4bee453cebf-config-data-default\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.107222 4816 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/da177cde-6332-4562-809a-d4bee453cebf-config-data-generated\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.109121 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_c71feeeb-a44d-42ec-a4c7-ddbf9a76f825/ovn-northd/0.log" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.109273 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.113600 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da177cde-6332-4562-809a-d4bee453cebf-kube-api-access-txrmx" (OuterVolumeSpecName: "kube-api-access-txrmx") pod "da177cde-6332-4562-809a-d4bee453cebf" (UID: "da177cde-6332-4562-809a-d4bee453cebf"). 
InnerVolumeSpecName "kube-api-access-txrmx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.138451 4816 scope.go:117] "RemoveContainer" containerID="8ba3c9d212f5a9f10887e454eabe42340558258c07c8285eb982b69803aa3749" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.151698 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "mysql-db") pod "da177cde-6332-4562-809a-d4bee453cebf" (UID: "da177cde-6332-4562-809a-d4bee453cebf"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.165027 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c94c19c-3ccb-43cc-ab41-92baa3141f73" path="/var/lib/kubelet/pods/1c94c19c-3ccb-43cc-ab41-92baa3141f73/volumes" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.165963 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1da70aee-e1eb-4ad5-b0de-1e2f988dd729" path="/var/lib/kubelet/pods/1da70aee-e1eb-4ad5-b0de-1e2f988dd729/volumes" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.166440 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d60557e-d939-46bf-8a60-641016b4d68d" path="/var/lib/kubelet/pods/2d60557e-d939-46bf-8a60-641016b4d68d/volumes" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.167166 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32dcc96b-186a-444d-bef3-4c5f117ee652" path="/var/lib/kubelet/pods/32dcc96b-186a-444d-bef3-4c5f117ee652/volumes" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.168497 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb" path="/var/lib/kubelet/pods/3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb/volumes" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 
12:22:28.169129 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7457f2db-7979-4d92-bd90-a1464b8a3878" path="/var/lib/kubelet/pods/7457f2db-7979-4d92-bd90-a1464b8a3878/volumes" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.170062 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da177cde-6332-4562-809a-d4bee453cebf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da177cde-6332-4562-809a-d4bee453cebf" (UID: "da177cde-6332-4562-809a-d4bee453cebf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.170371 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7795071e-2de0-43cb-b225-cfed54570d94" path="/var/lib/kubelet/pods/7795071e-2de0-43cb-b225-cfed54570d94/volumes" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.171929 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bd939d8-3b22-4496-acea-ac527f3e5149" path="/var/lib/kubelet/pods/7bd939d8-3b22-4496-acea-ac527f3e5149/volumes" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.172870 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d73d9d0-5632-47a3-93e0-899f64f51011" path="/var/lib/kubelet/pods/7d73d9d0-5632-47a3-93e0-899f64f51011/volumes" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.174016 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a22173f-147b-46ac-bb01-596fe9f12b10" path="/var/lib/kubelet/pods/9a22173f-147b-46ac-bb01-596fe9f12b10/volumes" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.177636 4816 generic.go:334] "Generic (PLEG): container finished" podID="26aea2df-f497-478d-b953-060189ef2569" containerID="0735cf7e4268f5297289dcfc433ce805028b2098230211ba63ceb121fac25ec7" exitCode=0 Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.178741 4816 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="b1dd25da-51d6-45f0-b70c-f1baa17d2da3" path="/var/lib/kubelet/pods/b1dd25da-51d6-45f0-b70c-f1baa17d2da3/volumes" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.180333 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b79e89c6-5f56-4439-ad63-a86259d4ed29" path="/var/lib/kubelet/pods/b79e89c6-5f56-4439-ad63-a86259d4ed29/volumes" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.180902 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_c71feeeb-a44d-42ec-a4c7-ddbf9a76f825/ovn-northd/0.log" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.180929 4816 generic.go:334] "Generic (PLEG): container finished" podID="c71feeeb-a44d-42ec-a4c7-ddbf9a76f825" containerID="8a2953b83fad75911a9aa3b9b53086764c650fc4022cbafe1b2e60fde2fe5be7" exitCode=139 Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.181010 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.181299 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bedb612d-0e22-4025-9151-d0cf7bc4ee42" path="/var/lib/kubelet/pods/bedb612d-0e22-4025-9151-d0cf7bc4ee42/volumes" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.190462 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da177cde-6332-4562-809a-d4bee453cebf-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "da177cde-6332-4562-809a-d4bee453cebf" (UID: "da177cde-6332-4562-809a-d4bee453cebf"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.190508 4816 scope.go:117] "RemoveContainer" containerID="c020c8caff09b112c5e61167611361a425a1b4a92367fbbd7dbf97390e021cca" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.191038 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d28745d2-082d-4c99-90f0-b6c4696fb1a2" path="/var/lib/kubelet/pods/d28745d2-082d-4c99-90f0-b6c4696fb1a2/volumes" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.193033 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e95ddca0-76d0-4dce-9983-4b07655adc25" path="/var/lib/kubelet/pods/e95ddca0-76d0-4dce-9983-4b07655adc25/volumes" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.205999 4816 generic.go:334] "Generic (PLEG): container finished" podID="da177cde-6332-4562-809a-d4bee453cebf" containerID="c63ed4d8962eaade5fdd56e19833812eb68982f5e9c4239e8a03e5077a42a492" exitCode=0 Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.206588 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.206200 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"26aea2df-f497-478d-b953-060189ef2569","Type":"ContainerDied","Data":"0735cf7e4268f5297289dcfc433ce805028b2098230211ba63ceb121fac25ec7"} Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.207428 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825","Type":"ContainerDied","Data":"8a2953b83fad75911a9aa3b9b53086764c650fc4022cbafe1b2e60fde2fe5be7"} Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.207466 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825","Type":"ContainerDied","Data":"25d4f9ece0205331680bd83d3d312fa201b0497bc9a8a61346652664c99b99e2"} Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.207479 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"da177cde-6332-4562-809a-d4bee453cebf","Type":"ContainerDied","Data":"c63ed4d8962eaade5fdd56e19833812eb68982f5e9c4239e8a03e5077a42a492"} Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.207492 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"da177cde-6332-4562-809a-d4bee453cebf","Type":"ContainerDied","Data":"c1304c6acbe0151fcfd1f27a9fb0f616c29bb18a4876bb3def66924a603536ea"} Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.207603 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.211499 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.211915 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-ovn-northd-tls-certs\") pod \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.213510 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-scripts\") pod \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.214110 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hft4\" (UniqueName: \"kubernetes.io/projected/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-kube-api-access-7hft4\") pod \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.214333 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-ovn-rundir\") pod \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.214464 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-config\") pod \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.214571 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-combined-ca-bundle\") pod \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.214684 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-metrics-certs-tls-certs\") pod \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.215200 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da177cde-6332-4562-809a-d4bee453cebf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.215648 4816 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/da177cde-6332-4562-809a-d4bee453cebf-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.216111 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txrmx\" (UniqueName: \"kubernetes.io/projected/da177cde-6332-4562-809a-d4bee453cebf-kube-api-access-txrmx\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.216323 4816 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.219279 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-scripts" (OuterVolumeSpecName: "scripts") pod "c71feeeb-a44d-42ec-a4c7-ddbf9a76f825" (UID: "c71feeeb-a44d-42ec-a4c7-ddbf9a76f825"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.220978 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "c71feeeb-a44d-42ec-a4c7-ddbf9a76f825" (UID: "c71feeeb-a44d-42ec-a4c7-ddbf9a76f825"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.227411 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-config" (OuterVolumeSpecName: "config") pod "c71feeeb-a44d-42ec-a4c7-ddbf9a76f825" (UID: "c71feeeb-a44d-42ec-a4c7-ddbf9a76f825"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.231993 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-kube-api-access-7hft4" (OuterVolumeSpecName: "kube-api-access-7hft4") pod "c71feeeb-a44d-42ec-a4c7-ddbf9a76f825" (UID: "c71feeeb-a44d-42ec-a4c7-ddbf9a76f825"). InnerVolumeSpecName "kube-api-access-7hft4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.249715 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c71feeeb-a44d-42ec-a4c7-ddbf9a76f825" (UID: "c71feeeb-a44d-42ec-a4c7-ddbf9a76f825"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.257763 4816 scope.go:117] "RemoveContainer" containerID="4e741a528a024acf7a27b5a7253bef28cff4a22ea41c625ba24158e8c7be76eb" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.261977 4816 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.295767 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "c71feeeb-a44d-42ec-a4c7-ddbf9a76f825" (UID: "c71feeeb-a44d-42ec-a4c7-ddbf9a76f825"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.312074 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "c71feeeb-a44d-42ec-a4c7-ddbf9a76f825" (UID: "c71feeeb-a44d-42ec-a4c7-ddbf9a76f825"). InnerVolumeSpecName "ovn-northd-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.319231 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.319510 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hft4\" (UniqueName: \"kubernetes.io/projected/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-kube-api-access-7hft4\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.319597 4816 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.319826 4816 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-ovn-rundir\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.320104 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.320806 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.321018 4816 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.321191 4816 reconciler_common.go:293] 
"Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.361198 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.366837 4816 scope.go:117] "RemoveContainer" containerID="f675def681ebf7bc955ad7437f5bae6532f22f4db4a832aa48a182650e749af2" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.376476 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.394451 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.418202 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.428600 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.437839 4816 scope.go:117] "RemoveContainer" containerID="6e5d751e1033e9d4aef5824d4c13d38308132b4b6b9a60ec26d78186a278dab7" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.482904 4816 scope.go:117] "RemoveContainer" containerID="8a2953b83fad75911a9aa3b9b53086764c650fc4022cbafe1b2e60fde2fe5be7" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.526145 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/26aea2df-f497-478d-b953-060189ef2569-erlang-cookie-secret\") pod \"26aea2df-f497-478d-b953-060189ef2569\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.526229 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/26aea2df-f497-478d-b953-060189ef2569-rabbitmq-erlang-cookie\") pod \"26aea2df-f497-478d-b953-060189ef2569\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.526506 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/26aea2df-f497-478d-b953-060189ef2569-rabbitmq-tls\") pod \"26aea2df-f497-478d-b953-060189ef2569\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.526543 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/26aea2df-f497-478d-b953-060189ef2569-rabbitmq-confd\") pod \"26aea2df-f497-478d-b953-060189ef2569\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.526568 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/26aea2df-f497-478d-b953-060189ef2569-rabbitmq-plugins\") pod \"26aea2df-f497-478d-b953-060189ef2569\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " Mar 11 12:22:28 crc kubenswrapper[4816]: E0311 12:22:28.518944 4816 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63567eba_cc2a_4168_9e81_51c1daed5482.slice/crio-188caecd38c19e3561f318ca76a8032bcaad31be23f5090529c90fb8dfd7f7e7\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda177cde_6332_4562_809a_d4bee453cebf.slice/crio-c1304c6acbe0151fcfd1f27a9fb0f616c29bb18a4876bb3def66924a603536ea\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5030028c_f574_4334_a837_2430761524b4.slice/crio-6419a001ec72fffb18fae89ec5268f12610ae0c656da26d1ec1980d99bf8c731\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c180505_72c6_498d_bfa5_05f689692bd2.slice/crio-40d439392c989a322c37ef2903e2b84825cbedf2d8b6499b35bfc3bb665a65b8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5030028c_f574_4334_a837_2430761524b4.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c180505_72c6_498d_bfa5_05f689692bd2.slice/crio-conmon-40d439392c989a322c37ef2903e2b84825cbedf2d8b6499b35bfc3bb665a65b8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63567eba_cc2a_4168_9e81_51c1daed5482.slice\": RecentStats: unable to find data in memory cache]" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.529073 4816 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26aea2df-f497-478d-b953-060189ef2569-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "26aea2df-f497-478d-b953-060189ef2569" (UID: "26aea2df-f497-478d-b953-060189ef2569"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.529771 4816 scope.go:117] "RemoveContainer" containerID="6e5d751e1033e9d4aef5824d4c13d38308132b4b6b9a60ec26d78186a278dab7" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.526610 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-config-data\") pod \"26aea2df-f497-478d-b953-060189ef2569\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.530093 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/26aea2df-f497-478d-b953-060189ef2569-pod-info\") pod \"26aea2df-f497-478d-b953-060189ef2569\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.530181 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"26aea2df-f497-478d-b953-060189ef2569\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.530369 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-server-conf\") pod \"26aea2df-f497-478d-b953-060189ef2569\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.530472 4816 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dv8dl\" (UniqueName: \"kubernetes.io/projected/26aea2df-f497-478d-b953-060189ef2569-kube-api-access-dv8dl\") pod \"26aea2df-f497-478d-b953-060189ef2569\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.530510 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-plugins-conf\") pod \"26aea2df-f497-478d-b953-060189ef2569\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.531241 4816 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/26aea2df-f497-478d-b953-060189ef2569-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.532610 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26aea2df-f497-478d-b953-060189ef2569-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "26aea2df-f497-478d-b953-060189ef2569" (UID: "26aea2df-f497-478d-b953-060189ef2569"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.532942 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "26aea2df-f497-478d-b953-060189ef2569" (UID: "26aea2df-f497-478d-b953-060189ef2569"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.538664 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "26aea2df-f497-478d-b953-060189ef2569" (UID: "26aea2df-f497-478d-b953-060189ef2569"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 11 12:22:28 crc kubenswrapper[4816]: E0311 12:22:28.541072 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e5d751e1033e9d4aef5824d4c13d38308132b4b6b9a60ec26d78186a278dab7\": container with ID starting with 6e5d751e1033e9d4aef5824d4c13d38308132b4b6b9a60ec26d78186a278dab7 not found: ID does not exist" containerID="6e5d751e1033e9d4aef5824d4c13d38308132b4b6b9a60ec26d78186a278dab7" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.541190 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e5d751e1033e9d4aef5824d4c13d38308132b4b6b9a60ec26d78186a278dab7"} err="failed to get container status \"6e5d751e1033e9d4aef5824d4c13d38308132b4b6b9a60ec26d78186a278dab7\": rpc error: code = NotFound desc = could not find container \"6e5d751e1033e9d4aef5824d4c13d38308132b4b6b9a60ec26d78186a278dab7\": container with ID starting with 6e5d751e1033e9d4aef5824d4c13d38308132b4b6b9a60ec26d78186a278dab7 not found: ID does not exist" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.541237 4816 scope.go:117] "RemoveContainer" containerID="8a2953b83fad75911a9aa3b9b53086764c650fc4022cbafe1b2e60fde2fe5be7" Mar 11 12:22:28 crc kubenswrapper[4816]: E0311 12:22:28.545626 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a2953b83fad75911a9aa3b9b53086764c650fc4022cbafe1b2e60fde2fe5be7\": container with ID starting 
with 8a2953b83fad75911a9aa3b9b53086764c650fc4022cbafe1b2e60fde2fe5be7 not found: ID does not exist" containerID="8a2953b83fad75911a9aa3b9b53086764c650fc4022cbafe1b2e60fde2fe5be7" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.545688 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a2953b83fad75911a9aa3b9b53086764c650fc4022cbafe1b2e60fde2fe5be7"} err="failed to get container status \"8a2953b83fad75911a9aa3b9b53086764c650fc4022cbafe1b2e60fde2fe5be7\": rpc error: code = NotFound desc = could not find container \"8a2953b83fad75911a9aa3b9b53086764c650fc4022cbafe1b2e60fde2fe5be7\": container with ID starting with 8a2953b83fad75911a9aa3b9b53086764c650fc4022cbafe1b2e60fde2fe5be7 not found: ID does not exist" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.545732 4816 scope.go:117] "RemoveContainer" containerID="c63ed4d8962eaade5fdd56e19833812eb68982f5e9c4239e8a03e5077a42a492" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.546798 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/26aea2df-f497-478d-b953-060189ef2569-pod-info" (OuterVolumeSpecName: "pod-info") pod "26aea2df-f497-478d-b953-060189ef2569" (UID: "26aea2df-f497-478d-b953-060189ef2569"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.554386 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.558918 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26aea2df-f497-478d-b953-060189ef2569-kube-api-access-dv8dl" (OuterVolumeSpecName: "kube-api-access-dv8dl") pod "26aea2df-f497-478d-b953-060189ef2569" (UID: "26aea2df-f497-478d-b953-060189ef2569"). InnerVolumeSpecName "kube-api-access-dv8dl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.559516 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-config-data" (OuterVolumeSpecName: "config-data") pod "26aea2df-f497-478d-b953-060189ef2569" (UID: "26aea2df-f497-478d-b953-060189ef2569"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.578972 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.579478 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26aea2df-f497-478d-b953-060189ef2569-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "26aea2df-f497-478d-b953-060189ef2569" (UID: "26aea2df-f497-478d-b953-060189ef2569"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.585888 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26aea2df-f497-478d-b953-060189ef2569-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "26aea2df-f497-478d-b953-060189ef2569" (UID: "26aea2df-f497-478d-b953-060189ef2569"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.601583 4816 scope.go:117] "RemoveContainer" containerID="933bdb7df24ef397f527e8ac441de2b3a2e82c07c8ab31ea86b61c45f7139f03" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.602361 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-server-conf" (OuterVolumeSpecName: "server-conf") pod "26aea2df-f497-478d-b953-060189ef2569" (UID: "26aea2df-f497-478d-b953-060189ef2569"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.637822 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.638274 4816 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/26aea2df-f497-478d-b953-060189ef2569-pod-info\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.638330 4816 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.638344 4816 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-server-conf\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.638435 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dv8dl\" (UniqueName: \"kubernetes.io/projected/26aea2df-f497-478d-b953-060189ef2569-kube-api-access-dv8dl\") on node \"crc\" DevicePath \"\"" Mar 11 
12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.638459 4816 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.638528 4816 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/26aea2df-f497-478d-b953-060189ef2569-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.638542 4816 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/26aea2df-f497-478d-b953-060189ef2569-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.638594 4816 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/26aea2df-f497-478d-b953-060189ef2569-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.692561 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26aea2df-f497-478d-b953-060189ef2569-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "26aea2df-f497-478d-b953-060189ef2569" (UID: "26aea2df-f497-478d-b953-060189ef2569"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.693681 4816 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.718588 4816 scope.go:117] "RemoveContainer" containerID="c63ed4d8962eaade5fdd56e19833812eb68982f5e9c4239e8a03e5077a42a492" Mar 11 12:22:28 crc kubenswrapper[4816]: E0311 12:22:28.719428 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c63ed4d8962eaade5fdd56e19833812eb68982f5e9c4239e8a03e5077a42a492\": container with ID starting with c63ed4d8962eaade5fdd56e19833812eb68982f5e9c4239e8a03e5077a42a492 not found: ID does not exist" containerID="c63ed4d8962eaade5fdd56e19833812eb68982f5e9c4239e8a03e5077a42a492" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.719498 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c63ed4d8962eaade5fdd56e19833812eb68982f5e9c4239e8a03e5077a42a492"} err="failed to get container status \"c63ed4d8962eaade5fdd56e19833812eb68982f5e9c4239e8a03e5077a42a492\": rpc error: code = NotFound desc = could not find container \"c63ed4d8962eaade5fdd56e19833812eb68982f5e9c4239e8a03e5077a42a492\": container with ID starting with c63ed4d8962eaade5fdd56e19833812eb68982f5e9c4239e8a03e5077a42a492 not found: ID does not exist" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.719566 4816 scope.go:117] "RemoveContainer" containerID="933bdb7df24ef397f527e8ac441de2b3a2e82c07c8ab31ea86b61c45f7139f03" Mar 11 12:22:28 crc kubenswrapper[4816]: E0311 12:22:28.719888 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"933bdb7df24ef397f527e8ac441de2b3a2e82c07c8ab31ea86b61c45f7139f03\": container with ID starting with 
933bdb7df24ef397f527e8ac441de2b3a2e82c07c8ab31ea86b61c45f7139f03 not found: ID does not exist" containerID="933bdb7df24ef397f527e8ac441de2b3a2e82c07c8ab31ea86b61c45f7139f03" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.719916 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"933bdb7df24ef397f527e8ac441de2b3a2e82c07c8ab31ea86b61c45f7139f03"} err="failed to get container status \"933bdb7df24ef397f527e8ac441de2b3a2e82c07c8ab31ea86b61c45f7139f03\": rpc error: code = NotFound desc = could not find container \"933bdb7df24ef397f527e8ac441de2b3a2e82c07c8ab31ea86b61c45f7139f03\": container with ID starting with 933bdb7df24ef397f527e8ac441de2b3a2e82c07c8ab31ea86b61c45f7139f03 not found: ID does not exist" Mar 11 12:22:28 crc kubenswrapper[4816]: E0311 12:22:28.741981 4816 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 11 12:22:28 crc kubenswrapper[4816]: E0311 12:22:28.742111 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-config-data podName:3779c0f5-9084-4c07-83d9-fe2017559f7b nodeName:}" failed. No retries permitted until 2026-03-11 12:22:36.742081389 +0000 UTC m=+1443.333345386 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-config-data") pod "rabbitmq-cell1-server-0" (UID: "3779c0f5-9084-4c07-83d9-fe2017559f7b") : configmap "rabbitmq-cell1-config-data" not found Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.743679 4816 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/26aea2df-f497-478d-b953-060189ef2569-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.743710 4816 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.944562 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.052903 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-combined-ca-bundle\") pod \"9c180505-72c6-498d-bfa5-05f689692bd2\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.052973 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-fernet-keys\") pod \"9c180505-72c6-498d-bfa5-05f689692bd2\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.053033 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-public-tls-certs\") pod \"9c180505-72c6-498d-bfa5-05f689692bd2\" (UID: 
\"9c180505-72c6-498d-bfa5-05f689692bd2\") " Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.053080 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-config-data\") pod \"9c180505-72c6-498d-bfa5-05f689692bd2\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.053143 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-credential-keys\") pod \"9c180505-72c6-498d-bfa5-05f689692bd2\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.053172 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-internal-tls-certs\") pod \"9c180505-72c6-498d-bfa5-05f689692bd2\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.053269 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zc8j\" (UniqueName: \"kubernetes.io/projected/9c180505-72c6-498d-bfa5-05f689692bd2-kube-api-access-6zc8j\") pod \"9c180505-72c6-498d-bfa5-05f689692bd2\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.053339 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-scripts\") pod \"9c180505-72c6-498d-bfa5-05f689692bd2\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.069113 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/9c180505-72c6-498d-bfa5-05f689692bd2-kube-api-access-6zc8j" (OuterVolumeSpecName: "kube-api-access-6zc8j") pod "9c180505-72c6-498d-bfa5-05f689692bd2" (UID: "9c180505-72c6-498d-bfa5-05f689692bd2"). InnerVolumeSpecName "kube-api-access-6zc8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.078366 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-scripts" (OuterVolumeSpecName: "scripts") pod "9c180505-72c6-498d-bfa5-05f689692bd2" (UID: "9c180505-72c6-498d-bfa5-05f689692bd2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.079026 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "9c180505-72c6-498d-bfa5-05f689692bd2" (UID: "9c180505-72c6-498d-bfa5-05f689692bd2"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.079737 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "9c180505-72c6-498d-bfa5-05f689692bd2" (UID: "9c180505-72c6-498d-bfa5-05f689692bd2"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.105187 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-config-data" (OuterVolumeSpecName: "config-data") pod "9c180505-72c6-498d-bfa5-05f689692bd2" (UID: "9c180505-72c6-498d-bfa5-05f689692bd2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.117804 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.126519 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9c180505-72c6-498d-bfa5-05f689692bd2" (UID: "9c180505-72c6-498d-bfa5-05f689692bd2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.156721 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zc8j\" (UniqueName: \"kubernetes.io/projected/9c180505-72c6-498d-bfa5-05f689692bd2-kube-api-access-6zc8j\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.156762 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.156773 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.156783 4816 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.156792 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-config-data\") on node \"crc\" 
DevicePath \"\"" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.156803 4816 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.175093 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9c180505-72c6-498d-bfa5-05f689692bd2" (UID: "9c180505-72c6-498d-bfa5-05f689692bd2"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.185503 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9c180505-72c6-498d-bfa5-05f689692bd2" (UID: "9c180505-72c6-498d-bfa5-05f689692bd2"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.242944 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.242955 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"26aea2df-f497-478d-b953-060189ef2569","Type":"ContainerDied","Data":"bd5f7144adb25f2d3d74b32cee4ef0069fc612e5f70830fc738cf8898c918056"} Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.243069 4816 scope.go:117] "RemoveContainer" containerID="0735cf7e4268f5297289dcfc433ce805028b2098230211ba63ceb121fac25ec7" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.257583 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3779c0f5-9084-4c07-83d9-fe2017559f7b-rabbitmq-confd\") pod \"3779c0f5-9084-4c07-83d9-fe2017559f7b\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.257637 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3779c0f5-9084-4c07-83d9-fe2017559f7b-pod-info\") pod \"3779c0f5-9084-4c07-83d9-fe2017559f7b\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.257659 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvr95\" (UniqueName: \"kubernetes.io/projected/3779c0f5-9084-4c07-83d9-fe2017559f7b-kube-api-access-mvr95\") pod \"3779c0f5-9084-4c07-83d9-fe2017559f7b\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.257702 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3779c0f5-9084-4c07-83d9-fe2017559f7b-rabbitmq-tls\") pod \"3779c0f5-9084-4c07-83d9-fe2017559f7b\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " Mar 11 12:22:29 crc kubenswrapper[4816]: 
I0311 12:22:29.257722 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"3779c0f5-9084-4c07-83d9-fe2017559f7b\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.257751 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3779c0f5-9084-4c07-83d9-fe2017559f7b-rabbitmq-erlang-cookie\") pod \"3779c0f5-9084-4c07-83d9-fe2017559f7b\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.257809 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-plugins-conf\") pod \"3779c0f5-9084-4c07-83d9-fe2017559f7b\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.257854 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-server-conf\") pod \"3779c0f5-9084-4c07-83d9-fe2017559f7b\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.257901 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3779c0f5-9084-4c07-83d9-fe2017559f7b-rabbitmq-plugins\") pod \"3779c0f5-9084-4c07-83d9-fe2017559f7b\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.257930 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-config-data\") pod 
\"3779c0f5-9084-4c07-83d9-fe2017559f7b\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.257965 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3779c0f5-9084-4c07-83d9-fe2017559f7b-erlang-cookie-secret\") pod \"3779c0f5-9084-4c07-83d9-fe2017559f7b\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.258359 4816 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.258377 4816 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.258960 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3779c0f5-9084-4c07-83d9-fe2017559f7b-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "3779c0f5-9084-4c07-83d9-fe2017559f7b" (UID: "3779c0f5-9084-4c07-83d9-fe2017559f7b"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.260851 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3779c0f5-9084-4c07-83d9-fe2017559f7b-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "3779c0f5-9084-4c07-83d9-fe2017559f7b" (UID: "3779c0f5-9084-4c07-83d9-fe2017559f7b"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.260905 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "3779c0f5-9084-4c07-83d9-fe2017559f7b" (UID: "3779c0f5-9084-4c07-83d9-fe2017559f7b"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.266353 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3779c0f5-9084-4c07-83d9-fe2017559f7b-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "3779c0f5-9084-4c07-83d9-fe2017559f7b" (UID: "3779c0f5-9084-4c07-83d9-fe2017559f7b"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.267056 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3779c0f5-9084-4c07-83d9-fe2017559f7b-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "3779c0f5-9084-4c07-83d9-fe2017559f7b" (UID: "3779c0f5-9084-4c07-83d9-fe2017559f7b"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.267157 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/3779c0f5-9084-4c07-83d9-fe2017559f7b-pod-info" (OuterVolumeSpecName: "pod-info") pod "3779c0f5-9084-4c07-83d9-fe2017559f7b" (UID: "3779c0f5-9084-4c07-83d9-fe2017559f7b"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.268182 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "3779c0f5-9084-4c07-83d9-fe2017559f7b" (UID: "3779c0f5-9084-4c07-83d9-fe2017559f7b"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.270993 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3779c0f5-9084-4c07-83d9-fe2017559f7b-kube-api-access-mvr95" (OuterVolumeSpecName: "kube-api-access-mvr95") pod "3779c0f5-9084-4c07-83d9-fe2017559f7b" (UID: "3779c0f5-9084-4c07-83d9-fe2017559f7b"). InnerVolumeSpecName "kube-api-access-mvr95". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.273572 4816 generic.go:334] "Generic (PLEG): container finished" podID="41f4b502-b85f-488c-b55b-27a31479df68" containerID="60b94a07b73cb13c7f413f3784714ffd08edfbf819bae0acb651dd949e911744" exitCode=0 Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.273675 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"41f4b502-b85f-488c-b55b-27a31479df68","Type":"ContainerDied","Data":"60b94a07b73cb13c7f413f3784714ffd08edfbf819bae0acb651dd949e911744"} Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.275189 4816 generic.go:334] "Generic (PLEG): container finished" podID="9c180505-72c6-498d-bfa5-05f689692bd2" containerID="40d439392c989a322c37ef2903e2b84825cbedf2d8b6499b35bfc3bb665a65b8" exitCode=0 Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.275248 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5d6ddcd789-qjf9c" 
event={"ID":"9c180505-72c6-498d-bfa5-05f689692bd2","Type":"ContainerDied","Data":"40d439392c989a322c37ef2903e2b84825cbedf2d8b6499b35bfc3bb665a65b8"} Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.275290 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5d6ddcd789-qjf9c" event={"ID":"9c180505-72c6-498d-bfa5-05f689692bd2","Type":"ContainerDied","Data":"210d5da4467eeb407cc3db147ba87bbb3dfcf68d3ca56b768383a1d9ec2cdc8a"} Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.275371 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.280465 4816 generic.go:334] "Generic (PLEG): container finished" podID="3779c0f5-9084-4c07-83d9-fe2017559f7b" containerID="18565c99cadc85b3c1924a92e447c85ed3ed29fe96a7b6c6961caaecc2e1cf9f" exitCode=0 Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.280614 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3779c0f5-9084-4c07-83d9-fe2017559f7b","Type":"ContainerDied","Data":"18565c99cadc85b3c1924a92e447c85ed3ed29fe96a7b6c6961caaecc2e1cf9f"} Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.280661 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3779c0f5-9084-4c07-83d9-fe2017559f7b","Type":"ContainerDied","Data":"c73f7e4d7f0f4588b80903c0c3810420cc3aeed26ba2c6224b092ad58bda611c"} Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.280734 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.283901 4816 generic.go:334] "Generic (PLEG): container finished" podID="f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc" containerID="4e6b0cc9909a80ea9f6820967069b2707e5bf48017858f5840e01461de16f0c2" exitCode=0 Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.283934 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc","Type":"ContainerDied","Data":"4e6b0cc9909a80ea9f6820967069b2707e5bf48017858f5840e01461de16f0c2"} Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.317814 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.318933 4816 scope.go:117] "RemoveContainer" containerID="47287b2bd213321105c729d451b069f02c0e309af3b5c9c84b7b9c24acc1a5f3" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.321369 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.325525 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-config-data" (OuterVolumeSpecName: "config-data") pod "3779c0f5-9084-4c07-83d9-fe2017559f7b" (UID: "3779c0f5-9084-4c07-83d9-fe2017559f7b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:29 crc kubenswrapper[4816]: E0311 12:22:29.326997 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c is running failed: container process not found" containerID="e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 12:22:29 crc kubenswrapper[4816]: E0311 12:22:29.333923 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 12:22:29 crc kubenswrapper[4816]: E0311 12:22:29.334113 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c is running failed: container process not found" containerID="e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 12:22:29 crc kubenswrapper[4816]: E0311 12:22:29.345436 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 12:22:29 crc kubenswrapper[4816]: E0311 12:22:29.345737 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created 
or running: checking if PID of e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c is running failed: container process not found" containerID="e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 12:22:29 crc kubenswrapper[4816]: E0311 12:22:29.345766 4816 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tnhfq" podUID="edc01aa4-013d-4d10-9f22-e5f319e6c1a3" containerName="ovsdb-server" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.350553 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-server-conf" (OuterVolumeSpecName: "server-conf") pod "3779c0f5-9084-4c07-83d9-fe2017559f7b" (UID: "3779c0f5-9084-4c07-83d9-fe2017559f7b"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.359685 4816 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3779c0f5-9084-4c07-83d9-fe2017559f7b-pod-info\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.359872 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvr95\" (UniqueName: \"kubernetes.io/projected/3779c0f5-9084-4c07-83d9-fe2017559f7b-kube-api-access-mvr95\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.359950 4816 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3779c0f5-9084-4c07-83d9-fe2017559f7b-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.360065 4816 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.360194 4816 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3779c0f5-9084-4c07-83d9-fe2017559f7b-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.360291 4816 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.360379 4816 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-server-conf\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.360451 4816 
reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3779c0f5-9084-4c07-83d9-fe2017559f7b-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.360521 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.360601 4816 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3779c0f5-9084-4c07-83d9-fe2017559f7b-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.370977 4816 scope.go:117] "RemoveContainer" containerID="40d439392c989a322c37ef2903e2b84825cbedf2d8b6499b35bfc3bb665a65b8" Mar 11 12:22:29 crc kubenswrapper[4816]: E0311 12:22:29.371132 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 12:22:29 crc kubenswrapper[4816]: E0311 12:22:29.371181 4816 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-tnhfq" podUID="edc01aa4-013d-4d10-9f22-e5f319e6c1a3" containerName="ovs-vswitchd" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.378214 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5d6ddcd789-qjf9c"] Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.381573 4816 operation_generator.go:917] UnmountDevice succeeded for volume 
"local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.393397 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-5d6ddcd789-qjf9c"] Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.396936 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3779c0f5-9084-4c07-83d9-fe2017559f7b-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "3779c0f5-9084-4c07-83d9-fe2017559f7b" (UID: "3779c0f5-9084-4c07-83d9-fe2017559f7b"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.437773 4816 scope.go:117] "RemoveContainer" containerID="40d439392c989a322c37ef2903e2b84825cbedf2d8b6499b35bfc3bb665a65b8" Mar 11 12:22:29 crc kubenswrapper[4816]: E0311 12:22:29.439149 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40d439392c989a322c37ef2903e2b84825cbedf2d8b6499b35bfc3bb665a65b8\": container with ID starting with 40d439392c989a322c37ef2903e2b84825cbedf2d8b6499b35bfc3bb665a65b8 not found: ID does not exist" containerID="40d439392c989a322c37ef2903e2b84825cbedf2d8b6499b35bfc3bb665a65b8" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.439457 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40d439392c989a322c37ef2903e2b84825cbedf2d8b6499b35bfc3bb665a65b8"} err="failed to get container status \"40d439392c989a322c37ef2903e2b84825cbedf2d8b6499b35bfc3bb665a65b8\": rpc error: code = NotFound desc = could not find container \"40d439392c989a322c37ef2903e2b84825cbedf2d8b6499b35bfc3bb665a65b8\": container with ID starting with 40d439392c989a322c37ef2903e2b84825cbedf2d8b6499b35bfc3bb665a65b8 not found: ID does not exist" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 
12:22:29.439491 4816 scope.go:117] "RemoveContainer" containerID="18565c99cadc85b3c1924a92e447c85ed3ed29fe96a7b6c6961caaecc2e1cf9f" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.463043 4816 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3779c0f5-9084-4c07-83d9-fe2017559f7b-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.463096 4816 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.465374 4816 scope.go:117] "RemoveContainer" containerID="522cea9d64bd20f40ebb73c1f30df7c2a7a511a9ee7536ce5452bc061096e21e" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.487977 4816 scope.go:117] "RemoveContainer" containerID="18565c99cadc85b3c1924a92e447c85ed3ed29fe96a7b6c6961caaecc2e1cf9f" Mar 11 12:22:29 crc kubenswrapper[4816]: E0311 12:22:29.488729 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18565c99cadc85b3c1924a92e447c85ed3ed29fe96a7b6c6961caaecc2e1cf9f\": container with ID starting with 18565c99cadc85b3c1924a92e447c85ed3ed29fe96a7b6c6961caaecc2e1cf9f not found: ID does not exist" containerID="18565c99cadc85b3c1924a92e447c85ed3ed29fe96a7b6c6961caaecc2e1cf9f" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.488766 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18565c99cadc85b3c1924a92e447c85ed3ed29fe96a7b6c6961caaecc2e1cf9f"} err="failed to get container status \"18565c99cadc85b3c1924a92e447c85ed3ed29fe96a7b6c6961caaecc2e1cf9f\": rpc error: code = NotFound desc = could not find container \"18565c99cadc85b3c1924a92e447c85ed3ed29fe96a7b6c6961caaecc2e1cf9f\": container with ID starting with 
18565c99cadc85b3c1924a92e447c85ed3ed29fe96a7b6c6961caaecc2e1cf9f not found: ID does not exist" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.488802 4816 scope.go:117] "RemoveContainer" containerID="522cea9d64bd20f40ebb73c1f30df7c2a7a511a9ee7536ce5452bc061096e21e" Mar 11 12:22:29 crc kubenswrapper[4816]: E0311 12:22:29.489165 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"522cea9d64bd20f40ebb73c1f30df7c2a7a511a9ee7536ce5452bc061096e21e\": container with ID starting with 522cea9d64bd20f40ebb73c1f30df7c2a7a511a9ee7536ce5452bc061096e21e not found: ID does not exist" containerID="522cea9d64bd20f40ebb73c1f30df7c2a7a511a9ee7536ce5452bc061096e21e" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.489190 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"522cea9d64bd20f40ebb73c1f30df7c2a7a511a9ee7536ce5452bc061096e21e"} err="failed to get container status \"522cea9d64bd20f40ebb73c1f30df7c2a7a511a9ee7536ce5452bc061096e21e\": rpc error: code = NotFound desc = could not find container \"522cea9d64bd20f40ebb73c1f30df7c2a7a511a9ee7536ce5452bc061096e21e\": container with ID starting with 522cea9d64bd20f40ebb73c1f30df7c2a7a511a9ee7536ce5452bc061096e21e not found: ID does not exist" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.617537 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.624363 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.631975 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.665994 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41f4b502-b85f-488c-b55b-27a31479df68-config-data\") pod \"41f4b502-b85f-488c-b55b-27a31479df68\" (UID: \"41f4b502-b85f-488c-b55b-27a31479df68\") " Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.666468 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvr8x\" (UniqueName: \"kubernetes.io/projected/41f4b502-b85f-488c-b55b-27a31479df68-kube-api-access-cvr8x\") pod \"41f4b502-b85f-488c-b55b-27a31479df68\" (UID: \"41f4b502-b85f-488c-b55b-27a31479df68\") " Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.666499 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41f4b502-b85f-488c-b55b-27a31479df68-combined-ca-bundle\") pod \"41f4b502-b85f-488c-b55b-27a31479df68\" (UID: \"41f4b502-b85f-488c-b55b-27a31479df68\") " Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.669970 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.673232 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41f4b502-b85f-488c-b55b-27a31479df68-kube-api-access-cvr8x" (OuterVolumeSpecName: "kube-api-access-cvr8x") pod "41f4b502-b85f-488c-b55b-27a31479df68" (UID: "41f4b502-b85f-488c-b55b-27a31479df68"). 
InnerVolumeSpecName "kube-api-access-cvr8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.703901 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41f4b502-b85f-488c-b55b-27a31479df68-config-data" (OuterVolumeSpecName: "config-data") pod "41f4b502-b85f-488c-b55b-27a31479df68" (UID: "41f4b502-b85f-488c-b55b-27a31479df68"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.706859 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41f4b502-b85f-488c-b55b-27a31479df68-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "41f4b502-b85f-488c-b55b-27a31479df68" (UID: "41f4b502-b85f-488c-b55b-27a31479df68"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.768015 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc-config-data\") pod \"f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc\" (UID: \"f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc\") " Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.768072 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ml65\" (UniqueName: \"kubernetes.io/projected/f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc-kube-api-access-4ml65\") pod \"f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc\" (UID: \"f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc\") " Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.768148 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc-combined-ca-bundle\") pod \"f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc\" 
(UID: \"f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc\") " Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.768527 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41f4b502-b85f-488c-b55b-27a31479df68-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.768554 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvr8x\" (UniqueName: \"kubernetes.io/projected/41f4b502-b85f-488c-b55b-27a31479df68-kube-api-access-cvr8x\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.768567 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41f4b502-b85f-488c-b55b-27a31479df68-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.771958 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc-kube-api-access-4ml65" (OuterVolumeSpecName: "kube-api-access-4ml65") pod "f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc" (UID: "f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc"). InnerVolumeSpecName "kube-api-access-4ml65". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.794564 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc" (UID: "f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.794878 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc-config-data" (OuterVolumeSpecName: "config-data") pod "f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc" (UID: "f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.870468 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.870516 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ml65\" (UniqueName: \"kubernetes.io/projected/f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc-kube-api-access-4ml65\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.870531 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:30 crc kubenswrapper[4816]: I0311 12:22:30.012788 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="32dcc96b-186a-444d-bef3-4c5f117ee652" containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.0.200:8081/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 11 12:22:30 crc kubenswrapper[4816]: I0311 12:22:30.141156 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26aea2df-f497-478d-b953-060189ef2569" path="/var/lib/kubelet/pods/26aea2df-f497-478d-b953-060189ef2569/volumes" Mar 11 12:22:30 crc 
kubenswrapper[4816]: I0311 12:22:30.142052 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3779c0f5-9084-4c07-83d9-fe2017559f7b" path="/var/lib/kubelet/pods/3779c0f5-9084-4c07-83d9-fe2017559f7b/volumes" Mar 11 12:22:30 crc kubenswrapper[4816]: I0311 12:22:30.143165 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5030028c-f574-4334-a837-2430761524b4" path="/var/lib/kubelet/pods/5030028c-f574-4334-a837-2430761524b4/volumes" Mar 11 12:22:30 crc kubenswrapper[4816]: I0311 12:22:30.143697 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63567eba-cc2a-4168-9e81-51c1daed5482" path="/var/lib/kubelet/pods/63567eba-cc2a-4168-9e81-51c1daed5482/volumes" Mar 11 12:22:30 crc kubenswrapper[4816]: I0311 12:22:30.144276 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c180505-72c6-498d-bfa5-05f689692bd2" path="/var/lib/kubelet/pods/9c180505-72c6-498d-bfa5-05f689692bd2/volumes" Mar 11 12:22:30 crc kubenswrapper[4816]: I0311 12:22:30.145567 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c71feeeb-a44d-42ec-a4c7-ddbf9a76f825" path="/var/lib/kubelet/pods/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825/volumes" Mar 11 12:22:30 crc kubenswrapper[4816]: I0311 12:22:30.293874 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"41f4b502-b85f-488c-b55b-27a31479df68","Type":"ContainerDied","Data":"0b682ef5a1cffec39d115886fe70f340b2f710836dc8fb73d7380c331ca3d440"} Mar 11 12:22:30 crc kubenswrapper[4816]: I0311 12:22:30.293939 4816 scope.go:117] "RemoveContainer" containerID="60b94a07b73cb13c7f413f3784714ffd08edfbf819bae0acb651dd949e911744" Mar 11 12:22:30 crc kubenswrapper[4816]: I0311 12:22:30.293996 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 11 12:22:30 crc kubenswrapper[4816]: I0311 12:22:30.305320 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc","Type":"ContainerDied","Data":"897a415294b966ad7eb32e075c662fce4ade523bc49b487efdfde948eb76f843"} Mar 11 12:22:30 crc kubenswrapper[4816]: I0311 12:22:30.305419 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 11 12:22:30 crc kubenswrapper[4816]: I0311 12:22:30.320936 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 11 12:22:30 crc kubenswrapper[4816]: I0311 12:22:30.323348 4816 scope.go:117] "RemoveContainer" containerID="4e6b0cc9909a80ea9f6820967069b2707e5bf48017858f5840e01461de16f0c2" Mar 11 12:22:30 crc kubenswrapper[4816]: I0311 12:22:30.325946 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 11 12:22:30 crc kubenswrapper[4816]: I0311 12:22:30.337957 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 11 12:22:30 crc kubenswrapper[4816]: I0311 12:22:30.346907 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 11 12:22:30 crc kubenswrapper[4816]: I0311 12:22:30.763415 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-64b59f8d4-2vxd9" podUID="7795071e-2de0-43cb-b225-cfed54570d94" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.166:9311/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 11 12:22:30 crc kubenswrapper[4816]: I0311 12:22:30.763567 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-64b59f8d4-2vxd9" podUID="7795071e-2de0-43cb-b225-cfed54570d94" 
containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.166:9311/healthcheck\": dial tcp 10.217.0.166:9311: i/o timeout" Mar 11 12:22:32 crc kubenswrapper[4816]: I0311 12:22:32.142317 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41f4b502-b85f-488c-b55b-27a31479df68" path="/var/lib/kubelet/pods/41f4b502-b85f-488c-b55b-27a31479df68/volumes" Mar 11 12:22:32 crc kubenswrapper[4816]: I0311 12:22:32.143359 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc" path="/var/lib/kubelet/pods/f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc/volumes" Mar 11 12:22:34 crc kubenswrapper[4816]: E0311 12:22:34.323565 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c is running failed: container process not found" containerID="e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 12:22:34 crc kubenswrapper[4816]: E0311 12:22:34.324027 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c is running failed: container process not found" containerID="e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 12:22:34 crc kubenswrapper[4816]: E0311 12:22:34.324237 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c is running failed: container process not found" 
containerID="e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 12:22:34 crc kubenswrapper[4816]: E0311 12:22:34.324286 4816 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tnhfq" podUID="edc01aa4-013d-4d10-9f22-e5f319e6c1a3" containerName="ovsdb-server" Mar 11 12:22:34 crc kubenswrapper[4816]: E0311 12:22:34.325459 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 12:22:34 crc kubenswrapper[4816]: E0311 12:22:34.326585 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 12:22:34 crc kubenswrapper[4816]: E0311 12:22:34.327536 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 12:22:34 crc kubenswrapper[4816]: E0311 12:22:34.327574 4816 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is 
stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-tnhfq" podUID="edc01aa4-013d-4d10-9f22-e5f319e6c1a3" containerName="ovs-vswitchd" Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.274099 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6867c6dbc5-lzgfd" Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.413186 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-ovndb-tls-certs\") pod \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.413567 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-combined-ca-bundle\") pod \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.413615 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-httpd-config\") pod \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.413641 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-config\") pod \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.413689 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7fr9\" (UniqueName: 
\"kubernetes.io/projected/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-kube-api-access-x7fr9\") pod \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.413834 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-public-tls-certs\") pod \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.413913 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-internal-tls-certs\") pod \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.416440 4816 generic.go:334] "Generic (PLEG): container finished" podID="e6833f8f-2414-42cd-b7c2-4d4a70fd8d46" containerID="fc6e871db4cf3ccf1c16a2df0831b957437d80b5ab1f40dfb74553759defd035" exitCode=0 Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.416489 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6867c6dbc5-lzgfd" event={"ID":"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46","Type":"ContainerDied","Data":"fc6e871db4cf3ccf1c16a2df0831b957437d80b5ab1f40dfb74553759defd035"} Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.416524 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6867c6dbc5-lzgfd" event={"ID":"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46","Type":"ContainerDied","Data":"10129169327e9c40582f9c635a8d87b021f99cc78ac017f7e4f16f40942456bc"} Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.416558 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6867c6dbc5-lzgfd" Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.416586 4816 scope.go:117] "RemoveContainer" containerID="d70e65be881ec74becf9f1d8a8c457e2fd9c5cbaed1d9869af0f09ff05b4fe7d" Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.423659 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-kube-api-access-x7fr9" (OuterVolumeSpecName: "kube-api-access-x7fr9") pod "e6833f8f-2414-42cd-b7c2-4d4a70fd8d46" (UID: "e6833f8f-2414-42cd-b7c2-4d4a70fd8d46"). InnerVolumeSpecName "kube-api-access-x7fr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.436708 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "e6833f8f-2414-42cd-b7c2-4d4a70fd8d46" (UID: "e6833f8f-2414-42cd-b7c2-4d4a70fd8d46"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.459998 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e6833f8f-2414-42cd-b7c2-4d4a70fd8d46" (UID: "e6833f8f-2414-42cd-b7c2-4d4a70fd8d46"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.464657 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e6833f8f-2414-42cd-b7c2-4d4a70fd8d46" (UID: "e6833f8f-2414-42cd-b7c2-4d4a70fd8d46"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.478017 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "e6833f8f-2414-42cd-b7c2-4d4a70fd8d46" (UID: "e6833f8f-2414-42cd-b7c2-4d4a70fd8d46"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.479178 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e6833f8f-2414-42cd-b7c2-4d4a70fd8d46" (UID: "e6833f8f-2414-42cd-b7c2-4d4a70fd8d46"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.482340 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-config" (OuterVolumeSpecName: "config") pod "e6833f8f-2414-42cd-b7c2-4d4a70fd8d46" (UID: "e6833f8f-2414-42cd-b7c2-4d4a70fd8d46"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.517227 4816 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.517286 4816 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.517296 4816 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.517307 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.517320 4816 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.517331 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.517341 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7fr9\" (UniqueName: \"kubernetes.io/projected/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-kube-api-access-x7fr9\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.540522 4816 
scope.go:117] "RemoveContainer" containerID="fc6e871db4cf3ccf1c16a2df0831b957437d80b5ab1f40dfb74553759defd035" Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.564056 4816 scope.go:117] "RemoveContainer" containerID="d70e65be881ec74becf9f1d8a8c457e2fd9c5cbaed1d9869af0f09ff05b4fe7d" Mar 11 12:22:37 crc kubenswrapper[4816]: E0311 12:22:37.564756 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d70e65be881ec74becf9f1d8a8c457e2fd9c5cbaed1d9869af0f09ff05b4fe7d\": container with ID starting with d70e65be881ec74becf9f1d8a8c457e2fd9c5cbaed1d9869af0f09ff05b4fe7d not found: ID does not exist" containerID="d70e65be881ec74becf9f1d8a8c457e2fd9c5cbaed1d9869af0f09ff05b4fe7d" Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.564808 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d70e65be881ec74becf9f1d8a8c457e2fd9c5cbaed1d9869af0f09ff05b4fe7d"} err="failed to get container status \"d70e65be881ec74becf9f1d8a8c457e2fd9c5cbaed1d9869af0f09ff05b4fe7d\": rpc error: code = NotFound desc = could not find container \"d70e65be881ec74becf9f1d8a8c457e2fd9c5cbaed1d9869af0f09ff05b4fe7d\": container with ID starting with d70e65be881ec74becf9f1d8a8c457e2fd9c5cbaed1d9869af0f09ff05b4fe7d not found: ID does not exist" Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.564844 4816 scope.go:117] "RemoveContainer" containerID="fc6e871db4cf3ccf1c16a2df0831b957437d80b5ab1f40dfb74553759defd035" Mar 11 12:22:37 crc kubenswrapper[4816]: E0311 12:22:37.565397 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc6e871db4cf3ccf1c16a2df0831b957437d80b5ab1f40dfb74553759defd035\": container with ID starting with fc6e871db4cf3ccf1c16a2df0831b957437d80b5ab1f40dfb74553759defd035 not found: ID does not exist" containerID="fc6e871db4cf3ccf1c16a2df0831b957437d80b5ab1f40dfb74553759defd035" Mar 11 
12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.565480 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc6e871db4cf3ccf1c16a2df0831b957437d80b5ab1f40dfb74553759defd035"} err="failed to get container status \"fc6e871db4cf3ccf1c16a2df0831b957437d80b5ab1f40dfb74553759defd035\": rpc error: code = NotFound desc = could not find container \"fc6e871db4cf3ccf1c16a2df0831b957437d80b5ab1f40dfb74553759defd035\": container with ID starting with fc6e871db4cf3ccf1c16a2df0831b957437d80b5ab1f40dfb74553759defd035 not found: ID does not exist" Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.747680 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6867c6dbc5-lzgfd"] Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.757126 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6867c6dbc5-lzgfd"] Mar 11 12:22:38 crc kubenswrapper[4816]: I0311 12:22:38.146368 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6833f8f-2414-42cd-b7c2-4d4a70fd8d46" path="/var/lib/kubelet/pods/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46/volumes" Mar 11 12:22:39 crc kubenswrapper[4816]: E0311 12:22:39.324956 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c is running failed: container process not found" containerID="e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 12:22:39 crc kubenswrapper[4816]: E0311 12:22:39.325863 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c is running failed: container process not found" 
containerID="e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 12:22:39 crc kubenswrapper[4816]: E0311 12:22:39.326002 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 12:22:39 crc kubenswrapper[4816]: E0311 12:22:39.326080 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c is running failed: container process not found" containerID="e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 12:22:39 crc kubenswrapper[4816]: E0311 12:22:39.326105 4816 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tnhfq" podUID="edc01aa4-013d-4d10-9f22-e5f319e6c1a3" containerName="ovsdb-server" Mar 11 12:22:39 crc kubenswrapper[4816]: E0311 12:22:39.337329 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 12:22:39 crc kubenswrapper[4816]: E0311 12:22:39.340094 4816 log.go:32] "ExecSync cmd from runtime service failed" 
err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 12:22:39 crc kubenswrapper[4816]: E0311 12:22:39.340142 4816 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-tnhfq" podUID="edc01aa4-013d-4d10-9f22-e5f319e6c1a3" containerName="ovs-vswitchd" Mar 11 12:22:44 crc kubenswrapper[4816]: E0311 12:22:44.324200 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c is running failed: container process not found" containerID="e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 12:22:44 crc kubenswrapper[4816]: E0311 12:22:44.325495 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c is running failed: container process not found" containerID="e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 12:22:44 crc kubenswrapper[4816]: E0311 12:22:44.326162 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c is running failed: container process not found" 
containerID="e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 12:22:44 crc kubenswrapper[4816]: E0311 12:22:44.326241 4816 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tnhfq" podUID="edc01aa4-013d-4d10-9f22-e5f319e6c1a3" containerName="ovsdb-server" Mar 11 12:22:44 crc kubenswrapper[4816]: E0311 12:22:44.326512 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 12:22:44 crc kubenswrapper[4816]: E0311 12:22:44.328910 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 12:22:44 crc kubenswrapper[4816]: E0311 12:22:44.331578 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 12:22:44 crc kubenswrapper[4816]: E0311 12:22:44.331649 4816 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is 
stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-tnhfq" podUID="edc01aa4-013d-4d10-9f22-e5f319e6c1a3" containerName="ovs-vswitchd" Mar 11 12:22:44 crc kubenswrapper[4816]: I0311 12:22:44.419798 4816 scope.go:117] "RemoveContainer" containerID="5d6df61e0b509a66b3346da65b74fba3a74851e8e005a57c5d0fba5a7957a438" Mar 11 12:22:49 crc kubenswrapper[4816]: E0311 12:22:49.323060 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c is running failed: container process not found" containerID="e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 12:22:49 crc kubenswrapper[4816]: E0311 12:22:49.324407 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 12:22:49 crc kubenswrapper[4816]: E0311 12:22:49.324407 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c is running failed: container process not found" containerID="e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 12:22:49 crc kubenswrapper[4816]: E0311 12:22:49.324974 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c 
is running failed: container process not found" containerID="e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 12:22:49 crc kubenswrapper[4816]: E0311 12:22:49.325038 4816 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tnhfq" podUID="edc01aa4-013d-4d10-9f22-e5f319e6c1a3" containerName="ovsdb-server" Mar 11 12:22:49 crc kubenswrapper[4816]: E0311 12:22:49.326693 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 12:22:49 crc kubenswrapper[4816]: E0311 12:22:49.328224 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 12:22:49 crc kubenswrapper[4816]: E0311 12:22:49.328404 4816 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-tnhfq" podUID="edc01aa4-013d-4d10-9f22-e5f319e6c1a3" containerName="ovs-vswitchd" Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.524648 4816 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-tnhfq_edc01aa4-013d-4d10-9f22-e5f319e6c1a3/ovs-vswitchd/0.log" Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.526508 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-tnhfq" Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.566773 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-var-log\") pod \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\" (UID: \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\") " Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.567231 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7685\" (UniqueName: \"kubernetes.io/projected/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-kube-api-access-z7685\") pod \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\" (UID: \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\") " Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.566897 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-var-log" (OuterVolumeSpecName: "var-log") pod "edc01aa4-013d-4d10-9f22-e5f319e6c1a3" (UID: "edc01aa4-013d-4d10-9f22-e5f319e6c1a3"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.567344 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-var-lib\") pod \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\" (UID: \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\") " Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.567406 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-scripts\") pod \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\" (UID: \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\") " Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.567442 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-etc-ovs\") pod \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\" (UID: \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\") " Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.567513 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-var-run\") pod \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\" (UID: \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\") " Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.567950 4816 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-var-log\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.568007 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-var-run" (OuterVolumeSpecName: "var-run") pod "edc01aa4-013d-4d10-9f22-e5f319e6c1a3" (UID: 
"edc01aa4-013d-4d10-9f22-e5f319e6c1a3"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.568048 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "edc01aa4-013d-4d10-9f22-e5f319e6c1a3" (UID: "edc01aa4-013d-4d10-9f22-e5f319e6c1a3"). InnerVolumeSpecName "etc-ovs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.568082 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-var-lib" (OuterVolumeSpecName: "var-lib") pod "edc01aa4-013d-4d10-9f22-e5f319e6c1a3" (UID: "edc01aa4-013d-4d10-9f22-e5f319e6c1a3"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.568523 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-scripts" (OuterVolumeSpecName: "scripts") pod "edc01aa4-013d-4d10-9f22-e5f319e6c1a3" (UID: "edc01aa4-013d-4d10-9f22-e5f319e6c1a3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.575917 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-kube-api-access-z7685" (OuterVolumeSpecName: "kube-api-access-z7685") pod "edc01aa4-013d-4d10-9f22-e5f319e6c1a3" (UID: "edc01aa4-013d-4d10-9f22-e5f319e6c1a3"). InnerVolumeSpecName "kube-api-access-z7685". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.585549 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tnhfq_edc01aa4-013d-4d10-9f22-e5f319e6c1a3/ovs-vswitchd/0.log" Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.586707 4816 generic.go:334] "Generic (PLEG): container finished" podID="edc01aa4-013d-4d10-9f22-e5f319e6c1a3" containerID="9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005" exitCode=137 Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.586766 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tnhfq" event={"ID":"edc01aa4-013d-4d10-9f22-e5f319e6c1a3","Type":"ContainerDied","Data":"9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005"} Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.586806 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tnhfq" event={"ID":"edc01aa4-013d-4d10-9f22-e5f319e6c1a3","Type":"ContainerDied","Data":"22c727583d6de2eec899c37134713c754f06d9d2f697ad226095e328238d230b"} Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.586833 4816 scope.go:117] "RemoveContainer" containerID="9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005" Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.587036 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-tnhfq" Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.623791 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-tnhfq"] Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.626738 4816 scope.go:117] "RemoveContainer" containerID="e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c" Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.629663 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-tnhfq"] Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.646046 4816 scope.go:117] "RemoveContainer" containerID="ade1b2e704074d979ee946ca1a74e500865d981040caad349410a4164bba988b" Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.669858 4816 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-var-lib\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.669900 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.669912 4816 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-etc-ovs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.669926 4816 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-var-run\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.669938 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7685\" (UniqueName: 
\"kubernetes.io/projected/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-kube-api-access-z7685\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.687223 4816 scope.go:117] "RemoveContainer" containerID="9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005" Mar 11 12:22:50 crc kubenswrapper[4816]: E0311 12:22:50.687864 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005\": container with ID starting with 9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005 not found: ID does not exist" containerID="9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005" Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.687931 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005"} err="failed to get container status \"9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005\": rpc error: code = NotFound desc = could not find container \"9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005\": container with ID starting with 9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005 not found: ID does not exist" Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.687961 4816 scope.go:117] "RemoveContainer" containerID="e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c" Mar 11 12:22:50 crc kubenswrapper[4816]: E0311 12:22:50.688440 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c\": container with ID starting with e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c not found: ID does not exist" 
containerID="e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c" Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.688496 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c"} err="failed to get container status \"e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c\": rpc error: code = NotFound desc = could not find container \"e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c\": container with ID starting with e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c not found: ID does not exist" Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.688537 4816 scope.go:117] "RemoveContainer" containerID="ade1b2e704074d979ee946ca1a74e500865d981040caad349410a4164bba988b" Mar 11 12:22:50 crc kubenswrapper[4816]: E0311 12:22:50.688873 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ade1b2e704074d979ee946ca1a74e500865d981040caad349410a4164bba988b\": container with ID starting with ade1b2e704074d979ee946ca1a74e500865d981040caad349410a4164bba988b not found: ID does not exist" containerID="ade1b2e704074d979ee946ca1a74e500865d981040caad349410a4164bba988b" Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.688909 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ade1b2e704074d979ee946ca1a74e500865d981040caad349410a4164bba988b"} err="failed to get container status \"ade1b2e704074d979ee946ca1a74e500865d981040caad349410a4164bba988b\": rpc error: code = NotFound desc = could not find container \"ade1b2e704074d979ee946ca1a74e500865d981040caad349410a4164bba988b\": container with ID starting with ade1b2e704074d979ee946ca1a74e500865d981040caad349410a4164bba988b not found: ID does not exist" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.229777 4816 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.278645 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbb5r\" (UniqueName: \"kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-kube-api-access-rbb5r\") pod \"485f9fbd-e0ca-472d-b97c-87c127253a96\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.278759 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-etc-swift\") pod \"485f9fbd-e0ca-472d-b97c-87c127253a96\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.278799 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/485f9fbd-e0ca-472d-b97c-87c127253a96-cache\") pod \"485f9fbd-e0ca-472d-b97c-87c127253a96\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.278861 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/485f9fbd-e0ca-472d-b97c-87c127253a96-combined-ca-bundle\") pod \"485f9fbd-e0ca-472d-b97c-87c127253a96\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.278887 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"485f9fbd-e0ca-472d-b97c-87c127253a96\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.278940 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: 
\"kubernetes.io/empty-dir/485f9fbd-e0ca-472d-b97c-87c127253a96-lock\") pod \"485f9fbd-e0ca-472d-b97c-87c127253a96\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.279486 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/485f9fbd-e0ca-472d-b97c-87c127253a96-cache" (OuterVolumeSpecName: "cache") pod "485f9fbd-e0ca-472d-b97c-87c127253a96" (UID: "485f9fbd-e0ca-472d-b97c-87c127253a96"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.279823 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/485f9fbd-e0ca-472d-b97c-87c127253a96-lock" (OuterVolumeSpecName: "lock") pod "485f9fbd-e0ca-472d-b97c-87c127253a96" (UID: "485f9fbd-e0ca-472d-b97c-87c127253a96"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.283739 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "485f9fbd-e0ca-472d-b97c-87c127253a96" (UID: "485f9fbd-e0ca-472d-b97c-87c127253a96"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.285400 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "swift") pod "485f9fbd-e0ca-472d-b97c-87c127253a96" (UID: "485f9fbd-e0ca-472d-b97c-87c127253a96"). InnerVolumeSpecName "local-storage12-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.285609 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-kube-api-access-rbb5r" (OuterVolumeSpecName: "kube-api-access-rbb5r") pod "485f9fbd-e0ca-472d-b97c-87c127253a96" (UID: "485f9fbd-e0ca-472d-b97c-87c127253a96"). InnerVolumeSpecName "kube-api-access-rbb5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.311432 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.380662 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/594ad696-b727-4153-979f-d32ccdc1fe83-etc-machine-id\") pod \"594ad696-b727-4153-979f-d32ccdc1fe83\" (UID: \"594ad696-b727-4153-979f-d32ccdc1fe83\") " Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.380772 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/594ad696-b727-4153-979f-d32ccdc1fe83-config-data-custom\") pod \"594ad696-b727-4153-979f-d32ccdc1fe83\" (UID: \"594ad696-b727-4153-979f-d32ccdc1fe83\") " Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.380820 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/594ad696-b727-4153-979f-d32ccdc1fe83-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "594ad696-b727-4153-979f-d32ccdc1fe83" (UID: "594ad696-b727-4153-979f-d32ccdc1fe83"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.380934 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/594ad696-b727-4153-979f-d32ccdc1fe83-config-data\") pod \"594ad696-b727-4153-979f-d32ccdc1fe83\" (UID: \"594ad696-b727-4153-979f-d32ccdc1fe83\") " Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.381064 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594ad696-b727-4153-979f-d32ccdc1fe83-combined-ca-bundle\") pod \"594ad696-b727-4153-979f-d32ccdc1fe83\" (UID: \"594ad696-b727-4153-979f-d32ccdc1fe83\") " Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.381100 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/594ad696-b727-4153-979f-d32ccdc1fe83-scripts\") pod \"594ad696-b727-4153-979f-d32ccdc1fe83\" (UID: \"594ad696-b727-4153-979f-d32ccdc1fe83\") " Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.381143 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmj7z\" (UniqueName: \"kubernetes.io/projected/594ad696-b727-4153-979f-d32ccdc1fe83-kube-api-access-bmj7z\") pod \"594ad696-b727-4153-979f-d32ccdc1fe83\" (UID: \"594ad696-b727-4153-979f-d32ccdc1fe83\") " Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.381453 4816 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/594ad696-b727-4153-979f-d32ccdc1fe83-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.381473 4816 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 11 
12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.381484 4816 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/485f9fbd-e0ca-472d-b97c-87c127253a96-cache\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.381509 4816 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.381523 4816 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/485f9fbd-e0ca-472d-b97c-87c127253a96-lock\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.381537 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbb5r\" (UniqueName: \"kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-kube-api-access-rbb5r\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.383883 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/594ad696-b727-4153-979f-d32ccdc1fe83-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "594ad696-b727-4153-979f-d32ccdc1fe83" (UID: "594ad696-b727-4153-979f-d32ccdc1fe83"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.385823 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/594ad696-b727-4153-979f-d32ccdc1fe83-scripts" (OuterVolumeSpecName: "scripts") pod "594ad696-b727-4153-979f-d32ccdc1fe83" (UID: "594ad696-b727-4153-979f-d32ccdc1fe83"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.386126 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/594ad696-b727-4153-979f-d32ccdc1fe83-kube-api-access-bmj7z" (OuterVolumeSpecName: "kube-api-access-bmj7z") pod "594ad696-b727-4153-979f-d32ccdc1fe83" (UID: "594ad696-b727-4153-979f-d32ccdc1fe83"). InnerVolumeSpecName "kube-api-access-bmj7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.395359 4816 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.412292 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/594ad696-b727-4153-979f-d32ccdc1fe83-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "594ad696-b727-4153-979f-d32ccdc1fe83" (UID: "594ad696-b727-4153-979f-d32ccdc1fe83"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.448012 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/594ad696-b727-4153-979f-d32ccdc1fe83-config-data" (OuterVolumeSpecName: "config-data") pod "594ad696-b727-4153-979f-d32ccdc1fe83" (UID: "594ad696-b727-4153-979f-d32ccdc1fe83"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.482592 4816 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/594ad696-b727-4153-979f-d32ccdc1fe83-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.482995 4816 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.483006 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/594ad696-b727-4153-979f-d32ccdc1fe83-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.483015 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594ad696-b727-4153-979f-d32ccdc1fe83-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.483038 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/594ad696-b727-4153-979f-d32ccdc1fe83-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.483048 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmj7z\" (UniqueName: \"kubernetes.io/projected/594ad696-b727-4153-979f-d32ccdc1fe83-kube-api-access-bmj7z\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.532818 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/485f9fbd-e0ca-472d-b97c-87c127253a96-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "485f9fbd-e0ca-472d-b97c-87c127253a96" (UID: "485f9fbd-e0ca-472d-b97c-87c127253a96"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.585141 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/485f9fbd-e0ca-472d-b97c-87c127253a96-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.599444 4816 generic.go:334] "Generic (PLEG): container finished" podID="594ad696-b727-4153-979f-d32ccdc1fe83" containerID="b969ee005f965c2a4f02537599e354572cbc91b2ebbe38115a382a8ec4f6b2ac" exitCode=137 Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.599487 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"594ad696-b727-4153-979f-d32ccdc1fe83","Type":"ContainerDied","Data":"b969ee005f965c2a4f02537599e354572cbc91b2ebbe38115a382a8ec4f6b2ac"} Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.599522 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.599557 4816 scope.go:117] "RemoveContainer" containerID="4e77bdf5f0e95052069948c26d832a542a6227e380d1cfa3a0483957659bccc8" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.599544 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"594ad696-b727-4153-979f-d32ccdc1fe83","Type":"ContainerDied","Data":"883c96453eeb3dc398341c2c3b80a740484d91dd773b0fcfe0237a4112b6097a"} Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.607384 4816 generic.go:334] "Generic (PLEG): container finished" podID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerID="4ca46893f461e4cae0bfdd754a912325d0a4b5274975f49336f5fe227e8b6f7e" exitCode=137 Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.607463 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerDied","Data":"4ca46893f461e4cae0bfdd754a912325d0a4b5274975f49336f5fe227e8b6f7e"} Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.607495 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerDied","Data":"bc1ccba63ef105a914d68b8eed3c206cfda92b47e6236ce5828d528e3ceb9770"} Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.607592 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.626526 4816 scope.go:117] "RemoveContainer" containerID="b969ee005f965c2a4f02537599e354572cbc91b2ebbe38115a382a8ec4f6b2ac" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.639583 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.649182 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.655848 4816 scope.go:117] "RemoveContainer" containerID="4e77bdf5f0e95052069948c26d832a542a6227e380d1cfa3a0483957659bccc8" Mar 11 12:22:51 crc kubenswrapper[4816]: E0311 12:22:51.656821 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e77bdf5f0e95052069948c26d832a542a6227e380d1cfa3a0483957659bccc8\": container with ID starting with 4e77bdf5f0e95052069948c26d832a542a6227e380d1cfa3a0483957659bccc8 not found: ID does not exist" containerID="4e77bdf5f0e95052069948c26d832a542a6227e380d1cfa3a0483957659bccc8" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.656867 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e77bdf5f0e95052069948c26d832a542a6227e380d1cfa3a0483957659bccc8"} err="failed to get container status \"4e77bdf5f0e95052069948c26d832a542a6227e380d1cfa3a0483957659bccc8\": rpc error: code = NotFound desc = could not find container \"4e77bdf5f0e95052069948c26d832a542a6227e380d1cfa3a0483957659bccc8\": container with ID starting with 4e77bdf5f0e95052069948c26d832a542a6227e380d1cfa3a0483957659bccc8 not found: ID does not exist" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.656900 4816 scope.go:117] "RemoveContainer" containerID="b969ee005f965c2a4f02537599e354572cbc91b2ebbe38115a382a8ec4f6b2ac" Mar 11 12:22:51 crc 
kubenswrapper[4816]: E0311 12:22:51.657131 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b969ee005f965c2a4f02537599e354572cbc91b2ebbe38115a382a8ec4f6b2ac\": container with ID starting with b969ee005f965c2a4f02537599e354572cbc91b2ebbe38115a382a8ec4f6b2ac not found: ID does not exist" containerID="b969ee005f965c2a4f02537599e354572cbc91b2ebbe38115a382a8ec4f6b2ac" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.657162 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b969ee005f965c2a4f02537599e354572cbc91b2ebbe38115a382a8ec4f6b2ac"} err="failed to get container status \"b969ee005f965c2a4f02537599e354572cbc91b2ebbe38115a382a8ec4f6b2ac\": rpc error: code = NotFound desc = could not find container \"b969ee005f965c2a4f02537599e354572cbc91b2ebbe38115a382a8ec4f6b2ac\": container with ID starting with b969ee005f965c2a4f02537599e354572cbc91b2ebbe38115a382a8ec4f6b2ac not found: ID does not exist" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.657186 4816 scope.go:117] "RemoveContainer" containerID="4ca46893f461e4cae0bfdd754a912325d0a4b5274975f49336f5fe227e8b6f7e" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.663212 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.670192 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.676179 4816 scope.go:117] "RemoveContainer" containerID="68827d94971fb4946739508c0c2229d08412fbe98f89ce92ce344232eb5179c2" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.694095 4816 scope.go:117] "RemoveContainer" containerID="cfd9d9bff16dc1b372451554525cfb877c302a88a1df111a3dec64d0abe2d5dd" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.711282 4816 scope.go:117] "RemoveContainer" 
containerID="1a5cccec1988de28bd2809ac4b5b0048b290948debe0553adf7fdb6a721fdf61" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.729211 4816 scope.go:117] "RemoveContainer" containerID="25b01234ff673f68a2a7d9f83db659ac9f58778b1b6460a3b9e17bc11c9e8477" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.749427 4816 scope.go:117] "RemoveContainer" containerID="3b5ce1950c94241c7d8db075f74a9e25d16f22897d67828fc597eed2fd2ba2d4" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.767112 4816 scope.go:117] "RemoveContainer" containerID="fbc40b5edb4819684be613e55b321d899bc5b2698e897cf6eda8f15eae8281db" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.785322 4816 scope.go:117] "RemoveContainer" containerID="bed6744a1fb9636a9fc4ea915948476f2eb984fea2bdb9d698c12e5780346190" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.803367 4816 scope.go:117] "RemoveContainer" containerID="a0947e58e27e62d7256e48c3a5ba6d36f58462add34b2f1281e8c3da0f4574e1" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.821132 4816 scope.go:117] "RemoveContainer" containerID="424b40cca785fdb6cef5ca70bab8c7fb8928ab75e5bb80b8b1faf2c2da22fdaf" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.848005 4816 scope.go:117] "RemoveContainer" containerID="9aa4725cabfa8b52948323edfacbce1db8fbe4349baf7e60df04631c4c07e000" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.872492 4816 scope.go:117] "RemoveContainer" containerID="712d42df455f320b81d9b4c5385e08e78c8fffd9af1f0f1a30be961c52606280" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.890012 4816 scope.go:117] "RemoveContainer" containerID="acad9fd17d268a24643ea62be228693020bbd2da3c63a2bc6d162877b0366898" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.909431 4816 scope.go:117] "RemoveContainer" containerID="d0ccfde3e8badc0e6b92993021ad07fe9ae8e33939c137e6eb3bcf22e04b1ea6" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.928698 4816 scope.go:117] "RemoveContainer" 
containerID="e84af5bcfa14831e3963b52fe73c49f1f89ea652b5b69cd65dfb4008756c4c2d" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.952181 4816 scope.go:117] "RemoveContainer" containerID="4ca46893f461e4cae0bfdd754a912325d0a4b5274975f49336f5fe227e8b6f7e" Mar 11 12:22:51 crc kubenswrapper[4816]: E0311 12:22:51.953076 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ca46893f461e4cae0bfdd754a912325d0a4b5274975f49336f5fe227e8b6f7e\": container with ID starting with 4ca46893f461e4cae0bfdd754a912325d0a4b5274975f49336f5fe227e8b6f7e not found: ID does not exist" containerID="4ca46893f461e4cae0bfdd754a912325d0a4b5274975f49336f5fe227e8b6f7e" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.953144 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ca46893f461e4cae0bfdd754a912325d0a4b5274975f49336f5fe227e8b6f7e"} err="failed to get container status \"4ca46893f461e4cae0bfdd754a912325d0a4b5274975f49336f5fe227e8b6f7e\": rpc error: code = NotFound desc = could not find container \"4ca46893f461e4cae0bfdd754a912325d0a4b5274975f49336f5fe227e8b6f7e\": container with ID starting with 4ca46893f461e4cae0bfdd754a912325d0a4b5274975f49336f5fe227e8b6f7e not found: ID does not exist" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.953190 4816 scope.go:117] "RemoveContainer" containerID="68827d94971fb4946739508c0c2229d08412fbe98f89ce92ce344232eb5179c2" Mar 11 12:22:51 crc kubenswrapper[4816]: E0311 12:22:51.953579 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68827d94971fb4946739508c0c2229d08412fbe98f89ce92ce344232eb5179c2\": container with ID starting with 68827d94971fb4946739508c0c2229d08412fbe98f89ce92ce344232eb5179c2 not found: ID does not exist" containerID="68827d94971fb4946739508c0c2229d08412fbe98f89ce92ce344232eb5179c2" Mar 11 12:22:51 crc 
kubenswrapper[4816]: I0311 12:22:51.953613 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68827d94971fb4946739508c0c2229d08412fbe98f89ce92ce344232eb5179c2"} err="failed to get container status \"68827d94971fb4946739508c0c2229d08412fbe98f89ce92ce344232eb5179c2\": rpc error: code = NotFound desc = could not find container \"68827d94971fb4946739508c0c2229d08412fbe98f89ce92ce344232eb5179c2\": container with ID starting with 68827d94971fb4946739508c0c2229d08412fbe98f89ce92ce344232eb5179c2 not found: ID does not exist" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.953632 4816 scope.go:117] "RemoveContainer" containerID="cfd9d9bff16dc1b372451554525cfb877c302a88a1df111a3dec64d0abe2d5dd" Mar 11 12:22:51 crc kubenswrapper[4816]: E0311 12:22:51.954354 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfd9d9bff16dc1b372451554525cfb877c302a88a1df111a3dec64d0abe2d5dd\": container with ID starting with cfd9d9bff16dc1b372451554525cfb877c302a88a1df111a3dec64d0abe2d5dd not found: ID does not exist" containerID="cfd9d9bff16dc1b372451554525cfb877c302a88a1df111a3dec64d0abe2d5dd" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.954414 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfd9d9bff16dc1b372451554525cfb877c302a88a1df111a3dec64d0abe2d5dd"} err="failed to get container status \"cfd9d9bff16dc1b372451554525cfb877c302a88a1df111a3dec64d0abe2d5dd\": rpc error: code = NotFound desc = could not find container \"cfd9d9bff16dc1b372451554525cfb877c302a88a1df111a3dec64d0abe2d5dd\": container with ID starting with cfd9d9bff16dc1b372451554525cfb877c302a88a1df111a3dec64d0abe2d5dd not found: ID does not exist" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.954435 4816 scope.go:117] "RemoveContainer" containerID="1a5cccec1988de28bd2809ac4b5b0048b290948debe0553adf7fdb6a721fdf61" Mar 11 
12:22:51 crc kubenswrapper[4816]: E0311 12:22:51.954867 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a5cccec1988de28bd2809ac4b5b0048b290948debe0553adf7fdb6a721fdf61\": container with ID starting with 1a5cccec1988de28bd2809ac4b5b0048b290948debe0553adf7fdb6a721fdf61 not found: ID does not exist" containerID="1a5cccec1988de28bd2809ac4b5b0048b290948debe0553adf7fdb6a721fdf61" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.954897 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a5cccec1988de28bd2809ac4b5b0048b290948debe0553adf7fdb6a721fdf61"} err="failed to get container status \"1a5cccec1988de28bd2809ac4b5b0048b290948debe0553adf7fdb6a721fdf61\": rpc error: code = NotFound desc = could not find container \"1a5cccec1988de28bd2809ac4b5b0048b290948debe0553adf7fdb6a721fdf61\": container with ID starting with 1a5cccec1988de28bd2809ac4b5b0048b290948debe0553adf7fdb6a721fdf61 not found: ID does not exist" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.954915 4816 scope.go:117] "RemoveContainer" containerID="25b01234ff673f68a2a7d9f83db659ac9f58778b1b6460a3b9e17bc11c9e8477" Mar 11 12:22:51 crc kubenswrapper[4816]: E0311 12:22:51.955408 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25b01234ff673f68a2a7d9f83db659ac9f58778b1b6460a3b9e17bc11c9e8477\": container with ID starting with 25b01234ff673f68a2a7d9f83db659ac9f58778b1b6460a3b9e17bc11c9e8477 not found: ID does not exist" containerID="25b01234ff673f68a2a7d9f83db659ac9f58778b1b6460a3b9e17bc11c9e8477" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.955437 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25b01234ff673f68a2a7d9f83db659ac9f58778b1b6460a3b9e17bc11c9e8477"} err="failed to get container status 
\"25b01234ff673f68a2a7d9f83db659ac9f58778b1b6460a3b9e17bc11c9e8477\": rpc error: code = NotFound desc = could not find container \"25b01234ff673f68a2a7d9f83db659ac9f58778b1b6460a3b9e17bc11c9e8477\": container with ID starting with 25b01234ff673f68a2a7d9f83db659ac9f58778b1b6460a3b9e17bc11c9e8477 not found: ID does not exist" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.955456 4816 scope.go:117] "RemoveContainer" containerID="3b5ce1950c94241c7d8db075f74a9e25d16f22897d67828fc597eed2fd2ba2d4" Mar 11 12:22:51 crc kubenswrapper[4816]: E0311 12:22:51.955759 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b5ce1950c94241c7d8db075f74a9e25d16f22897d67828fc597eed2fd2ba2d4\": container with ID starting with 3b5ce1950c94241c7d8db075f74a9e25d16f22897d67828fc597eed2fd2ba2d4 not found: ID does not exist" containerID="3b5ce1950c94241c7d8db075f74a9e25d16f22897d67828fc597eed2fd2ba2d4" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.955793 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b5ce1950c94241c7d8db075f74a9e25d16f22897d67828fc597eed2fd2ba2d4"} err="failed to get container status \"3b5ce1950c94241c7d8db075f74a9e25d16f22897d67828fc597eed2fd2ba2d4\": rpc error: code = NotFound desc = could not find container \"3b5ce1950c94241c7d8db075f74a9e25d16f22897d67828fc597eed2fd2ba2d4\": container with ID starting with 3b5ce1950c94241c7d8db075f74a9e25d16f22897d67828fc597eed2fd2ba2d4 not found: ID does not exist" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.955813 4816 scope.go:117] "RemoveContainer" containerID="fbc40b5edb4819684be613e55b321d899bc5b2698e897cf6eda8f15eae8281db" Mar 11 12:22:51 crc kubenswrapper[4816]: E0311 12:22:51.956071 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"fbc40b5edb4819684be613e55b321d899bc5b2698e897cf6eda8f15eae8281db\": container with ID starting with fbc40b5edb4819684be613e55b321d899bc5b2698e897cf6eda8f15eae8281db not found: ID does not exist" containerID="fbc40b5edb4819684be613e55b321d899bc5b2698e897cf6eda8f15eae8281db" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.956100 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbc40b5edb4819684be613e55b321d899bc5b2698e897cf6eda8f15eae8281db"} err="failed to get container status \"fbc40b5edb4819684be613e55b321d899bc5b2698e897cf6eda8f15eae8281db\": rpc error: code = NotFound desc = could not find container \"fbc40b5edb4819684be613e55b321d899bc5b2698e897cf6eda8f15eae8281db\": container with ID starting with fbc40b5edb4819684be613e55b321d899bc5b2698e897cf6eda8f15eae8281db not found: ID does not exist" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.956117 4816 scope.go:117] "RemoveContainer" containerID="bed6744a1fb9636a9fc4ea915948476f2eb984fea2bdb9d698c12e5780346190" Mar 11 12:22:51 crc kubenswrapper[4816]: E0311 12:22:51.956739 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bed6744a1fb9636a9fc4ea915948476f2eb984fea2bdb9d698c12e5780346190\": container with ID starting with bed6744a1fb9636a9fc4ea915948476f2eb984fea2bdb9d698c12e5780346190 not found: ID does not exist" containerID="bed6744a1fb9636a9fc4ea915948476f2eb984fea2bdb9d698c12e5780346190" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.956816 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bed6744a1fb9636a9fc4ea915948476f2eb984fea2bdb9d698c12e5780346190"} err="failed to get container status \"bed6744a1fb9636a9fc4ea915948476f2eb984fea2bdb9d698c12e5780346190\": rpc error: code = NotFound desc = could not find container \"bed6744a1fb9636a9fc4ea915948476f2eb984fea2bdb9d698c12e5780346190\": container with ID 
starting with bed6744a1fb9636a9fc4ea915948476f2eb984fea2bdb9d698c12e5780346190 not found: ID does not exist" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.956879 4816 scope.go:117] "RemoveContainer" containerID="a0947e58e27e62d7256e48c3a5ba6d36f58462add34b2f1281e8c3da0f4574e1" Mar 11 12:22:51 crc kubenswrapper[4816]: E0311 12:22:51.957335 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0947e58e27e62d7256e48c3a5ba6d36f58462add34b2f1281e8c3da0f4574e1\": container with ID starting with a0947e58e27e62d7256e48c3a5ba6d36f58462add34b2f1281e8c3da0f4574e1 not found: ID does not exist" containerID="a0947e58e27e62d7256e48c3a5ba6d36f58462add34b2f1281e8c3da0f4574e1" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.957370 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0947e58e27e62d7256e48c3a5ba6d36f58462add34b2f1281e8c3da0f4574e1"} err="failed to get container status \"a0947e58e27e62d7256e48c3a5ba6d36f58462add34b2f1281e8c3da0f4574e1\": rpc error: code = NotFound desc = could not find container \"a0947e58e27e62d7256e48c3a5ba6d36f58462add34b2f1281e8c3da0f4574e1\": container with ID starting with a0947e58e27e62d7256e48c3a5ba6d36f58462add34b2f1281e8c3da0f4574e1 not found: ID does not exist" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.957393 4816 scope.go:117] "RemoveContainer" containerID="424b40cca785fdb6cef5ca70bab8c7fb8928ab75e5bb80b8b1faf2c2da22fdaf" Mar 11 12:22:51 crc kubenswrapper[4816]: E0311 12:22:51.957776 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"424b40cca785fdb6cef5ca70bab8c7fb8928ab75e5bb80b8b1faf2c2da22fdaf\": container with ID starting with 424b40cca785fdb6cef5ca70bab8c7fb8928ab75e5bb80b8b1faf2c2da22fdaf not found: ID does not exist" containerID="424b40cca785fdb6cef5ca70bab8c7fb8928ab75e5bb80b8b1faf2c2da22fdaf" Mar 11 
12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.957826 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"424b40cca785fdb6cef5ca70bab8c7fb8928ab75e5bb80b8b1faf2c2da22fdaf"} err="failed to get container status \"424b40cca785fdb6cef5ca70bab8c7fb8928ab75e5bb80b8b1faf2c2da22fdaf\": rpc error: code = NotFound desc = could not find container \"424b40cca785fdb6cef5ca70bab8c7fb8928ab75e5bb80b8b1faf2c2da22fdaf\": container with ID starting with 424b40cca785fdb6cef5ca70bab8c7fb8928ab75e5bb80b8b1faf2c2da22fdaf not found: ID does not exist" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.957863 4816 scope.go:117] "RemoveContainer" containerID="9aa4725cabfa8b52948323edfacbce1db8fbe4349baf7e60df04631c4c07e000" Mar 11 12:22:51 crc kubenswrapper[4816]: E0311 12:22:51.958310 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9aa4725cabfa8b52948323edfacbce1db8fbe4349baf7e60df04631c4c07e000\": container with ID starting with 9aa4725cabfa8b52948323edfacbce1db8fbe4349baf7e60df04631c4c07e000 not found: ID does not exist" containerID="9aa4725cabfa8b52948323edfacbce1db8fbe4349baf7e60df04631c4c07e000" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.958339 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9aa4725cabfa8b52948323edfacbce1db8fbe4349baf7e60df04631c4c07e000"} err="failed to get container status \"9aa4725cabfa8b52948323edfacbce1db8fbe4349baf7e60df04631c4c07e000\": rpc error: code = NotFound desc = could not find container \"9aa4725cabfa8b52948323edfacbce1db8fbe4349baf7e60df04631c4c07e000\": container with ID starting with 9aa4725cabfa8b52948323edfacbce1db8fbe4349baf7e60df04631c4c07e000 not found: ID does not exist" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.958357 4816 scope.go:117] "RemoveContainer" 
containerID="712d42df455f320b81d9b4c5385e08e78c8fffd9af1f0f1a30be961c52606280" Mar 11 12:22:51 crc kubenswrapper[4816]: E0311 12:22:51.958646 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"712d42df455f320b81d9b4c5385e08e78c8fffd9af1f0f1a30be961c52606280\": container with ID starting with 712d42df455f320b81d9b4c5385e08e78c8fffd9af1f0f1a30be961c52606280 not found: ID does not exist" containerID="712d42df455f320b81d9b4c5385e08e78c8fffd9af1f0f1a30be961c52606280" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.958678 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"712d42df455f320b81d9b4c5385e08e78c8fffd9af1f0f1a30be961c52606280"} err="failed to get container status \"712d42df455f320b81d9b4c5385e08e78c8fffd9af1f0f1a30be961c52606280\": rpc error: code = NotFound desc = could not find container \"712d42df455f320b81d9b4c5385e08e78c8fffd9af1f0f1a30be961c52606280\": container with ID starting with 712d42df455f320b81d9b4c5385e08e78c8fffd9af1f0f1a30be961c52606280 not found: ID does not exist" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.958699 4816 scope.go:117] "RemoveContainer" containerID="acad9fd17d268a24643ea62be228693020bbd2da3c63a2bc6d162877b0366898" Mar 11 12:22:51 crc kubenswrapper[4816]: E0311 12:22:51.958953 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acad9fd17d268a24643ea62be228693020bbd2da3c63a2bc6d162877b0366898\": container with ID starting with acad9fd17d268a24643ea62be228693020bbd2da3c63a2bc6d162877b0366898 not found: ID does not exist" containerID="acad9fd17d268a24643ea62be228693020bbd2da3c63a2bc6d162877b0366898" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.958985 4816 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"acad9fd17d268a24643ea62be228693020bbd2da3c63a2bc6d162877b0366898"} err="failed to get container status \"acad9fd17d268a24643ea62be228693020bbd2da3c63a2bc6d162877b0366898\": rpc error: code = NotFound desc = could not find container \"acad9fd17d268a24643ea62be228693020bbd2da3c63a2bc6d162877b0366898\": container with ID starting with acad9fd17d268a24643ea62be228693020bbd2da3c63a2bc6d162877b0366898 not found: ID does not exist" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.959001 4816 scope.go:117] "RemoveContainer" containerID="d0ccfde3e8badc0e6b92993021ad07fe9ae8e33939c137e6eb3bcf22e04b1ea6" Mar 11 12:22:51 crc kubenswrapper[4816]: E0311 12:22:51.959361 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0ccfde3e8badc0e6b92993021ad07fe9ae8e33939c137e6eb3bcf22e04b1ea6\": container with ID starting with d0ccfde3e8badc0e6b92993021ad07fe9ae8e33939c137e6eb3bcf22e04b1ea6 not found: ID does not exist" containerID="d0ccfde3e8badc0e6b92993021ad07fe9ae8e33939c137e6eb3bcf22e04b1ea6" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.959390 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0ccfde3e8badc0e6b92993021ad07fe9ae8e33939c137e6eb3bcf22e04b1ea6"} err="failed to get container status \"d0ccfde3e8badc0e6b92993021ad07fe9ae8e33939c137e6eb3bcf22e04b1ea6\": rpc error: code = NotFound desc = could not find container \"d0ccfde3e8badc0e6b92993021ad07fe9ae8e33939c137e6eb3bcf22e04b1ea6\": container with ID starting with d0ccfde3e8badc0e6b92993021ad07fe9ae8e33939c137e6eb3bcf22e04b1ea6 not found: ID does not exist" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.959413 4816 scope.go:117] "RemoveContainer" containerID="e84af5bcfa14831e3963b52fe73c49f1f89ea652b5b69cd65dfb4008756c4c2d" Mar 11 12:22:51 crc kubenswrapper[4816]: E0311 12:22:51.959816 4816 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"e84af5bcfa14831e3963b52fe73c49f1f89ea652b5b69cd65dfb4008756c4c2d\": container with ID starting with e84af5bcfa14831e3963b52fe73c49f1f89ea652b5b69cd65dfb4008756c4c2d not found: ID does not exist" containerID="e84af5bcfa14831e3963b52fe73c49f1f89ea652b5b69cd65dfb4008756c4c2d" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.959845 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e84af5bcfa14831e3963b52fe73c49f1f89ea652b5b69cd65dfb4008756c4c2d"} err="failed to get container status \"e84af5bcfa14831e3963b52fe73c49f1f89ea652b5b69cd65dfb4008756c4c2d\": rpc error: code = NotFound desc = could not find container \"e84af5bcfa14831e3963b52fe73c49f1f89ea652b5b69cd65dfb4008756c4c2d\": container with ID starting with e84af5bcfa14831e3963b52fe73c49f1f89ea652b5b69cd65dfb4008756c4c2d not found: ID does not exist" Mar 11 12:22:52 crc kubenswrapper[4816]: I0311 12:22:52.148223 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" path="/var/lib/kubelet/pods/485f9fbd-e0ca-472d-b97c-87c127253a96/volumes" Mar 11 12:22:52 crc kubenswrapper[4816]: I0311 12:22:52.151309 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="594ad696-b727-4153-979f-d32ccdc1fe83" path="/var/lib/kubelet/pods/594ad696-b727-4153-979f-d32ccdc1fe83/volumes" Mar 11 12:22:52 crc kubenswrapper[4816]: I0311 12:22:52.152211 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edc01aa4-013d-4d10-9f22-e5f319e6c1a3" path="/var/lib/kubelet/pods/edc01aa4-013d-4d10-9f22-e5f319e6c1a3/volumes" Mar 11 12:22:58 crc kubenswrapper[4816]: I0311 12:22:58.350406 4816 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podda177cde-6332-4562-809a-d4bee453cebf"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort 
podda177cde-6332-4562-809a-d4bee453cebf] : Timed out while waiting for systemd to remove kubepods-besteffort-podda177cde_6332_4562_809a_d4bee453cebf.slice" Mar 11 12:22:58 crc kubenswrapper[4816]: E0311 12:22:58.351159 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort podda177cde-6332-4562-809a-d4bee453cebf] : unable to destroy cgroup paths for cgroup [kubepods besteffort podda177cde-6332-4562-809a-d4bee453cebf] : Timed out while waiting for systemd to remove kubepods-besteffort-podda177cde_6332_4562_809a_d4bee453cebf.slice" pod="openstack/openstack-galera-0" podUID="da177cde-6332-4562-809a-d4bee453cebf" Mar 11 12:22:58 crc kubenswrapper[4816]: I0311 12:22:58.701440 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 11 12:22:58 crc kubenswrapper[4816]: I0311 12:22:58.754076 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Mar 11 12:22:58 crc kubenswrapper[4816]: I0311 12:22:58.759929 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Mar 11 12:23:00 crc kubenswrapper[4816]: I0311 12:23:00.139907 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da177cde-6332-4562-809a-d4bee453cebf" path="/var/lib/kubelet/pods/da177cde-6332-4562-809a-d4bee453cebf/volumes" Mar 11 12:23:45 crc kubenswrapper[4816]: I0311 12:23:45.200121 4816 scope.go:117] "RemoveContainer" containerID="0b4c4c1c298f57878044bac49cc49a719acfc3a0f87a1803c19c539d85446637" Mar 11 12:23:45 crc kubenswrapper[4816]: I0311 12:23:45.228091 4816 scope.go:117] "RemoveContainer" containerID="8c240088bce92d648a44cfc826778c591f6601fbc70cdbc9325a1348704e1a92" Mar 11 12:23:45 crc kubenswrapper[4816]: I0311 12:23:45.250908 4816 scope.go:117] "RemoveContainer" containerID="c460fb14090c9d550203cf386e04b06e3563514702df072e42ad5fc80f7e1872" Mar 11 12:23:45 crc kubenswrapper[4816]: I0311 
12:23:45.277393 4816 scope.go:117] "RemoveContainer" containerID="0de73c3da519dc3d23fdd410a58406f0ff5aec8f4b5e6483b5c4a546f3b60ef0" Mar 11 12:23:45 crc kubenswrapper[4816]: I0311 12:23:45.307116 4816 scope.go:117] "RemoveContainer" containerID="c9628d19e1e9c78361e9677b8afa40ad86295ad47aa0110e4b51ead3233c90ca" Mar 11 12:23:45 crc kubenswrapper[4816]: I0311 12:23:45.349655 4816 scope.go:117] "RemoveContainer" containerID="a8f8ba02ac608528a8da635158a48ff55377bd4734bbd746e513b637d5d907d3" Mar 11 12:23:45 crc kubenswrapper[4816]: I0311 12:23:45.386647 4816 scope.go:117] "RemoveContainer" containerID="d63f60636f6e53982a24004e405c34ed67500a9193f04b98e8d29856c8e89ee2" Mar 11 12:23:45 crc kubenswrapper[4816]: I0311 12:23:45.419309 4816 scope.go:117] "RemoveContainer" containerID="3dfe4dd28e66c33830345db1226180f842f3ae3d59f4fa3a4c553af39dd07c67" Mar 11 12:23:45 crc kubenswrapper[4816]: I0311 12:23:45.439218 4816 scope.go:117] "RemoveContainer" containerID="5140353c5c6034db1623dc2f3c189d72ec962703a0a91d22d2e279ead073afac" Mar 11 12:23:45 crc kubenswrapper[4816]: I0311 12:23:45.459670 4816 scope.go:117] "RemoveContainer" containerID="ab6525891e160f8b83901124157238a30564c85220f9440c25fb3222634839c7" Mar 11 12:23:45 crc kubenswrapper[4816]: I0311 12:23:45.483207 4816 scope.go:117] "RemoveContainer" containerID="d6e7d3be2f695e55ef6abf84c83d060683eae93e0020c12fe8744829cbcc1d6a" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.387468 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pzzv2"] Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.388424 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="container-auditor" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388442 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="container-auditor" Mar 11 12:23:50 crc kubenswrapper[4816]: 
E0311 12:23:50.388470 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c94c19c-3ccb-43cc-ab41-92baa3141f73" containerName="cinder-api" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388477 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c94c19c-3ccb-43cc-ab41-92baa3141f73" containerName="cinder-api" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.388489 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c71feeeb-a44d-42ec-a4c7-ddbf9a76f825" containerName="openstack-network-exporter" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388499 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="c71feeeb-a44d-42ec-a4c7-ddbf9a76f825" containerName="openstack-network-exporter" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.388506 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="object-replicator" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388513 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="object-replicator" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.388523 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32dcc96b-186a-444d-bef3-4c5f117ee652" containerName="kube-state-metrics" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388532 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="32dcc96b-186a-444d-bef3-4c5f117ee652" containerName="kube-state-metrics" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.388546 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="594ad696-b727-4153-979f-d32ccdc1fe83" containerName="cinder-scheduler" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388553 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="594ad696-b727-4153-979f-d32ccdc1fe83" containerName="cinder-scheduler" Mar 11 12:23:50 crc 
kubenswrapper[4816]: E0311 12:23:50.388566 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e95ddca0-76d0-4dce-9983-4b07655adc25" containerName="glance-httpd" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388574 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="e95ddca0-76d0-4dce-9983-4b07655adc25" containerName="glance-httpd" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.388582 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bedb612d-0e22-4025-9151-d0cf7bc4ee42" containerName="ceilometer-central-agent" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388588 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="bedb612d-0e22-4025-9151-d0cf7bc4ee42" containerName="ceilometer-central-agent" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.388596 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bedb612d-0e22-4025-9151-d0cf7bc4ee42" containerName="sg-core" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388602 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="bedb612d-0e22-4025-9151-d0cf7bc4ee42" containerName="sg-core" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.388615 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="container-updater" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388625 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="container-updater" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.388635 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="account-reaper" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388642 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="account-reaper" Mar 11 12:23:50 crc 
kubenswrapper[4816]: E0311 12:23:50.388657 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41f4b502-b85f-488c-b55b-27a31479df68" containerName="nova-scheduler-scheduler" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388664 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="41f4b502-b85f-488c-b55b-27a31479df68" containerName="nova-scheduler-scheduler" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.388675 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7457f2db-7979-4d92-bd90-a1464b8a3878" containerName="glance-httpd" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388682 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="7457f2db-7979-4d92-bd90-a1464b8a3878" containerName="glance-httpd" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.388700 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bd939d8-3b22-4496-acea-ac527f3e5149" containerName="placement-api" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388707 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bd939d8-3b22-4496-acea-ac527f3e5149" containerName="placement-api" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.388751 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a22173f-147b-46ac-bb01-596fe9f12b10" containerName="mysql-bootstrap" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388760 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a22173f-147b-46ac-bb01-596fe9f12b10" containerName="mysql-bootstrap" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.388772 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3779c0f5-9084-4c07-83d9-fe2017559f7b" containerName="setup-container" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388780 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="3779c0f5-9084-4c07-83d9-fe2017559f7b" containerName="setup-container" Mar 11 12:23:50 crc 
kubenswrapper[4816]: E0311 12:23:50.388790 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c180505-72c6-498d-bfa5-05f689692bd2" containerName="keystone-api" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388799 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c180505-72c6-498d-bfa5-05f689692bd2" containerName="keystone-api" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.388811 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e95ddca0-76d0-4dce-9983-4b07655adc25" containerName="glance-log" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388821 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="e95ddca0-76d0-4dce-9983-4b07655adc25" containerName="glance-log" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.388832 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="container-replicator" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388839 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="container-replicator" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.388852 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7795071e-2de0-43cb-b225-cfed54570d94" containerName="barbican-api-log" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388859 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="7795071e-2de0-43cb-b225-cfed54570d94" containerName="barbican-api-log" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.388867 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5030028c-f574-4334-a837-2430761524b4" containerName="memcached" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388874 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="5030028c-f574-4334-a837-2430761524b4" containerName="memcached" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 
12:23:50.388885 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da177cde-6332-4562-809a-d4bee453cebf" containerName="galera" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388892 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="da177cde-6332-4562-809a-d4bee453cebf" containerName="galera" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.388901 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edc01aa4-013d-4d10-9f22-e5f319e6c1a3" containerName="ovs-vswitchd" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388908 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="edc01aa4-013d-4d10-9f22-e5f319e6c1a3" containerName="ovs-vswitchd" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.388919 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c71feeeb-a44d-42ec-a4c7-ddbf9a76f825" containerName="ovn-northd" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388926 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="c71feeeb-a44d-42ec-a4c7-ddbf9a76f825" containerName="ovn-northd" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.388932 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7795071e-2de0-43cb-b225-cfed54570d94" containerName="barbican-api" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388940 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="7795071e-2de0-43cb-b225-cfed54570d94" containerName="barbican-api" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.388952 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="account-replicator" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388959 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="account-replicator" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.388969 4816 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="b79e89c6-5f56-4439-ad63-a86259d4ed29" containerName="barbican-worker-log" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388976 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="b79e89c6-5f56-4439-ad63-a86259d4ed29" containerName="barbican-worker-log" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.388986 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6833f8f-2414-42cd-b7c2-4d4a70fd8d46" containerName="neutron-api" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388994 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6833f8f-2414-42cd-b7c2-4d4a70fd8d46" containerName="neutron-api" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389003 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="object-updater" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389012 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="object-updater" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389024 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="594ad696-b727-4153-979f-d32ccdc1fe83" containerName="probe" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389031 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="594ad696-b727-4153-979f-d32ccdc1fe83" containerName="probe" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389044 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b79e89c6-5f56-4439-ad63-a86259d4ed29" containerName="barbican-worker" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389051 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="b79e89c6-5f56-4439-ad63-a86259d4ed29" containerName="barbican-worker" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389063 4816 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="26aea2df-f497-478d-b953-060189ef2569" containerName="setup-container" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389070 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="26aea2df-f497-478d-b953-060189ef2569" containerName="setup-container" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389081 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edc01aa4-013d-4d10-9f22-e5f319e6c1a3" containerName="ovsdb-server-init" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389088 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="edc01aa4-013d-4d10-9f22-e5f319e6c1a3" containerName="ovsdb-server-init" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389099 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="account-server" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389106 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="account-server" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389118 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="object-server" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389126 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="object-server" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389138 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bedb612d-0e22-4025-9151-d0cf7bc4ee42" containerName="proxy-httpd" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389146 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="bedb612d-0e22-4025-9151-d0cf7bc4ee42" containerName="proxy-httpd" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389157 4816 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9a22173f-147b-46ac-bb01-596fe9f12b10" containerName="galera" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389164 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a22173f-147b-46ac-bb01-596fe9f12b10" containerName="galera" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389172 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edc01aa4-013d-4d10-9f22-e5f319e6c1a3" containerName="ovsdb-server" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389180 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="edc01aa4-013d-4d10-9f22-e5f319e6c1a3" containerName="ovsdb-server" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389189 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d73d9d0-5632-47a3-93e0-899f64f51011" containerName="nova-metadata-log" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389197 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d73d9d0-5632-47a3-93e0-899f64f51011" containerName="nova-metadata-log" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389208 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da177cde-6332-4562-809a-d4bee453cebf" containerName="mysql-bootstrap" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389216 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="da177cde-6332-4562-809a-d4bee453cebf" containerName="mysql-bootstrap" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389228 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="rsync" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389236 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="rsync" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389267 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6833f8f-2414-42cd-b7c2-4d4a70fd8d46" 
containerName="neutron-httpd" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389276 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6833f8f-2414-42cd-b7c2-4d4a70fd8d46" containerName="neutron-httpd" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389289 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26aea2df-f497-478d-b953-060189ef2569" containerName="rabbitmq" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389296 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="26aea2df-f497-478d-b953-060189ef2569" containerName="rabbitmq" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389305 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d28745d2-082d-4c99-90f0-b6c4696fb1a2" containerName="nova-api-log" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389313 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="d28745d2-082d-4c99-90f0-b6c4696fb1a2" containerName="nova-api-log" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389328 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="account-auditor" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389336 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="account-auditor" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389348 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="object-expirer" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389356 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="object-expirer" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389365 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bd939d8-3b22-4496-acea-ac527f3e5149" containerName="placement-log" Mar 11 
12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389373 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bd939d8-3b22-4496-acea-ac527f3e5149" containerName="placement-log" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389387 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7457f2db-7979-4d92-bd90-a1464b8a3878" containerName="glance-log" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389395 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="7457f2db-7979-4d92-bd90-a1464b8a3878" containerName="glance-log" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389407 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c94c19c-3ccb-43cc-ab41-92baa3141f73" containerName="cinder-api-log" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389415 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c94c19c-3ccb-43cc-ab41-92baa3141f73" containerName="cinder-api-log" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389431 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bedb612d-0e22-4025-9151-d0cf7bc4ee42" containerName="ceilometer-notification-agent" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389440 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="bedb612d-0e22-4025-9151-d0cf7bc4ee42" containerName="ceilometer-notification-agent" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389449 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d28745d2-082d-4c99-90f0-b6c4696fb1a2" containerName="nova-api-api" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389517 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="d28745d2-082d-4c99-90f0-b6c4696fb1a2" containerName="nova-api-api" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389538 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="object-auditor" Mar 11 
12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389545 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="object-auditor" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389586 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63567eba-cc2a-4168-9e81-51c1daed5482" containerName="nova-cell1-conductor-conductor" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389596 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="63567eba-cc2a-4168-9e81-51c1daed5482" containerName="nova-cell1-conductor-conductor" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389607 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="container-server" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389615 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="container-server" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389624 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d73d9d0-5632-47a3-93e0-899f64f51011" containerName="nova-metadata-metadata" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389634 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d73d9d0-5632-47a3-93e0-899f64f51011" containerName="nova-metadata-metadata" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389645 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3779c0f5-9084-4c07-83d9-fe2017559f7b" containerName="rabbitmq" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389653 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="3779c0f5-9084-4c07-83d9-fe2017559f7b" containerName="rabbitmq" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389662 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" 
containerName="swift-recon-cron"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389669 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="swift-recon-cron"
Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389696 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc" containerName="nova-cell0-conductor-conductor"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389705 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc" containerName="nova-cell0-conductor-conductor"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390095 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="b79e89c6-5f56-4439-ad63-a86259d4ed29" containerName="barbican-worker"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390136 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c180505-72c6-498d-bfa5-05f689692bd2" containerName="keystone-api"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390175 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d73d9d0-5632-47a3-93e0-899f64f51011" containerName="nova-metadata-metadata"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390190 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="edc01aa4-013d-4d10-9f22-e5f319e6c1a3" containerName="ovs-vswitchd"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390202 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="object-expirer"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390215 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d73d9d0-5632-47a3-93e0-899f64f51011" containerName="nova-metadata-log"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390676 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6833f8f-2414-42cd-b7c2-4d4a70fd8d46" containerName="neutron-api"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390702 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="account-server"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390730 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="32dcc96b-186a-444d-bef3-4c5f117ee652" containerName="kube-state-metrics"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390741 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="edc01aa4-013d-4d10-9f22-e5f319e6c1a3" containerName="ovsdb-server"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390752 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="63567eba-cc2a-4168-9e81-51c1daed5482" containerName="nova-cell1-conductor-conductor"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390767 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="7457f2db-7979-4d92-bd90-a1464b8a3878" containerName="glance-httpd"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390778 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c94c19c-3ccb-43cc-ab41-92baa3141f73" containerName="cinder-api-log"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390793 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6833f8f-2414-42cd-b7c2-4d4a70fd8d46" containerName="neutron-httpd"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390804 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="3779c0f5-9084-4c07-83d9-fe2017559f7b" containerName="rabbitmq"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390814 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="object-server"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390825 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="594ad696-b727-4153-979f-d32ccdc1fe83" containerName="cinder-scheduler"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390839 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="c71feeeb-a44d-42ec-a4c7-ddbf9a76f825" containerName="openstack-network-exporter"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390849 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="bedb612d-0e22-4025-9151-d0cf7bc4ee42" containerName="proxy-httpd"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390857 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a22173f-147b-46ac-bb01-596fe9f12b10" containerName="galera"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390868 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bd939d8-3b22-4496-acea-ac527f3e5149" containerName="placement-api"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390876 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="7795071e-2de0-43cb-b225-cfed54570d94" containerName="barbican-api-log"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390885 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="account-reaper"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390899 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="da177cde-6332-4562-809a-d4bee453cebf" containerName="galera"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390907 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="594ad696-b727-4153-979f-d32ccdc1fe83" containerName="probe"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390916 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="container-updater"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390925 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="rsync"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390934 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="account-auditor"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390942 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="26aea2df-f497-478d-b953-060189ef2569" containerName="rabbitmq"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390949 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="object-auditor"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390962 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="d28745d2-082d-4c99-90f0-b6c4696fb1a2" containerName="nova-api-log"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390973 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="bedb612d-0e22-4025-9151-d0cf7bc4ee42" containerName="ceilometer-notification-agent"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390982 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="container-replicator"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390990 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="swift-recon-cron"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.391000 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="container-auditor"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.391012 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="7795071e-2de0-43cb-b225-cfed54570d94" containerName="barbican-api"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.391021 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="7457f2db-7979-4d92-bd90-a1464b8a3878" containerName="glance-log"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.391032 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="object-replicator"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.391043 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c94c19c-3ccb-43cc-ab41-92baa3141f73" containerName="cinder-api"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.391055 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="object-updater"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.391063 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="41f4b502-b85f-488c-b55b-27a31479df68" containerName="nova-scheduler-scheduler"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.391072 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="5030028c-f574-4334-a837-2430761524b4" containerName="memcached"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.391083 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="e95ddca0-76d0-4dce-9983-4b07655adc25" containerName="glance-log"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.391094 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="account-replicator"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.391104 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bd939d8-3b22-4496-acea-ac527f3e5149" containerName="placement-log"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.391116 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="c71feeeb-a44d-42ec-a4c7-ddbf9a76f825" containerName="ovn-northd"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.391125 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc" containerName="nova-cell0-conductor-conductor"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.391134 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="b79e89c6-5f56-4439-ad63-a86259d4ed29" containerName="barbican-worker-log"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.391142 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="bedb612d-0e22-4025-9151-d0cf7bc4ee42" containerName="ceilometer-central-agent"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.391154 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="d28745d2-082d-4c99-90f0-b6c4696fb1a2" containerName="nova-api-api"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.391163 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="container-server"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.391172 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="e95ddca0-76d0-4dce-9983-4b07655adc25" containerName="glance-httpd"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.391184 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="bedb612d-0e22-4025-9151-d0cf7bc4ee42" containerName="sg-core"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.392913 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pzzv2"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.403089 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pzzv2"]
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.495746 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f7330c4-f58c-4f88-b3bb-57a9330ff446-catalog-content\") pod \"community-operators-pzzv2\" (UID: \"1f7330c4-f58c-4f88-b3bb-57a9330ff446\") " pod="openshift-marketplace/community-operators-pzzv2"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.496329 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f7330c4-f58c-4f88-b3bb-57a9330ff446-utilities\") pod \"community-operators-pzzv2\" (UID: \"1f7330c4-f58c-4f88-b3bb-57a9330ff446\") " pod="openshift-marketplace/community-operators-pzzv2"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.496470 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6jcl\" (UniqueName: \"kubernetes.io/projected/1f7330c4-f58c-4f88-b3bb-57a9330ff446-kube-api-access-d6jcl\") pod \"community-operators-pzzv2\" (UID: \"1f7330c4-f58c-4f88-b3bb-57a9330ff446\") " pod="openshift-marketplace/community-operators-pzzv2"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.598394 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f7330c4-f58c-4f88-b3bb-57a9330ff446-catalog-content\") pod \"community-operators-pzzv2\" (UID: \"1f7330c4-f58c-4f88-b3bb-57a9330ff446\") " pod="openshift-marketplace/community-operators-pzzv2"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.598504 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f7330c4-f58c-4f88-b3bb-57a9330ff446-utilities\") pod \"community-operators-pzzv2\" (UID: \"1f7330c4-f58c-4f88-b3bb-57a9330ff446\") " pod="openshift-marketplace/community-operators-pzzv2"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.598561 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6jcl\" (UniqueName: \"kubernetes.io/projected/1f7330c4-f58c-4f88-b3bb-57a9330ff446-kube-api-access-d6jcl\") pod \"community-operators-pzzv2\" (UID: \"1f7330c4-f58c-4f88-b3bb-57a9330ff446\") " pod="openshift-marketplace/community-operators-pzzv2"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.599007 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f7330c4-f58c-4f88-b3bb-57a9330ff446-catalog-content\") pod \"community-operators-pzzv2\" (UID: \"1f7330c4-f58c-4f88-b3bb-57a9330ff446\") " pod="openshift-marketplace/community-operators-pzzv2"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.599340 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f7330c4-f58c-4f88-b3bb-57a9330ff446-utilities\") pod \"community-operators-pzzv2\" (UID: \"1f7330c4-f58c-4f88-b3bb-57a9330ff446\") " pod="openshift-marketplace/community-operators-pzzv2"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.619081 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6jcl\" (UniqueName: \"kubernetes.io/projected/1f7330c4-f58c-4f88-b3bb-57a9330ff446-kube-api-access-d6jcl\") pod \"community-operators-pzzv2\" (UID: \"1f7330c4-f58c-4f88-b3bb-57a9330ff446\") " pod="openshift-marketplace/community-operators-pzzv2"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.716511 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pzzv2"
Mar 11 12:23:51 crc kubenswrapper[4816]: I0311 12:23:51.245264 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pzzv2"]
Mar 11 12:23:51 crc kubenswrapper[4816]: W0311 12:23:51.248216 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f7330c4_f58c_4f88_b3bb_57a9330ff446.slice/crio-43e44bc48337ed8b379562414b926c9f09f2c6ffa9dd0f3c3749a4ce1d503221 WatchSource:0}: Error finding container 43e44bc48337ed8b379562414b926c9f09f2c6ffa9dd0f3c3749a4ce1d503221: Status 404 returned error can't find the container with id 43e44bc48337ed8b379562414b926c9f09f2c6ffa9dd0f3c3749a4ce1d503221
Mar 11 12:23:52 crc kubenswrapper[4816]: I0311 12:23:52.180161 4816 generic.go:334] "Generic (PLEG): container finished" podID="1f7330c4-f58c-4f88-b3bb-57a9330ff446" containerID="dc661cd0f3c21648eef4e6515cef8363a39276b7d6c63b205f7150e0f2074c47" exitCode=0
Mar 11 12:23:52 crc kubenswrapper[4816]: I0311 12:23:52.180288 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzzv2" event={"ID":"1f7330c4-f58c-4f88-b3bb-57a9330ff446","Type":"ContainerDied","Data":"dc661cd0f3c21648eef4e6515cef8363a39276b7d6c63b205f7150e0f2074c47"}
Mar 11 12:23:52 crc kubenswrapper[4816]: I0311 12:23:52.180649 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzzv2" event={"ID":"1f7330c4-f58c-4f88-b3bb-57a9330ff446","Type":"ContainerStarted","Data":"43e44bc48337ed8b379562414b926c9f09f2c6ffa9dd0f3c3749a4ce1d503221"}
Mar 11 12:23:54 crc kubenswrapper[4816]: I0311 12:23:54.202660 4816 generic.go:334] "Generic (PLEG): container finished" podID="1f7330c4-f58c-4f88-b3bb-57a9330ff446" containerID="0718e1f9a971272036b149ae9c1f20dfd8dbb75e572667bcff0ca5b0fa6574a4" exitCode=0
Mar 11 12:23:54 crc kubenswrapper[4816]: I0311 12:23:54.203064 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzzv2" event={"ID":"1f7330c4-f58c-4f88-b3bb-57a9330ff446","Type":"ContainerDied","Data":"0718e1f9a971272036b149ae9c1f20dfd8dbb75e572667bcff0ca5b0fa6574a4"}
Mar 11 12:23:55 crc kubenswrapper[4816]: I0311 12:23:55.214882 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzzv2" event={"ID":"1f7330c4-f58c-4f88-b3bb-57a9330ff446","Type":"ContainerStarted","Data":"604d0a8cddc27e4a447ebf3a8dc36b0eed2b076b129b94d407bc1de3f75a642b"}
Mar 11 12:23:55 crc kubenswrapper[4816]: I0311 12:23:55.238312 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pzzv2" podStartSLOduration=2.7595729159999998 podStartE2EDuration="5.238286644s" podCreationTimestamp="2026-03-11 12:23:50 +0000 UTC" firstStartedPulling="2026-03-11 12:23:52.182111565 +0000 UTC m=+1518.773375532" lastFinishedPulling="2026-03-11 12:23:54.660825293 +0000 UTC m=+1521.252089260" observedRunningTime="2026-03-11 12:23:55.233851866 +0000 UTC m=+1521.825115843" watchObservedRunningTime="2026-03-11 12:23:55.238286644 +0000 UTC m=+1521.829550611"
Mar 11 12:24:00 crc kubenswrapper[4816]: I0311 12:24:00.145294 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553864-r52vk"]
Mar 11 12:24:00 crc kubenswrapper[4816]: I0311 12:24:00.147125 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553864-r52vk"
Mar 11 12:24:00 crc kubenswrapper[4816]: I0311 12:24:00.150239 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 11 12:24:00 crc kubenswrapper[4816]: I0311 12:24:00.151230 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h"
Mar 11 12:24:00 crc kubenswrapper[4816]: I0311 12:24:00.151926 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 11 12:24:00 crc kubenswrapper[4816]: I0311 12:24:00.155614 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553864-r52vk"]
Mar 11 12:24:00 crc kubenswrapper[4816]: I0311 12:24:00.257390 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wldbm\" (UniqueName: \"kubernetes.io/projected/b1c2a96e-0361-49ae-b1d2-795744511b15-kube-api-access-wldbm\") pod \"auto-csr-approver-29553864-r52vk\" (UID: \"b1c2a96e-0361-49ae-b1d2-795744511b15\") " pod="openshift-infra/auto-csr-approver-29553864-r52vk"
Mar 11 12:24:00 crc kubenswrapper[4816]: I0311 12:24:00.359362 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wldbm\" (UniqueName: \"kubernetes.io/projected/b1c2a96e-0361-49ae-b1d2-795744511b15-kube-api-access-wldbm\") pod \"auto-csr-approver-29553864-r52vk\" (UID: \"b1c2a96e-0361-49ae-b1d2-795744511b15\") " pod="openshift-infra/auto-csr-approver-29553864-r52vk"
Mar 11 12:24:00 crc kubenswrapper[4816]: I0311 12:24:00.384906 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wldbm\" (UniqueName: \"kubernetes.io/projected/b1c2a96e-0361-49ae-b1d2-795744511b15-kube-api-access-wldbm\") pod \"auto-csr-approver-29553864-r52vk\" (UID: \"b1c2a96e-0361-49ae-b1d2-795744511b15\") " pod="openshift-infra/auto-csr-approver-29553864-r52vk"
Mar 11 12:24:00 crc kubenswrapper[4816]: I0311 12:24:00.482594 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553864-r52vk"
Mar 11 12:24:00 crc kubenswrapper[4816]: I0311 12:24:00.717037 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pzzv2"
Mar 11 12:24:00 crc kubenswrapper[4816]: I0311 12:24:00.717131 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pzzv2"
Mar 11 12:24:00 crc kubenswrapper[4816]: I0311 12:24:00.768760 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pzzv2"
Mar 11 12:24:00 crc kubenswrapper[4816]: I0311 12:24:00.942672 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553864-r52vk"]
Mar 11 12:24:01 crc kubenswrapper[4816]: I0311 12:24:01.267963 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553864-r52vk" event={"ID":"b1c2a96e-0361-49ae-b1d2-795744511b15","Type":"ContainerStarted","Data":"de6b160ba6a61c023d6fa94cf65dda0513e45282ed69b2420032bf641b5e229b"}
Mar 11 12:24:01 crc kubenswrapper[4816]: I0311 12:24:01.320363 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pzzv2"
Mar 11 12:24:01 crc kubenswrapper[4816]: I0311 12:24:01.388563 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pzzv2"]
Mar 11 12:24:02 crc kubenswrapper[4816]: I0311 12:24:02.284089 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553864-r52vk" event={"ID":"b1c2a96e-0361-49ae-b1d2-795744511b15","Type":"ContainerStarted","Data":"abd9d62fb0ff8a700d3029ac698637da8b39d38073d7e8f33a437d1d746d66d8"}
Mar 11 12:24:02 crc kubenswrapper[4816]: I0311 12:24:02.310331 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553864-r52vk" podStartSLOduration=1.425145175 podStartE2EDuration="2.31030505s" podCreationTimestamp="2026-03-11 12:24:00 +0000 UTC" firstStartedPulling="2026-03-11 12:24:00.949632934 +0000 UTC m=+1527.540896911" lastFinishedPulling="2026-03-11 12:24:01.834792819 +0000 UTC m=+1528.426056786" observedRunningTime="2026-03-11 12:24:02.302348091 +0000 UTC m=+1528.893612068" watchObservedRunningTime="2026-03-11 12:24:02.31030505 +0000 UTC m=+1528.901569037"
Mar 11 12:24:03 crc kubenswrapper[4816]: I0311 12:24:03.293598 4816 generic.go:334] "Generic (PLEG): container finished" podID="b1c2a96e-0361-49ae-b1d2-795744511b15" containerID="abd9d62fb0ff8a700d3029ac698637da8b39d38073d7e8f33a437d1d746d66d8" exitCode=0
Mar 11 12:24:03 crc kubenswrapper[4816]: I0311 12:24:03.293653 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553864-r52vk" event={"ID":"b1c2a96e-0361-49ae-b1d2-795744511b15","Type":"ContainerDied","Data":"abd9d62fb0ff8a700d3029ac698637da8b39d38073d7e8f33a437d1d746d66d8"}
Mar 11 12:24:03 crc kubenswrapper[4816]: I0311 12:24:03.293892 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pzzv2" podUID="1f7330c4-f58c-4f88-b3bb-57a9330ff446" containerName="registry-server" containerID="cri-o://604d0a8cddc27e4a447ebf3a8dc36b0eed2b076b129b94d407bc1de3f75a642b" gracePeriod=2
Mar 11 12:24:03 crc kubenswrapper[4816]: I0311 12:24:03.767645 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pzzv2"
Mar 11 12:24:03 crc kubenswrapper[4816]: I0311 12:24:03.832063 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f7330c4-f58c-4f88-b3bb-57a9330ff446-utilities\") pod \"1f7330c4-f58c-4f88-b3bb-57a9330ff446\" (UID: \"1f7330c4-f58c-4f88-b3bb-57a9330ff446\") "
Mar 11 12:24:03 crc kubenswrapper[4816]: I0311 12:24:03.832199 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6jcl\" (UniqueName: \"kubernetes.io/projected/1f7330c4-f58c-4f88-b3bb-57a9330ff446-kube-api-access-d6jcl\") pod \"1f7330c4-f58c-4f88-b3bb-57a9330ff446\" (UID: \"1f7330c4-f58c-4f88-b3bb-57a9330ff446\") "
Mar 11 12:24:03 crc kubenswrapper[4816]: I0311 12:24:03.832243 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f7330c4-f58c-4f88-b3bb-57a9330ff446-catalog-content\") pod \"1f7330c4-f58c-4f88-b3bb-57a9330ff446\" (UID: \"1f7330c4-f58c-4f88-b3bb-57a9330ff446\") "
Mar 11 12:24:03 crc kubenswrapper[4816]: I0311 12:24:03.833577 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f7330c4-f58c-4f88-b3bb-57a9330ff446-utilities" (OuterVolumeSpecName: "utilities") pod "1f7330c4-f58c-4f88-b3bb-57a9330ff446" (UID: "1f7330c4-f58c-4f88-b3bb-57a9330ff446"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 12:24:03 crc kubenswrapper[4816]: I0311 12:24:03.840566 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f7330c4-f58c-4f88-b3bb-57a9330ff446-kube-api-access-d6jcl" (OuterVolumeSpecName: "kube-api-access-d6jcl") pod "1f7330c4-f58c-4f88-b3bb-57a9330ff446" (UID: "1f7330c4-f58c-4f88-b3bb-57a9330ff446"). InnerVolumeSpecName "kube-api-access-d6jcl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:24:03 crc kubenswrapper[4816]: I0311 12:24:03.934213 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f7330c4-f58c-4f88-b3bb-57a9330ff446-utilities\") on node \"crc\" DevicePath \"\""
Mar 11 12:24:03 crc kubenswrapper[4816]: I0311 12:24:03.934765 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6jcl\" (UniqueName: \"kubernetes.io/projected/1f7330c4-f58c-4f88-b3bb-57a9330ff446-kube-api-access-d6jcl\") on node \"crc\" DevicePath \"\""
Mar 11 12:24:04 crc kubenswrapper[4816]: I0311 12:24:04.303936 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f7330c4-f58c-4f88-b3bb-57a9330ff446-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1f7330c4-f58c-4f88-b3bb-57a9330ff446" (UID: "1f7330c4-f58c-4f88-b3bb-57a9330ff446"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 12:24:04 crc kubenswrapper[4816]: I0311 12:24:04.308044 4816 generic.go:334] "Generic (PLEG): container finished" podID="1f7330c4-f58c-4f88-b3bb-57a9330ff446" containerID="604d0a8cddc27e4a447ebf3a8dc36b0eed2b076b129b94d407bc1de3f75a642b" exitCode=0
Mar 11 12:24:04 crc kubenswrapper[4816]: I0311 12:24:04.308122 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzzv2" event={"ID":"1f7330c4-f58c-4f88-b3bb-57a9330ff446","Type":"ContainerDied","Data":"604d0a8cddc27e4a447ebf3a8dc36b0eed2b076b129b94d407bc1de3f75a642b"}
Mar 11 12:24:04 crc kubenswrapper[4816]: I0311 12:24:04.308202 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzzv2" event={"ID":"1f7330c4-f58c-4f88-b3bb-57a9330ff446","Type":"ContainerDied","Data":"43e44bc48337ed8b379562414b926c9f09f2c6ffa9dd0f3c3749a4ce1d503221"}
Mar 11 12:24:04 crc kubenswrapper[4816]: I0311 12:24:04.308230 4816 scope.go:117] "RemoveContainer" containerID="604d0a8cddc27e4a447ebf3a8dc36b0eed2b076b129b94d407bc1de3f75a642b"
Mar 11 12:24:04 crc kubenswrapper[4816]: I0311 12:24:04.308142 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pzzv2"
Mar 11 12:24:04 crc kubenswrapper[4816]: I0311 12:24:04.347854 4816 scope.go:117] "RemoveContainer" containerID="0718e1f9a971272036b149ae9c1f20dfd8dbb75e572667bcff0ca5b0fa6574a4"
Mar 11 12:24:04 crc kubenswrapper[4816]: I0311 12:24:04.348172 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f7330c4-f58c-4f88-b3bb-57a9330ff446-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 11 12:24:04 crc kubenswrapper[4816]: I0311 12:24:04.357882 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pzzv2"]
Mar 11 12:24:04 crc kubenswrapper[4816]: I0311 12:24:04.371230 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pzzv2"]
Mar 11 12:24:04 crc kubenswrapper[4816]: I0311 12:24:04.387076 4816 scope.go:117] "RemoveContainer" containerID="dc661cd0f3c21648eef4e6515cef8363a39276b7d6c63b205f7150e0f2074c47"
Mar 11 12:24:04 crc kubenswrapper[4816]: I0311 12:24:04.417940 4816 scope.go:117] "RemoveContainer" containerID="604d0a8cddc27e4a447ebf3a8dc36b0eed2b076b129b94d407bc1de3f75a642b"
Mar 11 12:24:04 crc kubenswrapper[4816]: E0311 12:24:04.418944 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"604d0a8cddc27e4a447ebf3a8dc36b0eed2b076b129b94d407bc1de3f75a642b\": container with ID starting with 604d0a8cddc27e4a447ebf3a8dc36b0eed2b076b129b94d407bc1de3f75a642b not found: ID does not exist" containerID="604d0a8cddc27e4a447ebf3a8dc36b0eed2b076b129b94d407bc1de3f75a642b"
Mar 11 12:24:04 crc kubenswrapper[4816]: I0311 12:24:04.418991 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"604d0a8cddc27e4a447ebf3a8dc36b0eed2b076b129b94d407bc1de3f75a642b"} err="failed to get container status \"604d0a8cddc27e4a447ebf3a8dc36b0eed2b076b129b94d407bc1de3f75a642b\": rpc error: code = NotFound desc = could not find container \"604d0a8cddc27e4a447ebf3a8dc36b0eed2b076b129b94d407bc1de3f75a642b\": container with ID starting with 604d0a8cddc27e4a447ebf3a8dc36b0eed2b076b129b94d407bc1de3f75a642b not found: ID does not exist"
Mar 11 12:24:04 crc kubenswrapper[4816]: I0311 12:24:04.419016 4816 scope.go:117] "RemoveContainer" containerID="0718e1f9a971272036b149ae9c1f20dfd8dbb75e572667bcff0ca5b0fa6574a4"
Mar 11 12:24:04 crc kubenswrapper[4816]: E0311 12:24:04.419982 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0718e1f9a971272036b149ae9c1f20dfd8dbb75e572667bcff0ca5b0fa6574a4\": container with ID starting with 0718e1f9a971272036b149ae9c1f20dfd8dbb75e572667bcff0ca5b0fa6574a4 not found: ID does not exist" containerID="0718e1f9a971272036b149ae9c1f20dfd8dbb75e572667bcff0ca5b0fa6574a4"
Mar 11 12:24:04 crc kubenswrapper[4816]: I0311 12:24:04.420079 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0718e1f9a971272036b149ae9c1f20dfd8dbb75e572667bcff0ca5b0fa6574a4"} err="failed to get container status \"0718e1f9a971272036b149ae9c1f20dfd8dbb75e572667bcff0ca5b0fa6574a4\": rpc error: code = NotFound desc = could not find container \"0718e1f9a971272036b149ae9c1f20dfd8dbb75e572667bcff0ca5b0fa6574a4\": container with ID starting with 0718e1f9a971272036b149ae9c1f20dfd8dbb75e572667bcff0ca5b0fa6574a4 not found: ID does not exist"
Mar 11 12:24:04 crc kubenswrapper[4816]: I0311 12:24:04.420131 4816 scope.go:117] "RemoveContainer" containerID="dc661cd0f3c21648eef4e6515cef8363a39276b7d6c63b205f7150e0f2074c47"
Mar 11 12:24:04 crc kubenswrapper[4816]: E0311 12:24:04.420641 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc661cd0f3c21648eef4e6515cef8363a39276b7d6c63b205f7150e0f2074c47\": container with ID starting with dc661cd0f3c21648eef4e6515cef8363a39276b7d6c63b205f7150e0f2074c47 not found: ID does not exist" containerID="dc661cd0f3c21648eef4e6515cef8363a39276b7d6c63b205f7150e0f2074c47"
Mar 11 12:24:04 crc kubenswrapper[4816]: I0311 12:24:04.420684 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc661cd0f3c21648eef4e6515cef8363a39276b7d6c63b205f7150e0f2074c47"} err="failed to get container status \"dc661cd0f3c21648eef4e6515cef8363a39276b7d6c63b205f7150e0f2074c47\": rpc error: code = NotFound desc = could not find container \"dc661cd0f3c21648eef4e6515cef8363a39276b7d6c63b205f7150e0f2074c47\": container with ID starting with dc661cd0f3c21648eef4e6515cef8363a39276b7d6c63b205f7150e0f2074c47 not found: ID does not exist"
Mar 11 12:24:04 crc kubenswrapper[4816]: E0311 12:24:04.540235 4816 kubelet_node_status.go:756] "Failed to set some node status fields" err="failed to validate nodeIP: route ip+net: no such network interface" node="crc"
Mar 11 12:24:04 crc kubenswrapper[4816]: I0311 12:24:04.691597 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553864-r52vk"
Mar 11 12:24:04 crc kubenswrapper[4816]: I0311 12:24:04.753728 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wldbm\" (UniqueName: \"kubernetes.io/projected/b1c2a96e-0361-49ae-b1d2-795744511b15-kube-api-access-wldbm\") pod \"b1c2a96e-0361-49ae-b1d2-795744511b15\" (UID: \"b1c2a96e-0361-49ae-b1d2-795744511b15\") "
Mar 11 12:24:04 crc kubenswrapper[4816]: I0311 12:24:04.759736 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1c2a96e-0361-49ae-b1d2-795744511b15-kube-api-access-wldbm" (OuterVolumeSpecName: "kube-api-access-wldbm") pod "b1c2a96e-0361-49ae-b1d2-795744511b15" (UID: "b1c2a96e-0361-49ae-b1d2-795744511b15"). InnerVolumeSpecName "kube-api-access-wldbm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:24:04 crc kubenswrapper[4816]: I0311 12:24:04.855692 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wldbm\" (UniqueName: \"kubernetes.io/projected/b1c2a96e-0361-49ae-b1d2-795744511b15-kube-api-access-wldbm\") on node \"crc\" DevicePath \"\""
Mar 11 12:24:05 crc kubenswrapper[4816]: I0311 12:24:05.321561 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553864-r52vk" event={"ID":"b1c2a96e-0361-49ae-b1d2-795744511b15","Type":"ContainerDied","Data":"de6b160ba6a61c023d6fa94cf65dda0513e45282ed69b2420032bf641b5e229b"}
Mar 11 12:24:05 crc kubenswrapper[4816]: I0311 12:24:05.321614 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de6b160ba6a61c023d6fa94cf65dda0513e45282ed69b2420032bf641b5e229b"
Mar 11 12:24:05 crc kubenswrapper[4816]: I0311 12:24:05.321620 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553864-r52vk"
Mar 11 12:24:05 crc kubenswrapper[4816]: I0311 12:24:05.379679 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553858-brk44"]
Mar 11 12:24:05 crc kubenswrapper[4816]: I0311 12:24:05.389441 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553858-brk44"]
Mar 11 12:24:06 crc kubenswrapper[4816]: I0311 12:24:06.142924 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f7330c4-f58c-4f88-b3bb-57a9330ff446" path="/var/lib/kubelet/pods/1f7330c4-f58c-4f88-b3bb-57a9330ff446/volumes"
Mar 11 12:24:06 crc kubenswrapper[4816]: I0311 12:24:06.144145 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99d8cc8e-8af3-41b3-bb8c-6e4e10f00193" path="/var/lib/kubelet/pods/99d8cc8e-8af3-41b3-bb8c-6e4e10f00193/volumes"
Mar 11 12:24:09 crc kubenswrapper[4816]: I0311 12:24:09.514951 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 11 12:24:09 crc kubenswrapper[4816]: I0311 12:24:09.515548 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 11 12:24:39 crc kubenswrapper[4816]: I0311 12:24:39.515226 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 11 12:24:39 crc kubenswrapper[4816]: I0311 12:24:39.515867 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 11 12:24:45 crc kubenswrapper[4816]: I0311 12:24:45.711882 4816 scope.go:117] "RemoveContainer" containerID="0d27f73615e32aa404576eea9593c729502e37fe26b5c92717c4bee0b43a98e6"
Mar 11 12:24:45 crc kubenswrapper[4816]: I0311 12:24:45.764904 4816 scope.go:117] "RemoveContainer" containerID="315146a94731475a01dc83fd91cffc1dc07e3b8364e3b5f9f4c74f1dffcbe0c4"
Mar 11 12:24:45 crc kubenswrapper[4816]: I0311 12:24:45.808012 4816 scope.go:117] "RemoveContainer" containerID="0800101b998bc39fbc15280d1397d1d43d3447b5f82870b95f5c0e69e60ff601"
Mar 11 12:24:45 crc kubenswrapper[4816]: I0311 12:24:45.850873 4816 scope.go:117] "RemoveContainer" containerID="ae9a5cdf2df1a6846c30df048ff752db89454e8f6330fe73c2c82145d550960b"
Mar 11 12:24:45 crc kubenswrapper[4816]: I0311 12:24:45.871283 4816 scope.go:117] "RemoveContainer" containerID="19aee09219637e7a5ab326ab09421619dc187f94f0843e938baaa3e47920a542"
Mar 11 12:24:45 crc kubenswrapper[4816]: I0311 12:24:45.897931 4816 scope.go:117] "RemoveContainer" containerID="e5f24ad51eefb627e15014c3582b64a13468c820d9ad9ccfa53acd2f0fb30054"
Mar 11 12:24:45 crc kubenswrapper[4816]: I0311 12:24:45.927812 4816 scope.go:117] "RemoveContainer" containerID="5d5e0febf80ed4282e61f8380eff77836a90544898e5ff129bf2d82dd15449ea"
Mar 11 12:24:45 crc kubenswrapper[4816]: I0311 12:24:45.948206 4816 scope.go:117] "RemoveContainer" containerID="5f7cdb31826f59ca1238a145635210cb534eb8b83a42327083142e83ef21c961"
Mar 11 12:24:45 crc kubenswrapper[4816]: I0311 12:24:45.984657 4816 scope.go:117] "RemoveContainer" containerID="369058cbb8fa5b6fcca641d0c6bacd8fb984decb4576950458c2fae4a2d14692"
Mar 11 12:24:46 crc kubenswrapper[4816]: I0311 12:24:46.005318 4816 scope.go:117] "RemoveContainer" containerID="234b74962788658b9515670058c8f55bb2409a552461ddec719b37310c8f7e0d"
Mar 11 12:24:46 crc kubenswrapper[4816]: I0311 12:24:46.025714 4816 scope.go:117] "RemoveContainer" containerID="619b2be9e8a3f61b134d163bc3ebb4105259f3d6eadad7ea8f76de2333bbeac4"
Mar 11 12:24:46 crc kubenswrapper[4816]: I0311 12:24:46.070559 4816 scope.go:117] "RemoveContainer" containerID="16af7949a711342a3610523f5b8fbb074d336f04c7a6eb010f9128a10368ad76"
Mar 11 12:24:55 crc kubenswrapper[4816]: I0311 12:24:55.163220 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-x58xn"]
Mar 11 12:24:55 crc kubenswrapper[4816]: E0311 12:24:55.164451 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f7330c4-f58c-4f88-b3bb-57a9330ff446" containerName="extract-utilities"
Mar 11 12:24:55 crc kubenswrapper[4816]: I0311 12:24:55.164468 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f7330c4-f58c-4f88-b3bb-57a9330ff446" containerName="extract-utilities"
Mar 11 12:24:55 crc kubenswrapper[4816]: E0311 12:24:55.164487 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f7330c4-f58c-4f88-b3bb-57a9330ff446" containerName="extract-content"
Mar 11 12:24:55 crc kubenswrapper[4816]: I0311 12:24:55.164495 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f7330c4-f58c-4f88-b3bb-57a9330ff446" containerName="extract-content"
Mar 11 12:24:55 crc kubenswrapper[4816]: E0311 12:24:55.164507 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1c2a96e-0361-49ae-b1d2-795744511b15" containerName="oc"
Mar 11 12:24:55 crc kubenswrapper[4816]: I0311 12:24:55.164513 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1c2a96e-0361-49ae-b1d2-795744511b15" containerName="oc"
Mar 11 12:24:55 crc kubenswrapper[4816]:
E0311 12:24:55.164526 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f7330c4-f58c-4f88-b3bb-57a9330ff446" containerName="registry-server" Mar 11 12:24:55 crc kubenswrapper[4816]: I0311 12:24:55.164532 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f7330c4-f58c-4f88-b3bb-57a9330ff446" containerName="registry-server" Mar 11 12:24:55 crc kubenswrapper[4816]: I0311 12:24:55.164672 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f7330c4-f58c-4f88-b3bb-57a9330ff446" containerName="registry-server" Mar 11 12:24:55 crc kubenswrapper[4816]: I0311 12:24:55.164694 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1c2a96e-0361-49ae-b1d2-795744511b15" containerName="oc" Mar 11 12:24:55 crc kubenswrapper[4816]: I0311 12:24:55.165807 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x58xn" Mar 11 12:24:55 crc kubenswrapper[4816]: I0311 12:24:55.184889 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x58xn"] Mar 11 12:24:55 crc kubenswrapper[4816]: I0311 12:24:55.268168 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03d94e6d-ee31-419d-9b0b-3c5ed80aab2d-utilities\") pod \"redhat-marketplace-x58xn\" (UID: \"03d94e6d-ee31-419d-9b0b-3c5ed80aab2d\") " pod="openshift-marketplace/redhat-marketplace-x58xn" Mar 11 12:24:55 crc kubenswrapper[4816]: I0311 12:24:55.268307 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r4xj\" (UniqueName: \"kubernetes.io/projected/03d94e6d-ee31-419d-9b0b-3c5ed80aab2d-kube-api-access-6r4xj\") pod \"redhat-marketplace-x58xn\" (UID: \"03d94e6d-ee31-419d-9b0b-3c5ed80aab2d\") " pod="openshift-marketplace/redhat-marketplace-x58xn" Mar 11 12:24:55 crc kubenswrapper[4816]: I0311 
12:24:55.268357 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03d94e6d-ee31-419d-9b0b-3c5ed80aab2d-catalog-content\") pod \"redhat-marketplace-x58xn\" (UID: \"03d94e6d-ee31-419d-9b0b-3c5ed80aab2d\") " pod="openshift-marketplace/redhat-marketplace-x58xn" Mar 11 12:24:55 crc kubenswrapper[4816]: I0311 12:24:55.369915 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03d94e6d-ee31-419d-9b0b-3c5ed80aab2d-utilities\") pod \"redhat-marketplace-x58xn\" (UID: \"03d94e6d-ee31-419d-9b0b-3c5ed80aab2d\") " pod="openshift-marketplace/redhat-marketplace-x58xn" Mar 11 12:24:55 crc kubenswrapper[4816]: I0311 12:24:55.370021 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6r4xj\" (UniqueName: \"kubernetes.io/projected/03d94e6d-ee31-419d-9b0b-3c5ed80aab2d-kube-api-access-6r4xj\") pod \"redhat-marketplace-x58xn\" (UID: \"03d94e6d-ee31-419d-9b0b-3c5ed80aab2d\") " pod="openshift-marketplace/redhat-marketplace-x58xn" Mar 11 12:24:55 crc kubenswrapper[4816]: I0311 12:24:55.370076 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03d94e6d-ee31-419d-9b0b-3c5ed80aab2d-catalog-content\") pod \"redhat-marketplace-x58xn\" (UID: \"03d94e6d-ee31-419d-9b0b-3c5ed80aab2d\") " pod="openshift-marketplace/redhat-marketplace-x58xn" Mar 11 12:24:55 crc kubenswrapper[4816]: I0311 12:24:55.370927 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03d94e6d-ee31-419d-9b0b-3c5ed80aab2d-utilities\") pod \"redhat-marketplace-x58xn\" (UID: \"03d94e6d-ee31-419d-9b0b-3c5ed80aab2d\") " pod="openshift-marketplace/redhat-marketplace-x58xn" Mar 11 12:24:55 crc kubenswrapper[4816]: I0311 12:24:55.371241 4816 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03d94e6d-ee31-419d-9b0b-3c5ed80aab2d-catalog-content\") pod \"redhat-marketplace-x58xn\" (UID: \"03d94e6d-ee31-419d-9b0b-3c5ed80aab2d\") " pod="openshift-marketplace/redhat-marketplace-x58xn" Mar 11 12:24:55 crc kubenswrapper[4816]: I0311 12:24:55.394132 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r4xj\" (UniqueName: \"kubernetes.io/projected/03d94e6d-ee31-419d-9b0b-3c5ed80aab2d-kube-api-access-6r4xj\") pod \"redhat-marketplace-x58xn\" (UID: \"03d94e6d-ee31-419d-9b0b-3c5ed80aab2d\") " pod="openshift-marketplace/redhat-marketplace-x58xn" Mar 11 12:24:55 crc kubenswrapper[4816]: I0311 12:24:55.483981 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x58xn" Mar 11 12:24:55 crc kubenswrapper[4816]: I0311 12:24:55.909916 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x58xn"] Mar 11 12:24:56 crc kubenswrapper[4816]: I0311 12:24:56.827642 4816 generic.go:334] "Generic (PLEG): container finished" podID="03d94e6d-ee31-419d-9b0b-3c5ed80aab2d" containerID="6d3f6f14d76a2d4fc858c260674ca7f6ab190c541311493a84e7844578098fae" exitCode=0 Mar 11 12:24:56 crc kubenswrapper[4816]: I0311 12:24:56.827865 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x58xn" event={"ID":"03d94e6d-ee31-419d-9b0b-3c5ed80aab2d","Type":"ContainerDied","Data":"6d3f6f14d76a2d4fc858c260674ca7f6ab190c541311493a84e7844578098fae"} Mar 11 12:24:56 crc kubenswrapper[4816]: I0311 12:24:56.827953 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x58xn" event={"ID":"03d94e6d-ee31-419d-9b0b-3c5ed80aab2d","Type":"ContainerStarted","Data":"fc838844e6bf9c2ab8bd6f7866edb82541f296d43517f2a59b699d6ac93eff82"} Mar 11 12:24:57 
crc kubenswrapper[4816]: I0311 12:24:57.841319 4816 generic.go:334] "Generic (PLEG): container finished" podID="03d94e6d-ee31-419d-9b0b-3c5ed80aab2d" containerID="c65fe7d5a20e4b5050c03ee81679e23a451044eb73f992a53724c87240abc126" exitCode=0 Mar 11 12:24:57 crc kubenswrapper[4816]: I0311 12:24:57.841432 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x58xn" event={"ID":"03d94e6d-ee31-419d-9b0b-3c5ed80aab2d","Type":"ContainerDied","Data":"c65fe7d5a20e4b5050c03ee81679e23a451044eb73f992a53724c87240abc126"} Mar 11 12:24:58 crc kubenswrapper[4816]: I0311 12:24:58.852558 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x58xn" event={"ID":"03d94e6d-ee31-419d-9b0b-3c5ed80aab2d","Type":"ContainerStarted","Data":"38223cfa50c223cb06d45baf800f8152f2e84f146e2add38b7af689440300a5a"} Mar 11 12:24:58 crc kubenswrapper[4816]: I0311 12:24:58.875141 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-x58xn" podStartSLOduration=2.247204054 podStartE2EDuration="3.875112653s" podCreationTimestamp="2026-03-11 12:24:55 +0000 UTC" firstStartedPulling="2026-03-11 12:24:56.829916448 +0000 UTC m=+1583.421180415" lastFinishedPulling="2026-03-11 12:24:58.457825047 +0000 UTC m=+1585.049089014" observedRunningTime="2026-03-11 12:24:58.873461376 +0000 UTC m=+1585.464725343" watchObservedRunningTime="2026-03-11 12:24:58.875112653 +0000 UTC m=+1585.466376620" Mar 11 12:25:05 crc kubenswrapper[4816]: I0311 12:25:05.484591 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-x58xn" Mar 11 12:25:05 crc kubenswrapper[4816]: I0311 12:25:05.485179 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-x58xn" Mar 11 12:25:05 crc kubenswrapper[4816]: I0311 12:25:05.525745 4816 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-x58xn" Mar 11 12:25:05 crc kubenswrapper[4816]: I0311 12:25:05.964363 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-x58xn" Mar 11 12:25:06 crc kubenswrapper[4816]: I0311 12:25:06.006139 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x58xn"] Mar 11 12:25:07 crc kubenswrapper[4816]: I0311 12:25:07.937766 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-x58xn" podUID="03d94e6d-ee31-419d-9b0b-3c5ed80aab2d" containerName="registry-server" containerID="cri-o://38223cfa50c223cb06d45baf800f8152f2e84f146e2add38b7af689440300a5a" gracePeriod=2 Mar 11 12:25:08 crc kubenswrapper[4816]: I0311 12:25:08.315337 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x58xn" Mar 11 12:25:08 crc kubenswrapper[4816]: I0311 12:25:08.378208 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03d94e6d-ee31-419d-9b0b-3c5ed80aab2d-utilities\") pod \"03d94e6d-ee31-419d-9b0b-3c5ed80aab2d\" (UID: \"03d94e6d-ee31-419d-9b0b-3c5ed80aab2d\") " Mar 11 12:25:08 crc kubenswrapper[4816]: I0311 12:25:08.378282 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6r4xj\" (UniqueName: \"kubernetes.io/projected/03d94e6d-ee31-419d-9b0b-3c5ed80aab2d-kube-api-access-6r4xj\") pod \"03d94e6d-ee31-419d-9b0b-3c5ed80aab2d\" (UID: \"03d94e6d-ee31-419d-9b0b-3c5ed80aab2d\") " Mar 11 12:25:08 crc kubenswrapper[4816]: I0311 12:25:08.378343 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03d94e6d-ee31-419d-9b0b-3c5ed80aab2d-catalog-content\") pod 
\"03d94e6d-ee31-419d-9b0b-3c5ed80aab2d\" (UID: \"03d94e6d-ee31-419d-9b0b-3c5ed80aab2d\") " Mar 11 12:25:08 crc kubenswrapper[4816]: I0311 12:25:08.379233 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03d94e6d-ee31-419d-9b0b-3c5ed80aab2d-utilities" (OuterVolumeSpecName: "utilities") pod "03d94e6d-ee31-419d-9b0b-3c5ed80aab2d" (UID: "03d94e6d-ee31-419d-9b0b-3c5ed80aab2d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:25:08 crc kubenswrapper[4816]: I0311 12:25:08.385569 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03d94e6d-ee31-419d-9b0b-3c5ed80aab2d-kube-api-access-6r4xj" (OuterVolumeSpecName: "kube-api-access-6r4xj") pod "03d94e6d-ee31-419d-9b0b-3c5ed80aab2d" (UID: "03d94e6d-ee31-419d-9b0b-3c5ed80aab2d"). InnerVolumeSpecName "kube-api-access-6r4xj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:25:08 crc kubenswrapper[4816]: I0311 12:25:08.407048 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03d94e6d-ee31-419d-9b0b-3c5ed80aab2d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "03d94e6d-ee31-419d-9b0b-3c5ed80aab2d" (UID: "03d94e6d-ee31-419d-9b0b-3c5ed80aab2d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:25:08 crc kubenswrapper[4816]: I0311 12:25:08.479817 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6r4xj\" (UniqueName: \"kubernetes.io/projected/03d94e6d-ee31-419d-9b0b-3c5ed80aab2d-kube-api-access-6r4xj\") on node \"crc\" DevicePath \"\"" Mar 11 12:25:08 crc kubenswrapper[4816]: I0311 12:25:08.479853 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03d94e6d-ee31-419d-9b0b-3c5ed80aab2d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 12:25:08 crc kubenswrapper[4816]: I0311 12:25:08.479863 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03d94e6d-ee31-419d-9b0b-3c5ed80aab2d-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 12:25:08 crc kubenswrapper[4816]: I0311 12:25:08.951846 4816 generic.go:334] "Generic (PLEG): container finished" podID="03d94e6d-ee31-419d-9b0b-3c5ed80aab2d" containerID="38223cfa50c223cb06d45baf800f8152f2e84f146e2add38b7af689440300a5a" exitCode=0 Mar 11 12:25:08 crc kubenswrapper[4816]: I0311 12:25:08.951945 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x58xn" Mar 11 12:25:08 crc kubenswrapper[4816]: I0311 12:25:08.951975 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x58xn" event={"ID":"03d94e6d-ee31-419d-9b0b-3c5ed80aab2d","Type":"ContainerDied","Data":"38223cfa50c223cb06d45baf800f8152f2e84f146e2add38b7af689440300a5a"} Mar 11 12:25:08 crc kubenswrapper[4816]: I0311 12:25:08.952633 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x58xn" event={"ID":"03d94e6d-ee31-419d-9b0b-3c5ed80aab2d","Type":"ContainerDied","Data":"fc838844e6bf9c2ab8bd6f7866edb82541f296d43517f2a59b699d6ac93eff82"} Mar 11 12:25:08 crc kubenswrapper[4816]: I0311 12:25:08.952678 4816 scope.go:117] "RemoveContainer" containerID="38223cfa50c223cb06d45baf800f8152f2e84f146e2add38b7af689440300a5a" Mar 11 12:25:08 crc kubenswrapper[4816]: I0311 12:25:08.976866 4816 scope.go:117] "RemoveContainer" containerID="c65fe7d5a20e4b5050c03ee81679e23a451044eb73f992a53724c87240abc126" Mar 11 12:25:08 crc kubenswrapper[4816]: I0311 12:25:08.986953 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x58xn"] Mar 11 12:25:08 crc kubenswrapper[4816]: I0311 12:25:08.993441 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-x58xn"] Mar 11 12:25:09 crc kubenswrapper[4816]: I0311 12:25:09.006262 4816 scope.go:117] "RemoveContainer" containerID="6d3f6f14d76a2d4fc858c260674ca7f6ab190c541311493a84e7844578098fae" Mar 11 12:25:09 crc kubenswrapper[4816]: I0311 12:25:09.032059 4816 scope.go:117] "RemoveContainer" containerID="38223cfa50c223cb06d45baf800f8152f2e84f146e2add38b7af689440300a5a" Mar 11 12:25:09 crc kubenswrapper[4816]: E0311 12:25:09.032720 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"38223cfa50c223cb06d45baf800f8152f2e84f146e2add38b7af689440300a5a\": container with ID starting with 38223cfa50c223cb06d45baf800f8152f2e84f146e2add38b7af689440300a5a not found: ID does not exist" containerID="38223cfa50c223cb06d45baf800f8152f2e84f146e2add38b7af689440300a5a" Mar 11 12:25:09 crc kubenswrapper[4816]: I0311 12:25:09.032780 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38223cfa50c223cb06d45baf800f8152f2e84f146e2add38b7af689440300a5a"} err="failed to get container status \"38223cfa50c223cb06d45baf800f8152f2e84f146e2add38b7af689440300a5a\": rpc error: code = NotFound desc = could not find container \"38223cfa50c223cb06d45baf800f8152f2e84f146e2add38b7af689440300a5a\": container with ID starting with 38223cfa50c223cb06d45baf800f8152f2e84f146e2add38b7af689440300a5a not found: ID does not exist" Mar 11 12:25:09 crc kubenswrapper[4816]: I0311 12:25:09.032831 4816 scope.go:117] "RemoveContainer" containerID="c65fe7d5a20e4b5050c03ee81679e23a451044eb73f992a53724c87240abc126" Mar 11 12:25:09 crc kubenswrapper[4816]: E0311 12:25:09.033237 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c65fe7d5a20e4b5050c03ee81679e23a451044eb73f992a53724c87240abc126\": container with ID starting with c65fe7d5a20e4b5050c03ee81679e23a451044eb73f992a53724c87240abc126 not found: ID does not exist" containerID="c65fe7d5a20e4b5050c03ee81679e23a451044eb73f992a53724c87240abc126" Mar 11 12:25:09 crc kubenswrapper[4816]: I0311 12:25:09.033327 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c65fe7d5a20e4b5050c03ee81679e23a451044eb73f992a53724c87240abc126"} err="failed to get container status \"c65fe7d5a20e4b5050c03ee81679e23a451044eb73f992a53724c87240abc126\": rpc error: code = NotFound desc = could not find container \"c65fe7d5a20e4b5050c03ee81679e23a451044eb73f992a53724c87240abc126\": container with ID 
starting with c65fe7d5a20e4b5050c03ee81679e23a451044eb73f992a53724c87240abc126 not found: ID does not exist" Mar 11 12:25:09 crc kubenswrapper[4816]: I0311 12:25:09.033370 4816 scope.go:117] "RemoveContainer" containerID="6d3f6f14d76a2d4fc858c260674ca7f6ab190c541311493a84e7844578098fae" Mar 11 12:25:09 crc kubenswrapper[4816]: E0311 12:25:09.033846 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d3f6f14d76a2d4fc858c260674ca7f6ab190c541311493a84e7844578098fae\": container with ID starting with 6d3f6f14d76a2d4fc858c260674ca7f6ab190c541311493a84e7844578098fae not found: ID does not exist" containerID="6d3f6f14d76a2d4fc858c260674ca7f6ab190c541311493a84e7844578098fae" Mar 11 12:25:09 crc kubenswrapper[4816]: I0311 12:25:09.033887 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d3f6f14d76a2d4fc858c260674ca7f6ab190c541311493a84e7844578098fae"} err="failed to get container status \"6d3f6f14d76a2d4fc858c260674ca7f6ab190c541311493a84e7844578098fae\": rpc error: code = NotFound desc = could not find container \"6d3f6f14d76a2d4fc858c260674ca7f6ab190c541311493a84e7844578098fae\": container with ID starting with 6d3f6f14d76a2d4fc858c260674ca7f6ab190c541311493a84e7844578098fae not found: ID does not exist" Mar 11 12:25:09 crc kubenswrapper[4816]: I0311 12:25:09.515316 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:25:09 crc kubenswrapper[4816]: I0311 12:25:09.515403 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:25:09 crc kubenswrapper[4816]: I0311 12:25:09.515475 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:25:09 crc kubenswrapper[4816]: I0311 12:25:09.516280 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3"} pod="openshift-machine-config-operator/machine-config-daemon-b4v82" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 12:25:09 crc kubenswrapper[4816]: I0311 12:25:09.516357 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" containerID="cri-o://0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" gracePeriod=600 Mar 11 12:25:09 crc kubenswrapper[4816]: E0311 12:25:09.639107 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:25:09 crc kubenswrapper[4816]: I0311 12:25:09.970439 4816 generic.go:334] "Generic (PLEG): container finished" podID="7fdff21c-644f-4443-a268-f98c91ea120a" containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" exitCode=0 Mar 11 12:25:09 crc kubenswrapper[4816]: I0311 12:25:09.970495 4816 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerDied","Data":"0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3"} Mar 11 12:25:09 crc kubenswrapper[4816]: I0311 12:25:09.970537 4816 scope.go:117] "RemoveContainer" containerID="20e5352a1f18de3da65279dced0572d988bf4c64c45f769d6d0ae47f9c2cef9a" Mar 11 12:25:09 crc kubenswrapper[4816]: I0311 12:25:09.971299 4816 scope.go:117] "RemoveContainer" containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" Mar 11 12:25:09 crc kubenswrapper[4816]: E0311 12:25:09.971932 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:25:10 crc kubenswrapper[4816]: I0311 12:25:10.142197 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03d94e6d-ee31-419d-9b0b-3c5ed80aab2d" path="/var/lib/kubelet/pods/03d94e6d-ee31-419d-9b0b-3c5ed80aab2d/volumes" Mar 11 12:25:24 crc kubenswrapper[4816]: I0311 12:25:24.135299 4816 scope.go:117] "RemoveContainer" containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" Mar 11 12:25:24 crc kubenswrapper[4816]: E0311 12:25:24.136232 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 
11 12:25:38 crc kubenswrapper[4816]: I0311 12:25:38.130648 4816 scope.go:117] "RemoveContainer" containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" Mar 11 12:25:38 crc kubenswrapper[4816]: E0311 12:25:38.131583 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:25:46 crc kubenswrapper[4816]: I0311 12:25:46.250585 4816 scope.go:117] "RemoveContainer" containerID="c04dc0a2663851eac8a9c1faccfd79cf6c27fbce470c4ad0b7499358caea8a06" Mar 11 12:25:46 crc kubenswrapper[4816]: I0311 12:25:46.278160 4816 scope.go:117] "RemoveContainer" containerID="6a15e8693d1f25cf8eeefb7b013bbcd57f9676d5cee6b31111e7f71f5ea2e5ca" Mar 11 12:25:46 crc kubenswrapper[4816]: I0311 12:25:46.309873 4816 scope.go:117] "RemoveContainer" containerID="0f5d94dc5d9bb04750c9b3d2e89fcd5a5d6e20ec2f4a19899cb047b2927291a4" Mar 11 12:25:46 crc kubenswrapper[4816]: I0311 12:25:46.332035 4816 scope.go:117] "RemoveContainer" containerID="691c4f9d45de04f6bb32f82d9d22154b130edce7e7b8b75479f100df834dbbad" Mar 11 12:25:46 crc kubenswrapper[4816]: I0311 12:25:46.364889 4816 scope.go:117] "RemoveContainer" containerID="8dc2306ac32e5d795143d562064f5d8e129c4815490ca1bada6d8509ddcc5240" Mar 11 12:25:46 crc kubenswrapper[4816]: I0311 12:25:46.387541 4816 scope.go:117] "RemoveContainer" containerID="fd551dbdb54bb8de807a245da392a1fc03bca9f397e581b66063faafeaf38a5f" Mar 11 12:25:46 crc kubenswrapper[4816]: I0311 12:25:46.406339 4816 scope.go:117] "RemoveContainer" containerID="c3956854978860cbc650270e665106bd8e95400d5b8cce00a86ed500eb262922" Mar 11 12:25:46 crc kubenswrapper[4816]: I0311 12:25:46.471146 4816 
scope.go:117] "RemoveContainer" containerID="5e19f1840cfd8f7623e64404579f814579ee6602ca765f964613a90342b26cc2" Mar 11 12:25:46 crc kubenswrapper[4816]: I0311 12:25:46.501502 4816 scope.go:117] "RemoveContainer" containerID="c704df83b8c052d797cc33017726ade79e749840ea39268bdd3404b42194d40d" Mar 11 12:25:46 crc kubenswrapper[4816]: I0311 12:25:46.540402 4816 scope.go:117] "RemoveContainer" containerID="1f2178fe24813df8bfcc542c32d18ec7c0d7ab550dc406623e692f0465cd6535" Mar 11 12:25:46 crc kubenswrapper[4816]: I0311 12:25:46.574141 4816 scope.go:117] "RemoveContainer" containerID="090174f400ae3d182bc1e17d475eb20c26198249c703a798e2b253812bea946b" Mar 11 12:25:46 crc kubenswrapper[4816]: I0311 12:25:46.597057 4816 scope.go:117] "RemoveContainer" containerID="2ba25af6bbe93bf77e8ed2bed1866df9a0d1cdcadbd32ffc70070db8155b1914" Mar 11 12:25:46 crc kubenswrapper[4816]: I0311 12:25:46.620076 4816 scope.go:117] "RemoveContainer" containerID="cbfbf586e19291c8ee373bf860029353b3f56429b7bf9015d736b9982aa4797f" Mar 11 12:25:53 crc kubenswrapper[4816]: I0311 12:25:53.130822 4816 scope.go:117] "RemoveContainer" containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" Mar 11 12:25:53 crc kubenswrapper[4816]: E0311 12:25:53.132522 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:26:00 crc kubenswrapper[4816]: I0311 12:26:00.165344 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553866-w7rtm"] Mar 11 12:26:00 crc kubenswrapper[4816]: E0311 12:26:00.166419 4816 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="03d94e6d-ee31-419d-9b0b-3c5ed80aab2d" containerName="extract-utilities" Mar 11 12:26:00 crc kubenswrapper[4816]: I0311 12:26:00.166437 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="03d94e6d-ee31-419d-9b0b-3c5ed80aab2d" containerName="extract-utilities" Mar 11 12:26:00 crc kubenswrapper[4816]: E0311 12:26:00.166460 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03d94e6d-ee31-419d-9b0b-3c5ed80aab2d" containerName="extract-content" Mar 11 12:26:00 crc kubenswrapper[4816]: I0311 12:26:00.166468 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="03d94e6d-ee31-419d-9b0b-3c5ed80aab2d" containerName="extract-content" Mar 11 12:26:00 crc kubenswrapper[4816]: E0311 12:26:00.166489 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03d94e6d-ee31-419d-9b0b-3c5ed80aab2d" containerName="registry-server" Mar 11 12:26:00 crc kubenswrapper[4816]: I0311 12:26:00.166497 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="03d94e6d-ee31-419d-9b0b-3c5ed80aab2d" containerName="registry-server" Mar 11 12:26:00 crc kubenswrapper[4816]: I0311 12:26:00.166675 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="03d94e6d-ee31-419d-9b0b-3c5ed80aab2d" containerName="registry-server" Mar 11 12:26:00 crc kubenswrapper[4816]: I0311 12:26:00.167367 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553866-w7rtm" Mar 11 12:26:00 crc kubenswrapper[4816]: I0311 12:26:00.170691 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 12:26:00 crc kubenswrapper[4816]: I0311 12:26:00.170784 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 12:26:00 crc kubenswrapper[4816]: I0311 12:26:00.170980 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 12:26:00 crc kubenswrapper[4816]: I0311 12:26:00.176448 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553866-w7rtm"] Mar 11 12:26:00 crc kubenswrapper[4816]: I0311 12:26:00.288840 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdkfg\" (UniqueName: \"kubernetes.io/projected/94280e97-4b7e-4a2d-8f1f-c3125f5910bc-kube-api-access-wdkfg\") pod \"auto-csr-approver-29553866-w7rtm\" (UID: \"94280e97-4b7e-4a2d-8f1f-c3125f5910bc\") " pod="openshift-infra/auto-csr-approver-29553866-w7rtm" Mar 11 12:26:00 crc kubenswrapper[4816]: I0311 12:26:00.390628 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdkfg\" (UniqueName: \"kubernetes.io/projected/94280e97-4b7e-4a2d-8f1f-c3125f5910bc-kube-api-access-wdkfg\") pod \"auto-csr-approver-29553866-w7rtm\" (UID: \"94280e97-4b7e-4a2d-8f1f-c3125f5910bc\") " pod="openshift-infra/auto-csr-approver-29553866-w7rtm" Mar 11 12:26:00 crc kubenswrapper[4816]: I0311 12:26:00.411138 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdkfg\" (UniqueName: \"kubernetes.io/projected/94280e97-4b7e-4a2d-8f1f-c3125f5910bc-kube-api-access-wdkfg\") pod \"auto-csr-approver-29553866-w7rtm\" (UID: \"94280e97-4b7e-4a2d-8f1f-c3125f5910bc\") " 
pod="openshift-infra/auto-csr-approver-29553866-w7rtm" Mar 11 12:26:00 crc kubenswrapper[4816]: I0311 12:26:00.504223 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553866-w7rtm" Mar 11 12:26:00 crc kubenswrapper[4816]: I0311 12:26:00.953158 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553866-w7rtm"] Mar 11 12:26:01 crc kubenswrapper[4816]: I0311 12:26:01.446358 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553866-w7rtm" event={"ID":"94280e97-4b7e-4a2d-8f1f-c3125f5910bc","Type":"ContainerStarted","Data":"5f08cc228d5680e5632a5bb4aa691b63f252341df6512e0b7902502d897b8a17"} Mar 11 12:26:02 crc kubenswrapper[4816]: I0311 12:26:02.457780 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553866-w7rtm" event={"ID":"94280e97-4b7e-4a2d-8f1f-c3125f5910bc","Type":"ContainerStarted","Data":"b6abd0eb57a8c7686ba5edd206e023c171a5ca0fee490573bab4f684459e8043"} Mar 11 12:26:02 crc kubenswrapper[4816]: I0311 12:26:02.483625 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553866-w7rtm" podStartSLOduration=1.512860462 podStartE2EDuration="2.483600364s" podCreationTimestamp="2026-03-11 12:26:00 +0000 UTC" firstStartedPulling="2026-03-11 12:26:00.963978039 +0000 UTC m=+1647.555242016" lastFinishedPulling="2026-03-11 12:26:01.934717961 +0000 UTC m=+1648.525981918" observedRunningTime="2026-03-11 12:26:02.472572048 +0000 UTC m=+1649.063836025" watchObservedRunningTime="2026-03-11 12:26:02.483600364 +0000 UTC m=+1649.074864321" Mar 11 12:26:02 crc kubenswrapper[4816]: E0311 12:26:02.520492 4816 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94280e97_4b7e_4a2d_8f1f_c3125f5910bc.slice/crio-b6abd0eb57a8c7686ba5edd206e023c171a5ca0fee490573bab4f684459e8043.scope\": RecentStats: unable to find data in memory cache]" Mar 11 12:26:03 crc kubenswrapper[4816]: I0311 12:26:03.468917 4816 generic.go:334] "Generic (PLEG): container finished" podID="94280e97-4b7e-4a2d-8f1f-c3125f5910bc" containerID="b6abd0eb57a8c7686ba5edd206e023c171a5ca0fee490573bab4f684459e8043" exitCode=0 Mar 11 12:26:03 crc kubenswrapper[4816]: I0311 12:26:03.468982 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553866-w7rtm" event={"ID":"94280e97-4b7e-4a2d-8f1f-c3125f5910bc","Type":"ContainerDied","Data":"b6abd0eb57a8c7686ba5edd206e023c171a5ca0fee490573bab4f684459e8043"} Mar 11 12:26:04 crc kubenswrapper[4816]: I0311 12:26:04.773023 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553866-w7rtm" Mar 11 12:26:04 crc kubenswrapper[4816]: I0311 12:26:04.967892 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdkfg\" (UniqueName: \"kubernetes.io/projected/94280e97-4b7e-4a2d-8f1f-c3125f5910bc-kube-api-access-wdkfg\") pod \"94280e97-4b7e-4a2d-8f1f-c3125f5910bc\" (UID: \"94280e97-4b7e-4a2d-8f1f-c3125f5910bc\") " Mar 11 12:26:04 crc kubenswrapper[4816]: I0311 12:26:04.979581 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94280e97-4b7e-4a2d-8f1f-c3125f5910bc-kube-api-access-wdkfg" (OuterVolumeSpecName: "kube-api-access-wdkfg") pod "94280e97-4b7e-4a2d-8f1f-c3125f5910bc" (UID: "94280e97-4b7e-4a2d-8f1f-c3125f5910bc"). InnerVolumeSpecName "kube-api-access-wdkfg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:26:05 crc kubenswrapper[4816]: I0311 12:26:05.070332 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdkfg\" (UniqueName: \"kubernetes.io/projected/94280e97-4b7e-4a2d-8f1f-c3125f5910bc-kube-api-access-wdkfg\") on node \"crc\" DevicePath \"\"" Mar 11 12:26:05 crc kubenswrapper[4816]: I0311 12:26:05.486410 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553866-w7rtm" event={"ID":"94280e97-4b7e-4a2d-8f1f-c3125f5910bc","Type":"ContainerDied","Data":"5f08cc228d5680e5632a5bb4aa691b63f252341df6512e0b7902502d897b8a17"} Mar 11 12:26:05 crc kubenswrapper[4816]: I0311 12:26:05.486470 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f08cc228d5680e5632a5bb4aa691b63f252341df6512e0b7902502d897b8a17" Mar 11 12:26:05 crc kubenswrapper[4816]: I0311 12:26:05.486475 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553866-w7rtm" Mar 11 12:26:05 crc kubenswrapper[4816]: I0311 12:26:05.540763 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553860-9kp4n"] Mar 11 12:26:05 crc kubenswrapper[4816]: I0311 12:26:05.545987 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553860-9kp4n"] Mar 11 12:26:06 crc kubenswrapper[4816]: I0311 12:26:06.139867 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cde56f2-9503-4b1c-84bf-8b49f6a9f5e9" path="/var/lib/kubelet/pods/4cde56f2-9503-4b1c-84bf-8b49f6a9f5e9/volumes" Mar 11 12:26:08 crc kubenswrapper[4816]: I0311 12:26:08.132782 4816 scope.go:117] "RemoveContainer" containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" Mar 11 12:26:08 crc kubenswrapper[4816]: E0311 12:26:08.133329 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:26:20 crc kubenswrapper[4816]: I0311 12:26:20.131819 4816 scope.go:117] "RemoveContainer" containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" Mar 11 12:26:20 crc kubenswrapper[4816]: E0311 12:26:20.133166 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:26:34 crc kubenswrapper[4816]: I0311 12:26:34.137318 4816 scope.go:117] "RemoveContainer" containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" Mar 11 12:26:34 crc kubenswrapper[4816]: E0311 12:26:34.138334 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:26:46 crc kubenswrapper[4816]: I0311 12:26:46.802061 4816 scope.go:117] "RemoveContainer" containerID="f424c56cb69a088a064cc5d2e599b7db758b5e10ecf876c1586a8816a4d96acd" Mar 11 12:26:46 crc kubenswrapper[4816]: I0311 12:26:46.868096 4816 scope.go:117] "RemoveContainer" 
containerID="adc85e912176222f128333dea158980c88ef84553f1cf56cb52f64a7b64c83d6" Mar 11 12:26:46 crc kubenswrapper[4816]: I0311 12:26:46.890131 4816 scope.go:117] "RemoveContainer" containerID="73c8dd7cd36356a8399521ca85923fa5bea70d1a67253cbf2ad9c716aae771dd" Mar 11 12:26:46 crc kubenswrapper[4816]: I0311 12:26:46.911495 4816 scope.go:117] "RemoveContainer" containerID="7499f2a9acd657b210b2f77e2cefe97fa749ba96e868296d76960eaa9ed38ee8" Mar 11 12:26:46 crc kubenswrapper[4816]: I0311 12:26:46.964422 4816 scope.go:117] "RemoveContainer" containerID="7074db26ba14c2f5793b32a499e15ff64a76fc4764f04041e3b7367e813d1eb6" Mar 11 12:26:47 crc kubenswrapper[4816]: I0311 12:26:47.013509 4816 scope.go:117] "RemoveContainer" containerID="393812c2bcecafdfa2f0e8dd848197ee6392877cb2e73b5a2b5b12d642a2ed5c" Mar 11 12:26:47 crc kubenswrapper[4816]: I0311 12:26:47.052815 4816 scope.go:117] "RemoveContainer" containerID="c5bd6af011971fbc8f1597775b66953c97da9697d9d038a8ec30a1201d7a28f6" Mar 11 12:26:49 crc kubenswrapper[4816]: I0311 12:26:49.131830 4816 scope.go:117] "RemoveContainer" containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" Mar 11 12:26:49 crc kubenswrapper[4816]: E0311 12:26:49.132137 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:27:00 crc kubenswrapper[4816]: I0311 12:27:00.130232 4816 scope.go:117] "RemoveContainer" containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" Mar 11 12:27:00 crc kubenswrapper[4816]: E0311 12:27:00.131112 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:27:13 crc kubenswrapper[4816]: I0311 12:27:13.130365 4816 scope.go:117] "RemoveContainer" containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" Mar 11 12:27:13 crc kubenswrapper[4816]: E0311 12:27:13.131284 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:27:24 crc kubenswrapper[4816]: I0311 12:27:24.138842 4816 scope.go:117] "RemoveContainer" containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" Mar 11 12:27:24 crc kubenswrapper[4816]: E0311 12:27:24.141606 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:27:36 crc kubenswrapper[4816]: I0311 12:27:36.130988 4816 scope.go:117] "RemoveContainer" containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" Mar 11 12:27:36 crc kubenswrapper[4816]: E0311 12:27:36.131745 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:27:47 crc kubenswrapper[4816]: I0311 12:27:47.146354 4816 scope.go:117] "RemoveContainer" containerID="f7560f8d6f98f14204afbbce69a7ff86d5f07a2d1a84e68d20701b7c7e5ce84d" Mar 11 12:27:47 crc kubenswrapper[4816]: I0311 12:27:47.171709 4816 scope.go:117] "RemoveContainer" containerID="9fc317ca9311d71a32a61a06236255eddc3a32782036027513c2583e902eb2de" Mar 11 12:27:47 crc kubenswrapper[4816]: I0311 12:27:47.190976 4816 scope.go:117] "RemoveContainer" containerID="adf484d20700d25957189d351eb669acaae4683a20326267761afe30c6a7e50c" Mar 11 12:27:49 crc kubenswrapper[4816]: I0311 12:27:49.131172 4816 scope.go:117] "RemoveContainer" containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" Mar 11 12:27:49 crc kubenswrapper[4816]: E0311 12:27:49.131813 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:28:00 crc kubenswrapper[4816]: I0311 12:28:00.154665 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553868-qmhvt"] Mar 11 12:28:00 crc kubenswrapper[4816]: E0311 12:28:00.162775 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94280e97-4b7e-4a2d-8f1f-c3125f5910bc" containerName="oc" Mar 11 12:28:00 crc kubenswrapper[4816]: I0311 12:28:00.162805 4816 
state_mem.go:107] "Deleted CPUSet assignment" podUID="94280e97-4b7e-4a2d-8f1f-c3125f5910bc" containerName="oc" Mar 11 12:28:00 crc kubenswrapper[4816]: I0311 12:28:00.162959 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="94280e97-4b7e-4a2d-8f1f-c3125f5910bc" containerName="oc" Mar 11 12:28:00 crc kubenswrapper[4816]: I0311 12:28:00.163529 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553868-qmhvt" Mar 11 12:28:00 crc kubenswrapper[4816]: I0311 12:28:00.164277 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553868-qmhvt"] Mar 11 12:28:00 crc kubenswrapper[4816]: I0311 12:28:00.165833 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 12:28:00 crc kubenswrapper[4816]: I0311 12:28:00.166043 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 12:28:00 crc kubenswrapper[4816]: I0311 12:28:00.166043 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 12:28:00 crc kubenswrapper[4816]: I0311 12:28:00.275852 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqwrp\" (UniqueName: \"kubernetes.io/projected/7828a3bd-8864-4d7a-a6fa-ff3ac4e607bf-kube-api-access-rqwrp\") pod \"auto-csr-approver-29553868-qmhvt\" (UID: \"7828a3bd-8864-4d7a-a6fa-ff3ac4e607bf\") " pod="openshift-infra/auto-csr-approver-29553868-qmhvt" Mar 11 12:28:00 crc kubenswrapper[4816]: I0311 12:28:00.377574 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqwrp\" (UniqueName: \"kubernetes.io/projected/7828a3bd-8864-4d7a-a6fa-ff3ac4e607bf-kube-api-access-rqwrp\") pod \"auto-csr-approver-29553868-qmhvt\" (UID: \"7828a3bd-8864-4d7a-a6fa-ff3ac4e607bf\") " 
pod="openshift-infra/auto-csr-approver-29553868-qmhvt" Mar 11 12:28:00 crc kubenswrapper[4816]: I0311 12:28:00.403480 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqwrp\" (UniqueName: \"kubernetes.io/projected/7828a3bd-8864-4d7a-a6fa-ff3ac4e607bf-kube-api-access-rqwrp\") pod \"auto-csr-approver-29553868-qmhvt\" (UID: \"7828a3bd-8864-4d7a-a6fa-ff3ac4e607bf\") " pod="openshift-infra/auto-csr-approver-29553868-qmhvt" Mar 11 12:28:00 crc kubenswrapper[4816]: I0311 12:28:00.489850 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553868-qmhvt" Mar 11 12:28:00 crc kubenswrapper[4816]: I0311 12:28:00.914637 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553868-qmhvt"] Mar 11 12:28:00 crc kubenswrapper[4816]: I0311 12:28:00.924226 4816 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 12:28:01 crc kubenswrapper[4816]: I0311 12:28:01.131126 4816 scope.go:117] "RemoveContainer" containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" Mar 11 12:28:01 crc kubenswrapper[4816]: E0311 12:28:01.131482 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:28:01 crc kubenswrapper[4816]: I0311 12:28:01.449311 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553868-qmhvt" event={"ID":"7828a3bd-8864-4d7a-a6fa-ff3ac4e607bf","Type":"ContainerStarted","Data":"10969e02780e70dd2f19efe9e840c72d7513c2438e871aab3b04515a76a41e86"} Mar 11 
12:28:03 crc kubenswrapper[4816]: I0311 12:28:03.469190 4816 generic.go:334] "Generic (PLEG): container finished" podID="7828a3bd-8864-4d7a-a6fa-ff3ac4e607bf" containerID="6f896b214f33da369f143727ecfdb3b64f134749ec6e50337a2f62ef03d15c62" exitCode=0 Mar 11 12:28:03 crc kubenswrapper[4816]: I0311 12:28:03.469280 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553868-qmhvt" event={"ID":"7828a3bd-8864-4d7a-a6fa-ff3ac4e607bf","Type":"ContainerDied","Data":"6f896b214f33da369f143727ecfdb3b64f134749ec6e50337a2f62ef03d15c62"} Mar 11 12:28:04 crc kubenswrapper[4816]: I0311 12:28:04.839215 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553868-qmhvt" Mar 11 12:28:04 crc kubenswrapper[4816]: I0311 12:28:04.950784 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqwrp\" (UniqueName: \"kubernetes.io/projected/7828a3bd-8864-4d7a-a6fa-ff3ac4e607bf-kube-api-access-rqwrp\") pod \"7828a3bd-8864-4d7a-a6fa-ff3ac4e607bf\" (UID: \"7828a3bd-8864-4d7a-a6fa-ff3ac4e607bf\") " Mar 11 12:28:04 crc kubenswrapper[4816]: I0311 12:28:04.964478 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7828a3bd-8864-4d7a-a6fa-ff3ac4e607bf-kube-api-access-rqwrp" (OuterVolumeSpecName: "kube-api-access-rqwrp") pod "7828a3bd-8864-4d7a-a6fa-ff3ac4e607bf" (UID: "7828a3bd-8864-4d7a-a6fa-ff3ac4e607bf"). InnerVolumeSpecName "kube-api-access-rqwrp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:28:05 crc kubenswrapper[4816]: I0311 12:28:05.052802 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqwrp\" (UniqueName: \"kubernetes.io/projected/7828a3bd-8864-4d7a-a6fa-ff3ac4e607bf-kube-api-access-rqwrp\") on node \"crc\" DevicePath \"\"" Mar 11 12:28:05 crc kubenswrapper[4816]: I0311 12:28:05.489496 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553868-qmhvt" event={"ID":"7828a3bd-8864-4d7a-a6fa-ff3ac4e607bf","Type":"ContainerDied","Data":"10969e02780e70dd2f19efe9e840c72d7513c2438e871aab3b04515a76a41e86"} Mar 11 12:28:05 crc kubenswrapper[4816]: I0311 12:28:05.489575 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10969e02780e70dd2f19efe9e840c72d7513c2438e871aab3b04515a76a41e86" Mar 11 12:28:05 crc kubenswrapper[4816]: I0311 12:28:05.489674 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553868-qmhvt" Mar 11 12:28:05 crc kubenswrapper[4816]: I0311 12:28:05.916335 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553862-ldg69"] Mar 11 12:28:05 crc kubenswrapper[4816]: I0311 12:28:05.921509 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553862-ldg69"] Mar 11 12:28:06 crc kubenswrapper[4816]: I0311 12:28:06.149999 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="787da494-4b4f-4a96-9e39-45179c456dc0" path="/var/lib/kubelet/pods/787da494-4b4f-4a96-9e39-45179c456dc0/volumes" Mar 11 12:28:12 crc kubenswrapper[4816]: I0311 12:28:12.131147 4816 scope.go:117] "RemoveContainer" containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" Mar 11 12:28:12 crc kubenswrapper[4816]: E0311 12:28:12.132159 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:28:26 crc kubenswrapper[4816]: I0311 12:28:26.130370 4816 scope.go:117] "RemoveContainer" containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" Mar 11 12:28:26 crc kubenswrapper[4816]: E0311 12:28:26.131639 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:28:38 crc kubenswrapper[4816]: I0311 12:28:38.131509 4816 scope.go:117] "RemoveContainer" containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" Mar 11 12:28:38 crc kubenswrapper[4816]: E0311 12:28:38.132537 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:28:47 crc kubenswrapper[4816]: I0311 12:28:47.272902 4816 scope.go:117] "RemoveContainer" containerID="d0874559d26089e67dcd3126789f0cf0dc3ed1323323af96fe7e8ee67fbd532f" Mar 11 12:28:49 crc kubenswrapper[4816]: I0311 12:28:49.133234 4816 scope.go:117] "RemoveContainer" 
containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" Mar 11 12:28:49 crc kubenswrapper[4816]: E0311 12:28:49.134784 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:29:01 crc kubenswrapper[4816]: I0311 12:29:01.129939 4816 scope.go:117] "RemoveContainer" containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" Mar 11 12:29:01 crc kubenswrapper[4816]: E0311 12:29:01.130785 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:29:13 crc kubenswrapper[4816]: I0311 12:29:13.130145 4816 scope.go:117] "RemoveContainer" containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" Mar 11 12:29:13 crc kubenswrapper[4816]: E0311 12:29:13.131985 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:29:25 crc kubenswrapper[4816]: I0311 12:29:25.130678 4816 scope.go:117] 
"RemoveContainer" containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" Mar 11 12:29:25 crc kubenswrapper[4816]: E0311 12:29:25.132162 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:29:39 crc kubenswrapper[4816]: I0311 12:29:39.131886 4816 scope.go:117] "RemoveContainer" containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" Mar 11 12:29:39 crc kubenswrapper[4816]: E0311 12:29:39.132728 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:29:50 crc kubenswrapper[4816]: I0311 12:29:50.131341 4816 scope.go:117] "RemoveContainer" containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" Mar 11 12:29:50 crc kubenswrapper[4816]: E0311 12:29:50.132291 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.163716 
4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553870-5ndrc"] Mar 11 12:30:00 crc kubenswrapper[4816]: E0311 12:30:00.164832 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7828a3bd-8864-4d7a-a6fa-ff3ac4e607bf" containerName="oc" Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.164855 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="7828a3bd-8864-4d7a-a6fa-ff3ac4e607bf" containerName="oc" Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.165097 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="7828a3bd-8864-4d7a-a6fa-ff3ac4e607bf" containerName="oc" Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.165959 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553870-5ndrc" Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.169336 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553870-5ndrc"] Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.170122 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.170417 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.170707 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.179156 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553870-n6l8j"] Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.180411 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553870-n6l8j" Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.190046 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.190063 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.194574 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553870-n6l8j"] Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.330715 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af7ed2ea-fc1c-4a1a-bf16-50a9817aac81-config-volume\") pod \"collect-profiles-29553870-n6l8j\" (UID: \"af7ed2ea-fc1c-4a1a-bf16-50a9817aac81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553870-n6l8j" Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.330767 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxwjv\" (UniqueName: \"kubernetes.io/projected/2920f68a-c5bb-474c-929b-09ced109bcc0-kube-api-access-xxwjv\") pod \"auto-csr-approver-29553870-5ndrc\" (UID: \"2920f68a-c5bb-474c-929b-09ced109bcc0\") " pod="openshift-infra/auto-csr-approver-29553870-5ndrc" Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.331053 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/af7ed2ea-fc1c-4a1a-bf16-50a9817aac81-secret-volume\") pod \"collect-profiles-29553870-n6l8j\" (UID: \"af7ed2ea-fc1c-4a1a-bf16-50a9817aac81\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29553870-n6l8j" Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.331324 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4zb5\" (UniqueName: \"kubernetes.io/projected/af7ed2ea-fc1c-4a1a-bf16-50a9817aac81-kube-api-access-j4zb5\") pod \"collect-profiles-29553870-n6l8j\" (UID: \"af7ed2ea-fc1c-4a1a-bf16-50a9817aac81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553870-n6l8j" Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.432515 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4zb5\" (UniqueName: \"kubernetes.io/projected/af7ed2ea-fc1c-4a1a-bf16-50a9817aac81-kube-api-access-j4zb5\") pod \"collect-profiles-29553870-n6l8j\" (UID: \"af7ed2ea-fc1c-4a1a-bf16-50a9817aac81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553870-n6l8j" Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.432935 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af7ed2ea-fc1c-4a1a-bf16-50a9817aac81-config-volume\") pod \"collect-profiles-29553870-n6l8j\" (UID: \"af7ed2ea-fc1c-4a1a-bf16-50a9817aac81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553870-n6l8j" Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.432965 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxwjv\" (UniqueName: \"kubernetes.io/projected/2920f68a-c5bb-474c-929b-09ced109bcc0-kube-api-access-xxwjv\") pod \"auto-csr-approver-29553870-5ndrc\" (UID: \"2920f68a-c5bb-474c-929b-09ced109bcc0\") " pod="openshift-infra/auto-csr-approver-29553870-5ndrc" Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.433094 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/af7ed2ea-fc1c-4a1a-bf16-50a9817aac81-secret-volume\") pod \"collect-profiles-29553870-n6l8j\" (UID: \"af7ed2ea-fc1c-4a1a-bf16-50a9817aac81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553870-n6l8j" Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.434083 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af7ed2ea-fc1c-4a1a-bf16-50a9817aac81-config-volume\") pod \"collect-profiles-29553870-n6l8j\" (UID: \"af7ed2ea-fc1c-4a1a-bf16-50a9817aac81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553870-n6l8j" Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.440131 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/af7ed2ea-fc1c-4a1a-bf16-50a9817aac81-secret-volume\") pod \"collect-profiles-29553870-n6l8j\" (UID: \"af7ed2ea-fc1c-4a1a-bf16-50a9817aac81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553870-n6l8j" Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.449360 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4zb5\" (UniqueName: \"kubernetes.io/projected/af7ed2ea-fc1c-4a1a-bf16-50a9817aac81-kube-api-access-j4zb5\") pod \"collect-profiles-29553870-n6l8j\" (UID: \"af7ed2ea-fc1c-4a1a-bf16-50a9817aac81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553870-n6l8j" Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.451115 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxwjv\" (UniqueName: \"kubernetes.io/projected/2920f68a-c5bb-474c-929b-09ced109bcc0-kube-api-access-xxwjv\") pod \"auto-csr-approver-29553870-5ndrc\" (UID: \"2920f68a-c5bb-474c-929b-09ced109bcc0\") " pod="openshift-infra/auto-csr-approver-29553870-5ndrc" Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.499029 4816 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553870-5ndrc" Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.515581 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553870-n6l8j" Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.921769 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553870-n6l8j"] Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.967799 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553870-5ndrc"] Mar 11 12:30:00 crc kubenswrapper[4816]: W0311 12:30:00.968169 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2920f68a_c5bb_474c_929b_09ced109bcc0.slice/crio-507b3aeab6926360f8ab859be408d524db892bb97d8bacd4ffcbc7166864a7b2 WatchSource:0}: Error finding container 507b3aeab6926360f8ab859be408d524db892bb97d8bacd4ffcbc7166864a7b2: Status 404 returned error can't find the container with id 507b3aeab6926360f8ab859be408d524db892bb97d8bacd4ffcbc7166864a7b2 Mar 11 12:30:01 crc kubenswrapper[4816]: I0311 12:30:01.533666 4816 generic.go:334] "Generic (PLEG): container finished" podID="af7ed2ea-fc1c-4a1a-bf16-50a9817aac81" containerID="df1d35d17e400d5b7e626c6af7307f8e5a96cbf6b1e197b1b3bcbb3209f59864" exitCode=0 Mar 11 12:30:01 crc kubenswrapper[4816]: I0311 12:30:01.533765 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553870-n6l8j" event={"ID":"af7ed2ea-fc1c-4a1a-bf16-50a9817aac81","Type":"ContainerDied","Data":"df1d35d17e400d5b7e626c6af7307f8e5a96cbf6b1e197b1b3bcbb3209f59864"} Mar 11 12:30:01 crc kubenswrapper[4816]: I0311 12:30:01.533803 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29553870-n6l8j" event={"ID":"af7ed2ea-fc1c-4a1a-bf16-50a9817aac81","Type":"ContainerStarted","Data":"0ed1cd926ec9b86826d9073d68146148020654ae60ad790cb1be73478f1918d9"} Mar 11 12:30:01 crc kubenswrapper[4816]: I0311 12:30:01.535559 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553870-5ndrc" event={"ID":"2920f68a-c5bb-474c-929b-09ced109bcc0","Type":"ContainerStarted","Data":"507b3aeab6926360f8ab859be408d524db892bb97d8bacd4ffcbc7166864a7b2"} Mar 11 12:30:02 crc kubenswrapper[4816]: I0311 12:30:02.819378 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553870-n6l8j" Mar 11 12:30:02 crc kubenswrapper[4816]: I0311 12:30:02.873131 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af7ed2ea-fc1c-4a1a-bf16-50a9817aac81-config-volume\") pod \"af7ed2ea-fc1c-4a1a-bf16-50a9817aac81\" (UID: \"af7ed2ea-fc1c-4a1a-bf16-50a9817aac81\") " Mar 11 12:30:02 crc kubenswrapper[4816]: I0311 12:30:02.873225 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/af7ed2ea-fc1c-4a1a-bf16-50a9817aac81-secret-volume\") pod \"af7ed2ea-fc1c-4a1a-bf16-50a9817aac81\" (UID: \"af7ed2ea-fc1c-4a1a-bf16-50a9817aac81\") " Mar 11 12:30:02 crc kubenswrapper[4816]: I0311 12:30:02.873341 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4zb5\" (UniqueName: \"kubernetes.io/projected/af7ed2ea-fc1c-4a1a-bf16-50a9817aac81-kube-api-access-j4zb5\") pod \"af7ed2ea-fc1c-4a1a-bf16-50a9817aac81\" (UID: \"af7ed2ea-fc1c-4a1a-bf16-50a9817aac81\") " Mar 11 12:30:02 crc kubenswrapper[4816]: I0311 12:30:02.874115 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/af7ed2ea-fc1c-4a1a-bf16-50a9817aac81-config-volume" (OuterVolumeSpecName: "config-volume") pod "af7ed2ea-fc1c-4a1a-bf16-50a9817aac81" (UID: "af7ed2ea-fc1c-4a1a-bf16-50a9817aac81"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:30:02 crc kubenswrapper[4816]: I0311 12:30:02.881054 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af7ed2ea-fc1c-4a1a-bf16-50a9817aac81-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "af7ed2ea-fc1c-4a1a-bf16-50a9817aac81" (UID: "af7ed2ea-fc1c-4a1a-bf16-50a9817aac81"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:30:02 crc kubenswrapper[4816]: I0311 12:30:02.881087 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af7ed2ea-fc1c-4a1a-bf16-50a9817aac81-kube-api-access-j4zb5" (OuterVolumeSpecName: "kube-api-access-j4zb5") pod "af7ed2ea-fc1c-4a1a-bf16-50a9817aac81" (UID: "af7ed2ea-fc1c-4a1a-bf16-50a9817aac81"). InnerVolumeSpecName "kube-api-access-j4zb5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:30:02 crc kubenswrapper[4816]: I0311 12:30:02.974632 4816 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/af7ed2ea-fc1c-4a1a-bf16-50a9817aac81-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 11 12:30:02 crc kubenswrapper[4816]: I0311 12:30:02.974896 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4zb5\" (UniqueName: \"kubernetes.io/projected/af7ed2ea-fc1c-4a1a-bf16-50a9817aac81-kube-api-access-j4zb5\") on node \"crc\" DevicePath \"\"" Mar 11 12:30:02 crc kubenswrapper[4816]: I0311 12:30:02.974957 4816 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af7ed2ea-fc1c-4a1a-bf16-50a9817aac81-config-volume\") on node \"crc\" DevicePath \"\"" Mar 11 12:30:03 crc kubenswrapper[4816]: I0311 12:30:03.551853 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553870-n6l8j" event={"ID":"af7ed2ea-fc1c-4a1a-bf16-50a9817aac81","Type":"ContainerDied","Data":"0ed1cd926ec9b86826d9073d68146148020654ae60ad790cb1be73478f1918d9"} Mar 11 12:30:03 crc kubenswrapper[4816]: I0311 12:30:03.552624 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ed1cd926ec9b86826d9073d68146148020654ae60ad790cb1be73478f1918d9" Mar 11 12:30:03 crc kubenswrapper[4816]: I0311 12:30:03.551894 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553870-n6l8j" Mar 11 12:30:03 crc kubenswrapper[4816]: I0311 12:30:03.553629 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553870-5ndrc" event={"ID":"2920f68a-c5bb-474c-929b-09ced109bcc0","Type":"ContainerStarted","Data":"921a40cedc2e53aafc505227f80b7caf98ac145dd8c9d234ce2c285d7eb65e19"} Mar 11 12:30:03 crc kubenswrapper[4816]: I0311 12:30:03.863949 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553870-5ndrc" podStartSLOduration=1.557733046 podStartE2EDuration="3.86390744s" podCreationTimestamp="2026-03-11 12:30:00 +0000 UTC" firstStartedPulling="2026-03-11 12:30:00.970944777 +0000 UTC m=+1887.562208744" lastFinishedPulling="2026-03-11 12:30:03.277119171 +0000 UTC m=+1889.868383138" observedRunningTime="2026-03-11 12:30:03.573684231 +0000 UTC m=+1890.164948218" watchObservedRunningTime="2026-03-11 12:30:03.86390744 +0000 UTC m=+1890.455171407" Mar 11 12:30:04 crc kubenswrapper[4816]: I0311 12:30:04.563156 4816 generic.go:334] "Generic (PLEG): container finished" podID="2920f68a-c5bb-474c-929b-09ced109bcc0" containerID="921a40cedc2e53aafc505227f80b7caf98ac145dd8c9d234ce2c285d7eb65e19" exitCode=0 Mar 11 12:30:04 crc kubenswrapper[4816]: I0311 12:30:04.563224 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553870-5ndrc" event={"ID":"2920f68a-c5bb-474c-929b-09ced109bcc0","Type":"ContainerDied","Data":"921a40cedc2e53aafc505227f80b7caf98ac145dd8c9d234ce2c285d7eb65e19"} Mar 11 12:30:05 crc kubenswrapper[4816]: I0311 12:30:05.131092 4816 scope.go:117] "RemoveContainer" containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" Mar 11 12:30:05 crc kubenswrapper[4816]: E0311 12:30:05.131819 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:30:05 crc kubenswrapper[4816]: I0311 12:30:05.828515 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553870-5ndrc" Mar 11 12:30:06 crc kubenswrapper[4816]: I0311 12:30:06.021701 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxwjv\" (UniqueName: \"kubernetes.io/projected/2920f68a-c5bb-474c-929b-09ced109bcc0-kube-api-access-xxwjv\") pod \"2920f68a-c5bb-474c-929b-09ced109bcc0\" (UID: \"2920f68a-c5bb-474c-929b-09ced109bcc0\") " Mar 11 12:30:06 crc kubenswrapper[4816]: I0311 12:30:06.028199 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2920f68a-c5bb-474c-929b-09ced109bcc0-kube-api-access-xxwjv" (OuterVolumeSpecName: "kube-api-access-xxwjv") pod "2920f68a-c5bb-474c-929b-09ced109bcc0" (UID: "2920f68a-c5bb-474c-929b-09ced109bcc0"). InnerVolumeSpecName "kube-api-access-xxwjv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:30:06 crc kubenswrapper[4816]: I0311 12:30:06.124327 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxwjv\" (UniqueName: \"kubernetes.io/projected/2920f68a-c5bb-474c-929b-09ced109bcc0-kube-api-access-xxwjv\") on node \"crc\" DevicePath \"\"" Mar 11 12:30:06 crc kubenswrapper[4816]: I0311 12:30:06.581639 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553870-5ndrc" event={"ID":"2920f68a-c5bb-474c-929b-09ced109bcc0","Type":"ContainerDied","Data":"507b3aeab6926360f8ab859be408d524db892bb97d8bacd4ffcbc7166864a7b2"} Mar 11 12:30:06 crc kubenswrapper[4816]: I0311 12:30:06.581774 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="507b3aeab6926360f8ab859be408d524db892bb97d8bacd4ffcbc7166864a7b2" Mar 11 12:30:06 crc kubenswrapper[4816]: I0311 12:30:06.581727 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553870-5ndrc" Mar 11 12:30:06 crc kubenswrapper[4816]: I0311 12:30:06.629678 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553864-r52vk"] Mar 11 12:30:06 crc kubenswrapper[4816]: I0311 12:30:06.634905 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553864-r52vk"] Mar 11 12:30:08 crc kubenswrapper[4816]: I0311 12:30:08.151539 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1c2a96e-0361-49ae-b1d2-795744511b15" path="/var/lib/kubelet/pods/b1c2a96e-0361-49ae-b1d2-795744511b15/volumes" Mar 11 12:30:18 crc kubenswrapper[4816]: I0311 12:30:18.131232 4816 scope.go:117] "RemoveContainer" containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" Mar 11 12:30:18 crc kubenswrapper[4816]: I0311 12:30:18.682071 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerStarted","Data":"106e2d5f907b914dfe49698bbb91ece73a062b224d2ba46fe31a9e998555b6c9"} Mar 11 12:30:47 crc kubenswrapper[4816]: I0311 12:30:47.349452 4816 scope.go:117] "RemoveContainer" containerID="abd9d62fb0ff8a700d3029ac698637da8b39d38073d7e8f33a437d1d746d66d8" Mar 11 12:32:00 crc kubenswrapper[4816]: I0311 12:32:00.147482 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553872-r7v8g"] Mar 11 12:32:00 crc kubenswrapper[4816]: E0311 12:32:00.148529 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2920f68a-c5bb-474c-929b-09ced109bcc0" containerName="oc" Mar 11 12:32:00 crc kubenswrapper[4816]: I0311 12:32:00.148549 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2920f68a-c5bb-474c-929b-09ced109bcc0" containerName="oc" Mar 11 12:32:00 crc kubenswrapper[4816]: E0311 12:32:00.148567 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af7ed2ea-fc1c-4a1a-bf16-50a9817aac81" containerName="collect-profiles" Mar 11 12:32:00 crc kubenswrapper[4816]: I0311 12:32:00.148575 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="af7ed2ea-fc1c-4a1a-bf16-50a9817aac81" containerName="collect-profiles" Mar 11 12:32:00 crc kubenswrapper[4816]: I0311 12:32:00.148745 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="af7ed2ea-fc1c-4a1a-bf16-50a9817aac81" containerName="collect-profiles" Mar 11 12:32:00 crc kubenswrapper[4816]: I0311 12:32:00.148763 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="2920f68a-c5bb-474c-929b-09ced109bcc0" containerName="oc" Mar 11 12:32:00 crc kubenswrapper[4816]: I0311 12:32:00.149520 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553872-r7v8g" Mar 11 12:32:00 crc kubenswrapper[4816]: I0311 12:32:00.152106 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 12:32:00 crc kubenswrapper[4816]: I0311 12:32:00.152385 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 12:32:00 crc kubenswrapper[4816]: I0311 12:32:00.152608 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 12:32:00 crc kubenswrapper[4816]: I0311 12:32:00.158163 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553872-r7v8g"] Mar 11 12:32:00 crc kubenswrapper[4816]: I0311 12:32:00.235007 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtkgg\" (UniqueName: \"kubernetes.io/projected/9ea5145c-0d08-4c85-984a-84c7e0820999-kube-api-access-mtkgg\") pod \"auto-csr-approver-29553872-r7v8g\" (UID: \"9ea5145c-0d08-4c85-984a-84c7e0820999\") " pod="openshift-infra/auto-csr-approver-29553872-r7v8g" Mar 11 12:32:00 crc kubenswrapper[4816]: I0311 12:32:00.336354 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtkgg\" (UniqueName: \"kubernetes.io/projected/9ea5145c-0d08-4c85-984a-84c7e0820999-kube-api-access-mtkgg\") pod \"auto-csr-approver-29553872-r7v8g\" (UID: \"9ea5145c-0d08-4c85-984a-84c7e0820999\") " pod="openshift-infra/auto-csr-approver-29553872-r7v8g" Mar 11 12:32:00 crc kubenswrapper[4816]: I0311 12:32:00.358158 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtkgg\" (UniqueName: \"kubernetes.io/projected/9ea5145c-0d08-4c85-984a-84c7e0820999-kube-api-access-mtkgg\") pod \"auto-csr-approver-29553872-r7v8g\" (UID: \"9ea5145c-0d08-4c85-984a-84c7e0820999\") " 
pod="openshift-infra/auto-csr-approver-29553872-r7v8g" Mar 11 12:32:00 crc kubenswrapper[4816]: I0311 12:32:00.468317 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553872-r7v8g" Mar 11 12:32:01 crc kubenswrapper[4816]: I0311 12:32:01.015106 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553872-r7v8g"] Mar 11 12:32:01 crc kubenswrapper[4816]: I0311 12:32:01.580450 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553872-r7v8g" event={"ID":"9ea5145c-0d08-4c85-984a-84c7e0820999","Type":"ContainerStarted","Data":"132c71556380e7ecbf9e9d5efbb6e1f1c231855b72c0b3699d9d7dc11b983ca1"} Mar 11 12:32:02 crc kubenswrapper[4816]: I0311 12:32:02.588726 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553872-r7v8g" event={"ID":"9ea5145c-0d08-4c85-984a-84c7e0820999","Type":"ContainerStarted","Data":"2595726d36e0a1d13282e231fa75cc95c6cd459575385b61b5632d40de6eac9f"} Mar 11 12:32:02 crc kubenswrapper[4816]: I0311 12:32:02.608807 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553872-r7v8g" podStartSLOduration=1.357092295 podStartE2EDuration="2.60878226s" podCreationTimestamp="2026-03-11 12:32:00 +0000 UTC" firstStartedPulling="2026-03-11 12:32:01.027187349 +0000 UTC m=+2007.618451316" lastFinishedPulling="2026-03-11 12:32:02.278877304 +0000 UTC m=+2008.870141281" observedRunningTime="2026-03-11 12:32:02.601338732 +0000 UTC m=+2009.192602719" watchObservedRunningTime="2026-03-11 12:32:02.60878226 +0000 UTC m=+2009.200046227" Mar 11 12:32:03 crc kubenswrapper[4816]: I0311 12:32:03.602999 4816 generic.go:334] "Generic (PLEG): container finished" podID="9ea5145c-0d08-4c85-984a-84c7e0820999" containerID="2595726d36e0a1d13282e231fa75cc95c6cd459575385b61b5632d40de6eac9f" exitCode=0 Mar 11 12:32:03 crc 
kubenswrapper[4816]: I0311 12:32:03.603082 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553872-r7v8g" event={"ID":"9ea5145c-0d08-4c85-984a-84c7e0820999","Type":"ContainerDied","Data":"2595726d36e0a1d13282e231fa75cc95c6cd459575385b61b5632d40de6eac9f"} Mar 11 12:32:04 crc kubenswrapper[4816]: I0311 12:32:04.925809 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553872-r7v8g" Mar 11 12:32:05 crc kubenswrapper[4816]: I0311 12:32:05.109381 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtkgg\" (UniqueName: \"kubernetes.io/projected/9ea5145c-0d08-4c85-984a-84c7e0820999-kube-api-access-mtkgg\") pod \"9ea5145c-0d08-4c85-984a-84c7e0820999\" (UID: \"9ea5145c-0d08-4c85-984a-84c7e0820999\") " Mar 11 12:32:05 crc kubenswrapper[4816]: I0311 12:32:05.118385 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ea5145c-0d08-4c85-984a-84c7e0820999-kube-api-access-mtkgg" (OuterVolumeSpecName: "kube-api-access-mtkgg") pod "9ea5145c-0d08-4c85-984a-84c7e0820999" (UID: "9ea5145c-0d08-4c85-984a-84c7e0820999"). InnerVolumeSpecName "kube-api-access-mtkgg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:32:05 crc kubenswrapper[4816]: I0311 12:32:05.211315 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtkgg\" (UniqueName: \"kubernetes.io/projected/9ea5145c-0d08-4c85-984a-84c7e0820999-kube-api-access-mtkgg\") on node \"crc\" DevicePath \"\"" Mar 11 12:32:05 crc kubenswrapper[4816]: I0311 12:32:05.621726 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553872-r7v8g" event={"ID":"9ea5145c-0d08-4c85-984a-84c7e0820999","Type":"ContainerDied","Data":"132c71556380e7ecbf9e9d5efbb6e1f1c231855b72c0b3699d9d7dc11b983ca1"} Mar 11 12:32:05 crc kubenswrapper[4816]: I0311 12:32:05.621778 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553872-r7v8g" Mar 11 12:32:05 crc kubenswrapper[4816]: I0311 12:32:05.621790 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="132c71556380e7ecbf9e9d5efbb6e1f1c231855b72c0b3699d9d7dc11b983ca1" Mar 11 12:32:05 crc kubenswrapper[4816]: I0311 12:32:05.686696 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553866-w7rtm"] Mar 11 12:32:05 crc kubenswrapper[4816]: I0311 12:32:05.691617 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553866-w7rtm"] Mar 11 12:32:06 crc kubenswrapper[4816]: I0311 12:32:06.141140 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94280e97-4b7e-4a2d-8f1f-c3125f5910bc" path="/var/lib/kubelet/pods/94280e97-4b7e-4a2d-8f1f-c3125f5910bc/volumes" Mar 11 12:32:37 crc kubenswrapper[4816]: I0311 12:32:37.726736 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mhflg"] Mar 11 12:32:37 crc kubenswrapper[4816]: E0311 12:32:37.728285 4816 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9ea5145c-0d08-4c85-984a-84c7e0820999" containerName="oc" Mar 11 12:32:37 crc kubenswrapper[4816]: I0311 12:32:37.728310 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ea5145c-0d08-4c85-984a-84c7e0820999" containerName="oc" Mar 11 12:32:37 crc kubenswrapper[4816]: I0311 12:32:37.728555 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ea5145c-0d08-4c85-984a-84c7e0820999" containerName="oc" Mar 11 12:32:37 crc kubenswrapper[4816]: I0311 12:32:37.730108 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mhflg" Mar 11 12:32:37 crc kubenswrapper[4816]: I0311 12:32:37.748323 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mhflg"] Mar 11 12:32:37 crc kubenswrapper[4816]: I0311 12:32:37.903398 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/589c0cba-36d0-4224-ab81-dfcef6906331-utilities\") pod \"redhat-operators-mhflg\" (UID: \"589c0cba-36d0-4224-ab81-dfcef6906331\") " pod="openshift-marketplace/redhat-operators-mhflg" Mar 11 12:32:37 crc kubenswrapper[4816]: I0311 12:32:37.903492 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwj95\" (UniqueName: \"kubernetes.io/projected/589c0cba-36d0-4224-ab81-dfcef6906331-kube-api-access-gwj95\") pod \"redhat-operators-mhflg\" (UID: \"589c0cba-36d0-4224-ab81-dfcef6906331\") " pod="openshift-marketplace/redhat-operators-mhflg" Mar 11 12:32:37 crc kubenswrapper[4816]: I0311 12:32:37.903531 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/589c0cba-36d0-4224-ab81-dfcef6906331-catalog-content\") pod \"redhat-operators-mhflg\" (UID: \"589c0cba-36d0-4224-ab81-dfcef6906331\") " 
pod="openshift-marketplace/redhat-operators-mhflg" Mar 11 12:32:38 crc kubenswrapper[4816]: I0311 12:32:38.005682 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/589c0cba-36d0-4224-ab81-dfcef6906331-catalog-content\") pod \"redhat-operators-mhflg\" (UID: \"589c0cba-36d0-4224-ab81-dfcef6906331\") " pod="openshift-marketplace/redhat-operators-mhflg" Mar 11 12:32:38 crc kubenswrapper[4816]: I0311 12:32:38.005996 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/589c0cba-36d0-4224-ab81-dfcef6906331-utilities\") pod \"redhat-operators-mhflg\" (UID: \"589c0cba-36d0-4224-ab81-dfcef6906331\") " pod="openshift-marketplace/redhat-operators-mhflg" Mar 11 12:32:38 crc kubenswrapper[4816]: I0311 12:32:38.006050 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwj95\" (UniqueName: \"kubernetes.io/projected/589c0cba-36d0-4224-ab81-dfcef6906331-kube-api-access-gwj95\") pod \"redhat-operators-mhflg\" (UID: \"589c0cba-36d0-4224-ab81-dfcef6906331\") " pod="openshift-marketplace/redhat-operators-mhflg" Mar 11 12:32:38 crc kubenswrapper[4816]: I0311 12:32:38.006460 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/589c0cba-36d0-4224-ab81-dfcef6906331-catalog-content\") pod \"redhat-operators-mhflg\" (UID: \"589c0cba-36d0-4224-ab81-dfcef6906331\") " pod="openshift-marketplace/redhat-operators-mhflg" Mar 11 12:32:38 crc kubenswrapper[4816]: I0311 12:32:38.006600 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/589c0cba-36d0-4224-ab81-dfcef6906331-utilities\") pod \"redhat-operators-mhflg\" (UID: \"589c0cba-36d0-4224-ab81-dfcef6906331\") " pod="openshift-marketplace/redhat-operators-mhflg" Mar 11 12:32:38 crc 
kubenswrapper[4816]: I0311 12:32:38.040340 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwj95\" (UniqueName: \"kubernetes.io/projected/589c0cba-36d0-4224-ab81-dfcef6906331-kube-api-access-gwj95\") pod \"redhat-operators-mhflg\" (UID: \"589c0cba-36d0-4224-ab81-dfcef6906331\") " pod="openshift-marketplace/redhat-operators-mhflg" Mar 11 12:32:38 crc kubenswrapper[4816]: I0311 12:32:38.058332 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mhflg" Mar 11 12:32:38 crc kubenswrapper[4816]: I0311 12:32:38.651783 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mhflg"] Mar 11 12:32:38 crc kubenswrapper[4816]: I0311 12:32:38.891393 4816 generic.go:334] "Generic (PLEG): container finished" podID="589c0cba-36d0-4224-ab81-dfcef6906331" containerID="3aa7c0e071b2d12f65dd8afa57fdda72f32fe235d19f86d4821e81596b958166" exitCode=0 Mar 11 12:32:38 crc kubenswrapper[4816]: I0311 12:32:38.891447 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mhflg" event={"ID":"589c0cba-36d0-4224-ab81-dfcef6906331","Type":"ContainerDied","Data":"3aa7c0e071b2d12f65dd8afa57fdda72f32fe235d19f86d4821e81596b958166"} Mar 11 12:32:38 crc kubenswrapper[4816]: I0311 12:32:38.891479 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mhflg" event={"ID":"589c0cba-36d0-4224-ab81-dfcef6906331","Type":"ContainerStarted","Data":"7600999d30e2047fb4c28c092d759410853359befac2bdcbf26a0353c15bbbaf"} Mar 11 12:32:39 crc kubenswrapper[4816]: I0311 12:32:39.514706 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:32:39 
crc kubenswrapper[4816]: I0311 12:32:39.515072 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:32:39 crc kubenswrapper[4816]: I0311 12:32:39.902432 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mhflg" event={"ID":"589c0cba-36d0-4224-ab81-dfcef6906331","Type":"ContainerStarted","Data":"d7bbfc360686ecb08719d6c8765ab786422bed05f456eaff1e04e0e820cb0cad"} Mar 11 12:32:40 crc kubenswrapper[4816]: I0311 12:32:40.910295 4816 generic.go:334] "Generic (PLEG): container finished" podID="589c0cba-36d0-4224-ab81-dfcef6906331" containerID="d7bbfc360686ecb08719d6c8765ab786422bed05f456eaff1e04e0e820cb0cad" exitCode=0 Mar 11 12:32:40 crc kubenswrapper[4816]: I0311 12:32:40.910433 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mhflg" event={"ID":"589c0cba-36d0-4224-ab81-dfcef6906331","Type":"ContainerDied","Data":"d7bbfc360686ecb08719d6c8765ab786422bed05f456eaff1e04e0e820cb0cad"} Mar 11 12:32:41 crc kubenswrapper[4816]: I0311 12:32:41.921178 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mhflg" event={"ID":"589c0cba-36d0-4224-ab81-dfcef6906331","Type":"ContainerStarted","Data":"41b9a433ab68895783b085e7821f621ee0e83d4fb07214cce70a87e38938a02b"} Mar 11 12:32:41 crc kubenswrapper[4816]: I0311 12:32:41.948481 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mhflg" podStartSLOduration=2.472274317 podStartE2EDuration="4.948451777s" podCreationTimestamp="2026-03-11 12:32:37 +0000 UTC" firstStartedPulling="2026-03-11 12:32:38.893232812 +0000 UTC m=+2045.484496779" 
lastFinishedPulling="2026-03-11 12:32:41.369410252 +0000 UTC m=+2047.960674239" observedRunningTime="2026-03-11 12:32:41.938674964 +0000 UTC m=+2048.529938941" watchObservedRunningTime="2026-03-11 12:32:41.948451777 +0000 UTC m=+2048.539715754" Mar 11 12:32:47 crc kubenswrapper[4816]: I0311 12:32:47.425820 4816 scope.go:117] "RemoveContainer" containerID="b6abd0eb57a8c7686ba5edd206e023c171a5ca0fee490573bab4f684459e8043" Mar 11 12:32:48 crc kubenswrapper[4816]: I0311 12:32:48.058769 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mhflg" Mar 11 12:32:48 crc kubenswrapper[4816]: I0311 12:32:48.058827 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mhflg" Mar 11 12:32:48 crc kubenswrapper[4816]: I0311 12:32:48.098317 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mhflg" Mar 11 12:32:49 crc kubenswrapper[4816]: I0311 12:32:49.043997 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mhflg" Mar 11 12:32:49 crc kubenswrapper[4816]: I0311 12:32:49.107500 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mhflg"] Mar 11 12:32:51 crc kubenswrapper[4816]: I0311 12:32:51.016435 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mhflg" podUID="589c0cba-36d0-4224-ab81-dfcef6906331" containerName="registry-server" containerID="cri-o://41b9a433ab68895783b085e7821f621ee0e83d4fb07214cce70a87e38938a02b" gracePeriod=2 Mar 11 12:32:52 crc kubenswrapper[4816]: I0311 12:32:52.027076 4816 generic.go:334] "Generic (PLEG): container finished" podID="589c0cba-36d0-4224-ab81-dfcef6906331" containerID="41b9a433ab68895783b085e7821f621ee0e83d4fb07214cce70a87e38938a02b" exitCode=0 Mar 11 12:32:52 crc 
kubenswrapper[4816]: I0311 12:32:52.027149 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mhflg" event={"ID":"589c0cba-36d0-4224-ab81-dfcef6906331","Type":"ContainerDied","Data":"41b9a433ab68895783b085e7821f621ee0e83d4fb07214cce70a87e38938a02b"} Mar 11 12:32:52 crc kubenswrapper[4816]: I0311 12:32:52.510654 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mhflg" Mar 11 12:32:52 crc kubenswrapper[4816]: I0311 12:32:52.654203 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/589c0cba-36d0-4224-ab81-dfcef6906331-catalog-content\") pod \"589c0cba-36d0-4224-ab81-dfcef6906331\" (UID: \"589c0cba-36d0-4224-ab81-dfcef6906331\") " Mar 11 12:32:52 crc kubenswrapper[4816]: I0311 12:32:52.655054 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/589c0cba-36d0-4224-ab81-dfcef6906331-utilities\") pod \"589c0cba-36d0-4224-ab81-dfcef6906331\" (UID: \"589c0cba-36d0-4224-ab81-dfcef6906331\") " Mar 11 12:32:52 crc kubenswrapper[4816]: I0311 12:32:52.655168 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwj95\" (UniqueName: \"kubernetes.io/projected/589c0cba-36d0-4224-ab81-dfcef6906331-kube-api-access-gwj95\") pod \"589c0cba-36d0-4224-ab81-dfcef6906331\" (UID: \"589c0cba-36d0-4224-ab81-dfcef6906331\") " Mar 11 12:32:52 crc kubenswrapper[4816]: I0311 12:32:52.656414 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/589c0cba-36d0-4224-ab81-dfcef6906331-utilities" (OuterVolumeSpecName: "utilities") pod "589c0cba-36d0-4224-ab81-dfcef6906331" (UID: "589c0cba-36d0-4224-ab81-dfcef6906331"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:32:52 crc kubenswrapper[4816]: I0311 12:32:52.661320 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/589c0cba-36d0-4224-ab81-dfcef6906331-kube-api-access-gwj95" (OuterVolumeSpecName: "kube-api-access-gwj95") pod "589c0cba-36d0-4224-ab81-dfcef6906331" (UID: "589c0cba-36d0-4224-ab81-dfcef6906331"). InnerVolumeSpecName "kube-api-access-gwj95". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:32:52 crc kubenswrapper[4816]: I0311 12:32:52.756738 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/589c0cba-36d0-4224-ab81-dfcef6906331-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 12:32:52 crc kubenswrapper[4816]: I0311 12:32:52.756788 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwj95\" (UniqueName: \"kubernetes.io/projected/589c0cba-36d0-4224-ab81-dfcef6906331-kube-api-access-gwj95\") on node \"crc\" DevicePath \"\"" Mar 11 12:32:52 crc kubenswrapper[4816]: I0311 12:32:52.821106 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/589c0cba-36d0-4224-ab81-dfcef6906331-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "589c0cba-36d0-4224-ab81-dfcef6906331" (UID: "589c0cba-36d0-4224-ab81-dfcef6906331"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:32:52 crc kubenswrapper[4816]: I0311 12:32:52.857876 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/589c0cba-36d0-4224-ab81-dfcef6906331-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 12:32:53 crc kubenswrapper[4816]: I0311 12:32:53.040186 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mhflg" event={"ID":"589c0cba-36d0-4224-ab81-dfcef6906331","Type":"ContainerDied","Data":"7600999d30e2047fb4c28c092d759410853359befac2bdcbf26a0353c15bbbaf"} Mar 11 12:32:53 crc kubenswrapper[4816]: I0311 12:32:53.040223 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mhflg" Mar 11 12:32:53 crc kubenswrapper[4816]: I0311 12:32:53.040279 4816 scope.go:117] "RemoveContainer" containerID="41b9a433ab68895783b085e7821f621ee0e83d4fb07214cce70a87e38938a02b" Mar 11 12:32:53 crc kubenswrapper[4816]: I0311 12:32:53.062707 4816 scope.go:117] "RemoveContainer" containerID="d7bbfc360686ecb08719d6c8765ab786422bed05f456eaff1e04e0e820cb0cad" Mar 11 12:32:53 crc kubenswrapper[4816]: I0311 12:32:53.085122 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mhflg"] Mar 11 12:32:53 crc kubenswrapper[4816]: I0311 12:32:53.095823 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mhflg"] Mar 11 12:32:53 crc kubenswrapper[4816]: I0311 12:32:53.100360 4816 scope.go:117] "RemoveContainer" containerID="3aa7c0e071b2d12f65dd8afa57fdda72f32fe235d19f86d4821e81596b958166" Mar 11 12:32:54 crc kubenswrapper[4816]: I0311 12:32:54.143183 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="589c0cba-36d0-4224-ab81-dfcef6906331" path="/var/lib/kubelet/pods/589c0cba-36d0-4224-ab81-dfcef6906331/volumes" Mar 11 12:33:09 crc 
kubenswrapper[4816]: I0311 12:33:09.515502 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:33:09 crc kubenswrapper[4816]: I0311 12:33:09.516039 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:33:13 crc kubenswrapper[4816]: I0311 12:33:13.967304 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-r2pkt"] Mar 11 12:33:13 crc kubenswrapper[4816]: E0311 12:33:13.968821 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="589c0cba-36d0-4224-ab81-dfcef6906331" containerName="extract-utilities" Mar 11 12:33:13 crc kubenswrapper[4816]: I0311 12:33:13.968862 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="589c0cba-36d0-4224-ab81-dfcef6906331" containerName="extract-utilities" Mar 11 12:33:13 crc kubenswrapper[4816]: E0311 12:33:13.968913 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="589c0cba-36d0-4224-ab81-dfcef6906331" containerName="registry-server" Mar 11 12:33:13 crc kubenswrapper[4816]: I0311 12:33:13.968922 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="589c0cba-36d0-4224-ab81-dfcef6906331" containerName="registry-server" Mar 11 12:33:13 crc kubenswrapper[4816]: E0311 12:33:13.968946 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="589c0cba-36d0-4224-ab81-dfcef6906331" containerName="extract-content" Mar 11 12:33:13 crc kubenswrapper[4816]: I0311 12:33:13.968954 4816 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="589c0cba-36d0-4224-ab81-dfcef6906331" containerName="extract-content" Mar 11 12:33:13 crc kubenswrapper[4816]: I0311 12:33:13.969274 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="589c0cba-36d0-4224-ab81-dfcef6906331" containerName="registry-server" Mar 11 12:33:13 crc kubenswrapper[4816]: I0311 12:33:13.970672 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r2pkt" Mar 11 12:33:13 crc kubenswrapper[4816]: I0311 12:33:13.984801 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r2pkt"] Mar 11 12:33:14 crc kubenswrapper[4816]: I0311 12:33:14.096386 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkk4w\" (UniqueName: \"kubernetes.io/projected/fabbe121-d60e-4b29-8f0b-cd0e8fce41c1-kube-api-access-gkk4w\") pod \"certified-operators-r2pkt\" (UID: \"fabbe121-d60e-4b29-8f0b-cd0e8fce41c1\") " pod="openshift-marketplace/certified-operators-r2pkt" Mar 11 12:33:14 crc kubenswrapper[4816]: I0311 12:33:14.096572 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fabbe121-d60e-4b29-8f0b-cd0e8fce41c1-catalog-content\") pod \"certified-operators-r2pkt\" (UID: \"fabbe121-d60e-4b29-8f0b-cd0e8fce41c1\") " pod="openshift-marketplace/certified-operators-r2pkt" Mar 11 12:33:14 crc kubenswrapper[4816]: I0311 12:33:14.096770 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fabbe121-d60e-4b29-8f0b-cd0e8fce41c1-utilities\") pod \"certified-operators-r2pkt\" (UID: \"fabbe121-d60e-4b29-8f0b-cd0e8fce41c1\") " pod="openshift-marketplace/certified-operators-r2pkt" Mar 11 12:33:14 crc kubenswrapper[4816]: I0311 12:33:14.198016 4816 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fabbe121-d60e-4b29-8f0b-cd0e8fce41c1-utilities\") pod \"certified-operators-r2pkt\" (UID: \"fabbe121-d60e-4b29-8f0b-cd0e8fce41c1\") " pod="openshift-marketplace/certified-operators-r2pkt" Mar 11 12:33:14 crc kubenswrapper[4816]: I0311 12:33:14.198143 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkk4w\" (UniqueName: \"kubernetes.io/projected/fabbe121-d60e-4b29-8f0b-cd0e8fce41c1-kube-api-access-gkk4w\") pod \"certified-operators-r2pkt\" (UID: \"fabbe121-d60e-4b29-8f0b-cd0e8fce41c1\") " pod="openshift-marketplace/certified-operators-r2pkt" Mar 11 12:33:14 crc kubenswrapper[4816]: I0311 12:33:14.198166 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fabbe121-d60e-4b29-8f0b-cd0e8fce41c1-catalog-content\") pod \"certified-operators-r2pkt\" (UID: \"fabbe121-d60e-4b29-8f0b-cd0e8fce41c1\") " pod="openshift-marketplace/certified-operators-r2pkt" Mar 11 12:33:14 crc kubenswrapper[4816]: I0311 12:33:14.198740 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fabbe121-d60e-4b29-8f0b-cd0e8fce41c1-catalog-content\") pod \"certified-operators-r2pkt\" (UID: \"fabbe121-d60e-4b29-8f0b-cd0e8fce41c1\") " pod="openshift-marketplace/certified-operators-r2pkt" Mar 11 12:33:14 crc kubenswrapper[4816]: I0311 12:33:14.198832 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fabbe121-d60e-4b29-8f0b-cd0e8fce41c1-utilities\") pod \"certified-operators-r2pkt\" (UID: \"fabbe121-d60e-4b29-8f0b-cd0e8fce41c1\") " pod="openshift-marketplace/certified-operators-r2pkt" Mar 11 12:33:14 crc kubenswrapper[4816]: I0311 12:33:14.218895 4816 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-gkk4w\" (UniqueName: \"kubernetes.io/projected/fabbe121-d60e-4b29-8f0b-cd0e8fce41c1-kube-api-access-gkk4w\") pod \"certified-operators-r2pkt\" (UID: \"fabbe121-d60e-4b29-8f0b-cd0e8fce41c1\") " pod="openshift-marketplace/certified-operators-r2pkt" Mar 11 12:33:14 crc kubenswrapper[4816]: I0311 12:33:14.303571 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r2pkt" Mar 11 12:33:14 crc kubenswrapper[4816]: I0311 12:33:14.599699 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r2pkt"] Mar 11 12:33:15 crc kubenswrapper[4816]: I0311 12:33:15.258016 4816 generic.go:334] "Generic (PLEG): container finished" podID="fabbe121-d60e-4b29-8f0b-cd0e8fce41c1" containerID="8f3894a4899757361634273dd9fe1f17eb732cbd5f15a9b2264dcaf1a22aab44" exitCode=0 Mar 11 12:33:15 crc kubenswrapper[4816]: I0311 12:33:15.258130 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r2pkt" event={"ID":"fabbe121-d60e-4b29-8f0b-cd0e8fce41c1","Type":"ContainerDied","Data":"8f3894a4899757361634273dd9fe1f17eb732cbd5f15a9b2264dcaf1a22aab44"} Mar 11 12:33:15 crc kubenswrapper[4816]: I0311 12:33:15.258439 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r2pkt" event={"ID":"fabbe121-d60e-4b29-8f0b-cd0e8fce41c1","Type":"ContainerStarted","Data":"0033fa1237cf2aca43feef20e61c599f42d60351aa258570a9bcddf0b7f1affe"} Mar 11 12:33:15 crc kubenswrapper[4816]: I0311 12:33:15.261198 4816 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 12:33:16 crc kubenswrapper[4816]: I0311 12:33:16.267608 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r2pkt" 
event={"ID":"fabbe121-d60e-4b29-8f0b-cd0e8fce41c1","Type":"ContainerStarted","Data":"c69b0b30da36e251f8021e2ef6670c286315f8e84ceed75bdb53f2f60d68004b"} Mar 11 12:33:17 crc kubenswrapper[4816]: I0311 12:33:17.286384 4816 generic.go:334] "Generic (PLEG): container finished" podID="fabbe121-d60e-4b29-8f0b-cd0e8fce41c1" containerID="c69b0b30da36e251f8021e2ef6670c286315f8e84ceed75bdb53f2f60d68004b" exitCode=0 Mar 11 12:33:17 crc kubenswrapper[4816]: I0311 12:33:17.286463 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r2pkt" event={"ID":"fabbe121-d60e-4b29-8f0b-cd0e8fce41c1","Type":"ContainerDied","Data":"c69b0b30da36e251f8021e2ef6670c286315f8e84ceed75bdb53f2f60d68004b"} Mar 11 12:33:18 crc kubenswrapper[4816]: I0311 12:33:18.297922 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r2pkt" event={"ID":"fabbe121-d60e-4b29-8f0b-cd0e8fce41c1","Type":"ContainerStarted","Data":"7eb2efe676c910f918a62ef446a9d6d14650804e2752ffade96c8e611a5da2c8"} Mar 11 12:33:18 crc kubenswrapper[4816]: I0311 12:33:18.321665 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-r2pkt" podStartSLOduration=2.822032192 podStartE2EDuration="5.321640546s" podCreationTimestamp="2026-03-11 12:33:13 +0000 UTC" firstStartedPulling="2026-03-11 12:33:15.260772013 +0000 UTC m=+2081.852035990" lastFinishedPulling="2026-03-11 12:33:17.760380377 +0000 UTC m=+2084.351644344" observedRunningTime="2026-03-11 12:33:18.316363039 +0000 UTC m=+2084.907627016" watchObservedRunningTime="2026-03-11 12:33:18.321640546 +0000 UTC m=+2084.912904513" Mar 11 12:33:24 crc kubenswrapper[4816]: I0311 12:33:24.304166 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-r2pkt" Mar 11 12:33:24 crc kubenswrapper[4816]: I0311 12:33:24.304721 4816 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/certified-operators-r2pkt" Mar 11 12:33:24 crc kubenswrapper[4816]: I0311 12:33:24.369218 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-r2pkt" Mar 11 12:33:24 crc kubenswrapper[4816]: I0311 12:33:24.414451 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-r2pkt" Mar 11 12:33:24 crc kubenswrapper[4816]: I0311 12:33:24.607604 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r2pkt"] Mar 11 12:33:26 crc kubenswrapper[4816]: I0311 12:33:26.356644 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-r2pkt" podUID="fabbe121-d60e-4b29-8f0b-cd0e8fce41c1" containerName="registry-server" containerID="cri-o://7eb2efe676c910f918a62ef446a9d6d14650804e2752ffade96c8e611a5da2c8" gracePeriod=2 Mar 11 12:33:26 crc kubenswrapper[4816]: I0311 12:33:26.731932 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-r2pkt" Mar 11 12:33:26 crc kubenswrapper[4816]: I0311 12:33:26.801324 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkk4w\" (UniqueName: \"kubernetes.io/projected/fabbe121-d60e-4b29-8f0b-cd0e8fce41c1-kube-api-access-gkk4w\") pod \"fabbe121-d60e-4b29-8f0b-cd0e8fce41c1\" (UID: \"fabbe121-d60e-4b29-8f0b-cd0e8fce41c1\") " Mar 11 12:33:26 crc kubenswrapper[4816]: I0311 12:33:26.801468 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fabbe121-d60e-4b29-8f0b-cd0e8fce41c1-utilities\") pod \"fabbe121-d60e-4b29-8f0b-cd0e8fce41c1\" (UID: \"fabbe121-d60e-4b29-8f0b-cd0e8fce41c1\") " Mar 11 12:33:26 crc kubenswrapper[4816]: I0311 12:33:26.801550 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fabbe121-d60e-4b29-8f0b-cd0e8fce41c1-catalog-content\") pod \"fabbe121-d60e-4b29-8f0b-cd0e8fce41c1\" (UID: \"fabbe121-d60e-4b29-8f0b-cd0e8fce41c1\") " Mar 11 12:33:26 crc kubenswrapper[4816]: I0311 12:33:26.802405 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fabbe121-d60e-4b29-8f0b-cd0e8fce41c1-utilities" (OuterVolumeSpecName: "utilities") pod "fabbe121-d60e-4b29-8f0b-cd0e8fce41c1" (UID: "fabbe121-d60e-4b29-8f0b-cd0e8fce41c1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:33:26 crc kubenswrapper[4816]: I0311 12:33:26.809541 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fabbe121-d60e-4b29-8f0b-cd0e8fce41c1-kube-api-access-gkk4w" (OuterVolumeSpecName: "kube-api-access-gkk4w") pod "fabbe121-d60e-4b29-8f0b-cd0e8fce41c1" (UID: "fabbe121-d60e-4b29-8f0b-cd0e8fce41c1"). InnerVolumeSpecName "kube-api-access-gkk4w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:33:26 crc kubenswrapper[4816]: I0311 12:33:26.863148 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fabbe121-d60e-4b29-8f0b-cd0e8fce41c1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fabbe121-d60e-4b29-8f0b-cd0e8fce41c1" (UID: "fabbe121-d60e-4b29-8f0b-cd0e8fce41c1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:33:26 crc kubenswrapper[4816]: I0311 12:33:26.903502 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fabbe121-d60e-4b29-8f0b-cd0e8fce41c1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 12:33:26 crc kubenswrapper[4816]: I0311 12:33:26.903549 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkk4w\" (UniqueName: \"kubernetes.io/projected/fabbe121-d60e-4b29-8f0b-cd0e8fce41c1-kube-api-access-gkk4w\") on node \"crc\" DevicePath \"\"" Mar 11 12:33:26 crc kubenswrapper[4816]: I0311 12:33:26.903570 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fabbe121-d60e-4b29-8f0b-cd0e8fce41c1-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 12:33:27 crc kubenswrapper[4816]: I0311 12:33:27.365959 4816 generic.go:334] "Generic (PLEG): container finished" podID="fabbe121-d60e-4b29-8f0b-cd0e8fce41c1" containerID="7eb2efe676c910f918a62ef446a9d6d14650804e2752ffade96c8e611a5da2c8" exitCode=0 Mar 11 12:33:27 crc kubenswrapper[4816]: I0311 12:33:27.366010 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r2pkt" event={"ID":"fabbe121-d60e-4b29-8f0b-cd0e8fce41c1","Type":"ContainerDied","Data":"7eb2efe676c910f918a62ef446a9d6d14650804e2752ffade96c8e611a5da2c8"} Mar 11 12:33:27 crc kubenswrapper[4816]: I0311 12:33:27.366044 4816 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-r2pkt" event={"ID":"fabbe121-d60e-4b29-8f0b-cd0e8fce41c1","Type":"ContainerDied","Data":"0033fa1237cf2aca43feef20e61c599f42d60351aa258570a9bcddf0b7f1affe"} Mar 11 12:33:27 crc kubenswrapper[4816]: I0311 12:33:27.366045 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r2pkt" Mar 11 12:33:27 crc kubenswrapper[4816]: I0311 12:33:27.366082 4816 scope.go:117] "RemoveContainer" containerID="7eb2efe676c910f918a62ef446a9d6d14650804e2752ffade96c8e611a5da2c8" Mar 11 12:33:27 crc kubenswrapper[4816]: I0311 12:33:27.384962 4816 scope.go:117] "RemoveContainer" containerID="c69b0b30da36e251f8021e2ef6670c286315f8e84ceed75bdb53f2f60d68004b" Mar 11 12:33:27 crc kubenswrapper[4816]: I0311 12:33:27.400761 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r2pkt"] Mar 11 12:33:27 crc kubenswrapper[4816]: I0311 12:33:27.405514 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-r2pkt"] Mar 11 12:33:27 crc kubenswrapper[4816]: I0311 12:33:27.417964 4816 scope.go:117] "RemoveContainer" containerID="8f3894a4899757361634273dd9fe1f17eb732cbd5f15a9b2264dcaf1a22aab44" Mar 11 12:33:27 crc kubenswrapper[4816]: I0311 12:33:27.440995 4816 scope.go:117] "RemoveContainer" containerID="7eb2efe676c910f918a62ef446a9d6d14650804e2752ffade96c8e611a5da2c8" Mar 11 12:33:27 crc kubenswrapper[4816]: E0311 12:33:27.441623 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7eb2efe676c910f918a62ef446a9d6d14650804e2752ffade96c8e611a5da2c8\": container with ID starting with 7eb2efe676c910f918a62ef446a9d6d14650804e2752ffade96c8e611a5da2c8 not found: ID does not exist" containerID="7eb2efe676c910f918a62ef446a9d6d14650804e2752ffade96c8e611a5da2c8" Mar 11 12:33:27 crc kubenswrapper[4816]: I0311 
12:33:27.441696 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7eb2efe676c910f918a62ef446a9d6d14650804e2752ffade96c8e611a5da2c8"} err="failed to get container status \"7eb2efe676c910f918a62ef446a9d6d14650804e2752ffade96c8e611a5da2c8\": rpc error: code = NotFound desc = could not find container \"7eb2efe676c910f918a62ef446a9d6d14650804e2752ffade96c8e611a5da2c8\": container with ID starting with 7eb2efe676c910f918a62ef446a9d6d14650804e2752ffade96c8e611a5da2c8 not found: ID does not exist" Mar 11 12:33:27 crc kubenswrapper[4816]: I0311 12:33:27.441736 4816 scope.go:117] "RemoveContainer" containerID="c69b0b30da36e251f8021e2ef6670c286315f8e84ceed75bdb53f2f60d68004b" Mar 11 12:33:27 crc kubenswrapper[4816]: E0311 12:33:27.442166 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c69b0b30da36e251f8021e2ef6670c286315f8e84ceed75bdb53f2f60d68004b\": container with ID starting with c69b0b30da36e251f8021e2ef6670c286315f8e84ceed75bdb53f2f60d68004b not found: ID does not exist" containerID="c69b0b30da36e251f8021e2ef6670c286315f8e84ceed75bdb53f2f60d68004b" Mar 11 12:33:27 crc kubenswrapper[4816]: I0311 12:33:27.442300 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c69b0b30da36e251f8021e2ef6670c286315f8e84ceed75bdb53f2f60d68004b"} err="failed to get container status \"c69b0b30da36e251f8021e2ef6670c286315f8e84ceed75bdb53f2f60d68004b\": rpc error: code = NotFound desc = could not find container \"c69b0b30da36e251f8021e2ef6670c286315f8e84ceed75bdb53f2f60d68004b\": container with ID starting with c69b0b30da36e251f8021e2ef6670c286315f8e84ceed75bdb53f2f60d68004b not found: ID does not exist" Mar 11 12:33:27 crc kubenswrapper[4816]: I0311 12:33:27.442392 4816 scope.go:117] "RemoveContainer" containerID="8f3894a4899757361634273dd9fe1f17eb732cbd5f15a9b2264dcaf1a22aab44" Mar 11 12:33:27 crc 
kubenswrapper[4816]: E0311 12:33:27.442731 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f3894a4899757361634273dd9fe1f17eb732cbd5f15a9b2264dcaf1a22aab44\": container with ID starting with 8f3894a4899757361634273dd9fe1f17eb732cbd5f15a9b2264dcaf1a22aab44 not found: ID does not exist" containerID="8f3894a4899757361634273dd9fe1f17eb732cbd5f15a9b2264dcaf1a22aab44" Mar 11 12:33:27 crc kubenswrapper[4816]: I0311 12:33:27.442760 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f3894a4899757361634273dd9fe1f17eb732cbd5f15a9b2264dcaf1a22aab44"} err="failed to get container status \"8f3894a4899757361634273dd9fe1f17eb732cbd5f15a9b2264dcaf1a22aab44\": rpc error: code = NotFound desc = could not find container \"8f3894a4899757361634273dd9fe1f17eb732cbd5f15a9b2264dcaf1a22aab44\": container with ID starting with 8f3894a4899757361634273dd9fe1f17eb732cbd5f15a9b2264dcaf1a22aab44 not found: ID does not exist" Mar 11 12:33:28 crc kubenswrapper[4816]: I0311 12:33:28.141926 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fabbe121-d60e-4b29-8f0b-cd0e8fce41c1" path="/var/lib/kubelet/pods/fabbe121-d60e-4b29-8f0b-cd0e8fce41c1/volumes" Mar 11 12:33:39 crc kubenswrapper[4816]: I0311 12:33:39.515431 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:33:39 crc kubenswrapper[4816]: I0311 12:33:39.516190 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Mar 11 12:33:39 crc kubenswrapper[4816]: I0311 12:33:39.516323 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:33:39 crc kubenswrapper[4816]: I0311 12:33:39.517360 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"106e2d5f907b914dfe49698bbb91ece73a062b224d2ba46fe31a9e998555b6c9"} pod="openshift-machine-config-operator/machine-config-daemon-b4v82" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 12:33:39 crc kubenswrapper[4816]: I0311 12:33:39.517474 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" containerID="cri-o://106e2d5f907b914dfe49698bbb91ece73a062b224d2ba46fe31a9e998555b6c9" gracePeriod=600 Mar 11 12:33:40 crc kubenswrapper[4816]: I0311 12:33:40.480703 4816 generic.go:334] "Generic (PLEG): container finished" podID="7fdff21c-644f-4443-a268-f98c91ea120a" containerID="106e2d5f907b914dfe49698bbb91ece73a062b224d2ba46fe31a9e998555b6c9" exitCode=0 Mar 11 12:33:40 crc kubenswrapper[4816]: I0311 12:33:40.480806 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerDied","Data":"106e2d5f907b914dfe49698bbb91ece73a062b224d2ba46fe31a9e998555b6c9"} Mar 11 12:33:40 crc kubenswrapper[4816]: I0311 12:33:40.481092 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" 
event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerStarted","Data":"ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad"} Mar 11 12:33:40 crc kubenswrapper[4816]: I0311 12:33:40.481121 4816 scope.go:117] "RemoveContainer" containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" Mar 11 12:33:55 crc kubenswrapper[4816]: I0311 12:33:55.973039 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7qdkl"] Mar 11 12:33:55 crc kubenswrapper[4816]: E0311 12:33:55.973930 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fabbe121-d60e-4b29-8f0b-cd0e8fce41c1" containerName="extract-utilities" Mar 11 12:33:55 crc kubenswrapper[4816]: I0311 12:33:55.973946 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="fabbe121-d60e-4b29-8f0b-cd0e8fce41c1" containerName="extract-utilities" Mar 11 12:33:55 crc kubenswrapper[4816]: E0311 12:33:55.973958 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fabbe121-d60e-4b29-8f0b-cd0e8fce41c1" containerName="registry-server" Mar 11 12:33:55 crc kubenswrapper[4816]: I0311 12:33:55.973964 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="fabbe121-d60e-4b29-8f0b-cd0e8fce41c1" containerName="registry-server" Mar 11 12:33:55 crc kubenswrapper[4816]: E0311 12:33:55.973982 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fabbe121-d60e-4b29-8f0b-cd0e8fce41c1" containerName="extract-content" Mar 11 12:33:55 crc kubenswrapper[4816]: I0311 12:33:55.973990 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="fabbe121-d60e-4b29-8f0b-cd0e8fce41c1" containerName="extract-content" Mar 11 12:33:55 crc kubenswrapper[4816]: I0311 12:33:55.974147 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="fabbe121-d60e-4b29-8f0b-cd0e8fce41c1" containerName="registry-server" Mar 11 12:33:55 crc kubenswrapper[4816]: I0311 12:33:55.975203 4816 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7qdkl" Mar 11 12:33:55 crc kubenswrapper[4816]: I0311 12:33:55.989833 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7qdkl"] Mar 11 12:33:56 crc kubenswrapper[4816]: I0311 12:33:56.072529 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmd5q\" (UniqueName: \"kubernetes.io/projected/42226e5d-abc0-4f65-a104-31582febe5fb-kube-api-access-mmd5q\") pod \"community-operators-7qdkl\" (UID: \"42226e5d-abc0-4f65-a104-31582febe5fb\") " pod="openshift-marketplace/community-operators-7qdkl" Mar 11 12:33:56 crc kubenswrapper[4816]: I0311 12:33:56.072679 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42226e5d-abc0-4f65-a104-31582febe5fb-catalog-content\") pod \"community-operators-7qdkl\" (UID: \"42226e5d-abc0-4f65-a104-31582febe5fb\") " pod="openshift-marketplace/community-operators-7qdkl" Mar 11 12:33:56 crc kubenswrapper[4816]: I0311 12:33:56.072723 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42226e5d-abc0-4f65-a104-31582febe5fb-utilities\") pod \"community-operators-7qdkl\" (UID: \"42226e5d-abc0-4f65-a104-31582febe5fb\") " pod="openshift-marketplace/community-operators-7qdkl" Mar 11 12:33:56 crc kubenswrapper[4816]: I0311 12:33:56.173839 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42226e5d-abc0-4f65-a104-31582febe5fb-catalog-content\") pod \"community-operators-7qdkl\" (UID: \"42226e5d-abc0-4f65-a104-31582febe5fb\") " pod="openshift-marketplace/community-operators-7qdkl" Mar 11 12:33:56 crc kubenswrapper[4816]: I0311 12:33:56.173920 4816 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42226e5d-abc0-4f65-a104-31582febe5fb-utilities\") pod \"community-operators-7qdkl\" (UID: \"42226e5d-abc0-4f65-a104-31582febe5fb\") " pod="openshift-marketplace/community-operators-7qdkl" Mar 11 12:33:56 crc kubenswrapper[4816]: I0311 12:33:56.173956 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmd5q\" (UniqueName: \"kubernetes.io/projected/42226e5d-abc0-4f65-a104-31582febe5fb-kube-api-access-mmd5q\") pod \"community-operators-7qdkl\" (UID: \"42226e5d-abc0-4f65-a104-31582febe5fb\") " pod="openshift-marketplace/community-operators-7qdkl" Mar 11 12:33:56 crc kubenswrapper[4816]: I0311 12:33:56.174461 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42226e5d-abc0-4f65-a104-31582febe5fb-catalog-content\") pod \"community-operators-7qdkl\" (UID: \"42226e5d-abc0-4f65-a104-31582febe5fb\") " pod="openshift-marketplace/community-operators-7qdkl" Mar 11 12:33:56 crc kubenswrapper[4816]: I0311 12:33:56.174586 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42226e5d-abc0-4f65-a104-31582febe5fb-utilities\") pod \"community-operators-7qdkl\" (UID: \"42226e5d-abc0-4f65-a104-31582febe5fb\") " pod="openshift-marketplace/community-operators-7qdkl" Mar 11 12:33:56 crc kubenswrapper[4816]: I0311 12:33:56.192799 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmd5q\" (UniqueName: \"kubernetes.io/projected/42226e5d-abc0-4f65-a104-31582febe5fb-kube-api-access-mmd5q\") pod \"community-operators-7qdkl\" (UID: \"42226e5d-abc0-4f65-a104-31582febe5fb\") " pod="openshift-marketplace/community-operators-7qdkl" Mar 11 12:33:56 crc kubenswrapper[4816]: I0311 12:33:56.332810 4816 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/community-operators-7qdkl" Mar 11 12:33:56 crc kubenswrapper[4816]: I0311 12:33:56.807107 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7qdkl"] Mar 11 12:33:56 crc kubenswrapper[4816]: W0311 12:33:56.819487 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42226e5d_abc0_4f65_a104_31582febe5fb.slice/crio-0a3e1b9bdac2862d2e386afd65a93928dc918f71af0c866ccae7ff804dfca7d3 WatchSource:0}: Error finding container 0a3e1b9bdac2862d2e386afd65a93928dc918f71af0c866ccae7ff804dfca7d3: Status 404 returned error can't find the container with id 0a3e1b9bdac2862d2e386afd65a93928dc918f71af0c866ccae7ff804dfca7d3 Mar 11 12:33:57 crc kubenswrapper[4816]: I0311 12:33:57.630778 4816 generic.go:334] "Generic (PLEG): container finished" podID="42226e5d-abc0-4f65-a104-31582febe5fb" containerID="8983ba5b36e1f28d425ff80ec8440f84b6129945aebd6c4f111d84202cb1c00b" exitCode=0 Mar 11 12:33:57 crc kubenswrapper[4816]: I0311 12:33:57.630889 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7qdkl" event={"ID":"42226e5d-abc0-4f65-a104-31582febe5fb","Type":"ContainerDied","Data":"8983ba5b36e1f28d425ff80ec8440f84b6129945aebd6c4f111d84202cb1c00b"} Mar 11 12:33:57 crc kubenswrapper[4816]: I0311 12:33:57.631234 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7qdkl" event={"ID":"42226e5d-abc0-4f65-a104-31582febe5fb","Type":"ContainerStarted","Data":"0a3e1b9bdac2862d2e386afd65a93928dc918f71af0c866ccae7ff804dfca7d3"} Mar 11 12:33:58 crc kubenswrapper[4816]: I0311 12:33:58.642711 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7qdkl" 
event={"ID":"42226e5d-abc0-4f65-a104-31582febe5fb","Type":"ContainerStarted","Data":"43a1fea604ae9d7a01a28b9305a4970702dcbf2f2d004b5e131f9b54bd525155"} Mar 11 12:33:59 crc kubenswrapper[4816]: I0311 12:33:59.653096 4816 generic.go:334] "Generic (PLEG): container finished" podID="42226e5d-abc0-4f65-a104-31582febe5fb" containerID="43a1fea604ae9d7a01a28b9305a4970702dcbf2f2d004b5e131f9b54bd525155" exitCode=0 Mar 11 12:33:59 crc kubenswrapper[4816]: I0311 12:33:59.653160 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7qdkl" event={"ID":"42226e5d-abc0-4f65-a104-31582febe5fb","Type":"ContainerDied","Data":"43a1fea604ae9d7a01a28b9305a4970702dcbf2f2d004b5e131f9b54bd525155"} Mar 11 12:34:00 crc kubenswrapper[4816]: I0311 12:34:00.147597 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553874-gxdm8"] Mar 11 12:34:00 crc kubenswrapper[4816]: I0311 12:34:00.149295 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553874-gxdm8" Mar 11 12:34:00 crc kubenswrapper[4816]: I0311 12:34:00.152193 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 12:34:00 crc kubenswrapper[4816]: I0311 12:34:00.152410 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 12:34:00 crc kubenswrapper[4816]: I0311 12:34:00.152587 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 12:34:00 crc kubenswrapper[4816]: I0311 12:34:00.154081 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553874-gxdm8"] Mar 11 12:34:00 crc kubenswrapper[4816]: I0311 12:34:00.240511 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7gx8\" (UniqueName: \"kubernetes.io/projected/37fd63f9-b7c1-4900-a6c1-269f771958b1-kube-api-access-g7gx8\") pod \"auto-csr-approver-29553874-gxdm8\" (UID: \"37fd63f9-b7c1-4900-a6c1-269f771958b1\") " pod="openshift-infra/auto-csr-approver-29553874-gxdm8" Mar 11 12:34:00 crc kubenswrapper[4816]: I0311 12:34:00.342006 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7gx8\" (UniqueName: \"kubernetes.io/projected/37fd63f9-b7c1-4900-a6c1-269f771958b1-kube-api-access-g7gx8\") pod \"auto-csr-approver-29553874-gxdm8\" (UID: \"37fd63f9-b7c1-4900-a6c1-269f771958b1\") " pod="openshift-infra/auto-csr-approver-29553874-gxdm8" Mar 11 12:34:00 crc kubenswrapper[4816]: I0311 12:34:00.363710 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7gx8\" (UniqueName: \"kubernetes.io/projected/37fd63f9-b7c1-4900-a6c1-269f771958b1-kube-api-access-g7gx8\") pod \"auto-csr-approver-29553874-gxdm8\" (UID: \"37fd63f9-b7c1-4900-a6c1-269f771958b1\") " 
pod="openshift-infra/auto-csr-approver-29553874-gxdm8" Mar 11 12:34:00 crc kubenswrapper[4816]: I0311 12:34:00.477154 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553874-gxdm8" Mar 11 12:34:00 crc kubenswrapper[4816]: I0311 12:34:00.666350 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7qdkl" event={"ID":"42226e5d-abc0-4f65-a104-31582febe5fb","Type":"ContainerStarted","Data":"44551a0ab786e673fc970db35e599ddd8fc182f55a9cb9f0d1b0fb3fd65e486e"} Mar 11 12:34:00 crc kubenswrapper[4816]: I0311 12:34:00.693328 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7qdkl" podStartSLOduration=3.032208965 podStartE2EDuration="5.693228998s" podCreationTimestamp="2026-03-11 12:33:55 +0000 UTC" firstStartedPulling="2026-03-11 12:33:57.632960182 +0000 UTC m=+2124.224224149" lastFinishedPulling="2026-03-11 12:34:00.293980215 +0000 UTC m=+2126.885244182" observedRunningTime="2026-03-11 12:34:00.68647409 +0000 UTC m=+2127.277738077" watchObservedRunningTime="2026-03-11 12:34:00.693228998 +0000 UTC m=+2127.284492965" Mar 11 12:34:00 crc kubenswrapper[4816]: I0311 12:34:00.891401 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553874-gxdm8"] Mar 11 12:34:00 crc kubenswrapper[4816]: W0311 12:34:00.893959 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37fd63f9_b7c1_4900_a6c1_269f771958b1.slice/crio-e2e8965efc63e031283f92ec8887043b338e778752affced077e6e1d457d93b0 WatchSource:0}: Error finding container e2e8965efc63e031283f92ec8887043b338e778752affced077e6e1d457d93b0: Status 404 returned error can't find the container with id e2e8965efc63e031283f92ec8887043b338e778752affced077e6e1d457d93b0 Mar 11 12:34:01 crc kubenswrapper[4816]: I0311 12:34:01.681596 4816 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553874-gxdm8" event={"ID":"37fd63f9-b7c1-4900-a6c1-269f771958b1","Type":"ContainerStarted","Data":"e2e8965efc63e031283f92ec8887043b338e778752affced077e6e1d457d93b0"} Mar 11 12:34:02 crc kubenswrapper[4816]: I0311 12:34:02.690984 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553874-gxdm8" event={"ID":"37fd63f9-b7c1-4900-a6c1-269f771958b1","Type":"ContainerStarted","Data":"372011e32b19c2edb6c819b3e68c03b814d1f5c3bd84d6d127863a08cf21f878"} Mar 11 12:34:03 crc kubenswrapper[4816]: I0311 12:34:03.702048 4816 generic.go:334] "Generic (PLEG): container finished" podID="37fd63f9-b7c1-4900-a6c1-269f771958b1" containerID="372011e32b19c2edb6c819b3e68c03b814d1f5c3bd84d6d127863a08cf21f878" exitCode=0 Mar 11 12:34:03 crc kubenswrapper[4816]: I0311 12:34:03.702118 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553874-gxdm8" event={"ID":"37fd63f9-b7c1-4900-a6c1-269f771958b1","Type":"ContainerDied","Data":"372011e32b19c2edb6c819b3e68c03b814d1f5c3bd84d6d127863a08cf21f878"} Mar 11 12:34:05 crc kubenswrapper[4816]: I0311 12:34:05.021440 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553874-gxdm8" Mar 11 12:34:05 crc kubenswrapper[4816]: I0311 12:34:05.120824 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7gx8\" (UniqueName: \"kubernetes.io/projected/37fd63f9-b7c1-4900-a6c1-269f771958b1-kube-api-access-g7gx8\") pod \"37fd63f9-b7c1-4900-a6c1-269f771958b1\" (UID: \"37fd63f9-b7c1-4900-a6c1-269f771958b1\") " Mar 11 12:34:05 crc kubenswrapper[4816]: I0311 12:34:05.136762 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37fd63f9-b7c1-4900-a6c1-269f771958b1-kube-api-access-g7gx8" (OuterVolumeSpecName: "kube-api-access-g7gx8") pod "37fd63f9-b7c1-4900-a6c1-269f771958b1" (UID: "37fd63f9-b7c1-4900-a6c1-269f771958b1"). InnerVolumeSpecName "kube-api-access-g7gx8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:34:05 crc kubenswrapper[4816]: I0311 12:34:05.224756 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7gx8\" (UniqueName: \"kubernetes.io/projected/37fd63f9-b7c1-4900-a6c1-269f771958b1-kube-api-access-g7gx8\") on node \"crc\" DevicePath \"\"" Mar 11 12:34:05 crc kubenswrapper[4816]: I0311 12:34:05.720932 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553874-gxdm8" event={"ID":"37fd63f9-b7c1-4900-a6c1-269f771958b1","Type":"ContainerDied","Data":"e2e8965efc63e031283f92ec8887043b338e778752affced077e6e1d457d93b0"} Mar 11 12:34:05 crc kubenswrapper[4816]: I0311 12:34:05.720982 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2e8965efc63e031283f92ec8887043b338e778752affced077e6e1d457d93b0" Mar 11 12:34:05 crc kubenswrapper[4816]: I0311 12:34:05.721026 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553874-gxdm8" Mar 11 12:34:05 crc kubenswrapper[4816]: I0311 12:34:05.787992 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553868-qmhvt"] Mar 11 12:34:05 crc kubenswrapper[4816]: I0311 12:34:05.794863 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553868-qmhvt"] Mar 11 12:34:06 crc kubenswrapper[4816]: I0311 12:34:06.141272 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7828a3bd-8864-4d7a-a6fa-ff3ac4e607bf" path="/var/lib/kubelet/pods/7828a3bd-8864-4d7a-a6fa-ff3ac4e607bf/volumes" Mar 11 12:34:06 crc kubenswrapper[4816]: I0311 12:34:06.333574 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7qdkl" Mar 11 12:34:06 crc kubenswrapper[4816]: I0311 12:34:06.333673 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7qdkl" Mar 11 12:34:06 crc kubenswrapper[4816]: I0311 12:34:06.377218 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7qdkl" Mar 11 12:34:06 crc kubenswrapper[4816]: I0311 12:34:06.771895 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7qdkl" Mar 11 12:34:06 crc kubenswrapper[4816]: I0311 12:34:06.823408 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7qdkl"] Mar 11 12:34:08 crc kubenswrapper[4816]: I0311 12:34:08.743437 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7qdkl" podUID="42226e5d-abc0-4f65-a104-31582febe5fb" containerName="registry-server" containerID="cri-o://44551a0ab786e673fc970db35e599ddd8fc182f55a9cb9f0d1b0fb3fd65e486e" gracePeriod=2 Mar 11 12:34:09 
crc kubenswrapper[4816]: I0311 12:34:09.245378 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7qdkl" Mar 11 12:34:09 crc kubenswrapper[4816]: I0311 12:34:09.398167 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmd5q\" (UniqueName: \"kubernetes.io/projected/42226e5d-abc0-4f65-a104-31582febe5fb-kube-api-access-mmd5q\") pod \"42226e5d-abc0-4f65-a104-31582febe5fb\" (UID: \"42226e5d-abc0-4f65-a104-31582febe5fb\") " Mar 11 12:34:09 crc kubenswrapper[4816]: I0311 12:34:09.398235 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42226e5d-abc0-4f65-a104-31582febe5fb-catalog-content\") pod \"42226e5d-abc0-4f65-a104-31582febe5fb\" (UID: \"42226e5d-abc0-4f65-a104-31582febe5fb\") " Mar 11 12:34:09 crc kubenswrapper[4816]: I0311 12:34:09.398297 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42226e5d-abc0-4f65-a104-31582febe5fb-utilities\") pod \"42226e5d-abc0-4f65-a104-31582febe5fb\" (UID: \"42226e5d-abc0-4f65-a104-31582febe5fb\") " Mar 11 12:34:09 crc kubenswrapper[4816]: I0311 12:34:09.399737 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42226e5d-abc0-4f65-a104-31582febe5fb-utilities" (OuterVolumeSpecName: "utilities") pod "42226e5d-abc0-4f65-a104-31582febe5fb" (UID: "42226e5d-abc0-4f65-a104-31582febe5fb"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:34:09 crc kubenswrapper[4816]: I0311 12:34:09.411600 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42226e5d-abc0-4f65-a104-31582febe5fb-kube-api-access-mmd5q" (OuterVolumeSpecName: "kube-api-access-mmd5q") pod "42226e5d-abc0-4f65-a104-31582febe5fb" (UID: "42226e5d-abc0-4f65-a104-31582febe5fb"). InnerVolumeSpecName "kube-api-access-mmd5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:34:09 crc kubenswrapper[4816]: I0311 12:34:09.500751 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmd5q\" (UniqueName: \"kubernetes.io/projected/42226e5d-abc0-4f65-a104-31582febe5fb-kube-api-access-mmd5q\") on node \"crc\" DevicePath \"\"" Mar 11 12:34:09 crc kubenswrapper[4816]: I0311 12:34:09.500809 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42226e5d-abc0-4f65-a104-31582febe5fb-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 12:34:09 crc kubenswrapper[4816]: I0311 12:34:09.512784 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42226e5d-abc0-4f65-a104-31582febe5fb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "42226e5d-abc0-4f65-a104-31582febe5fb" (UID: "42226e5d-abc0-4f65-a104-31582febe5fb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:34:09 crc kubenswrapper[4816]: I0311 12:34:09.602083 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42226e5d-abc0-4f65-a104-31582febe5fb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 12:34:09 crc kubenswrapper[4816]: I0311 12:34:09.757267 4816 generic.go:334] "Generic (PLEG): container finished" podID="42226e5d-abc0-4f65-a104-31582febe5fb" containerID="44551a0ab786e673fc970db35e599ddd8fc182f55a9cb9f0d1b0fb3fd65e486e" exitCode=0 Mar 11 12:34:09 crc kubenswrapper[4816]: I0311 12:34:09.757352 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7qdkl" Mar 11 12:34:09 crc kubenswrapper[4816]: I0311 12:34:09.757365 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7qdkl" event={"ID":"42226e5d-abc0-4f65-a104-31582febe5fb","Type":"ContainerDied","Data":"44551a0ab786e673fc970db35e599ddd8fc182f55a9cb9f0d1b0fb3fd65e486e"} Mar 11 12:34:09 crc kubenswrapper[4816]: I0311 12:34:09.757483 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7qdkl" event={"ID":"42226e5d-abc0-4f65-a104-31582febe5fb","Type":"ContainerDied","Data":"0a3e1b9bdac2862d2e386afd65a93928dc918f71af0c866ccae7ff804dfca7d3"} Mar 11 12:34:09 crc kubenswrapper[4816]: I0311 12:34:09.757528 4816 scope.go:117] "RemoveContainer" containerID="44551a0ab786e673fc970db35e599ddd8fc182f55a9cb9f0d1b0fb3fd65e486e" Mar 11 12:34:09 crc kubenswrapper[4816]: I0311 12:34:09.783842 4816 scope.go:117] "RemoveContainer" containerID="43a1fea604ae9d7a01a28b9305a4970702dcbf2f2d004b5e131f9b54bd525155" Mar 11 12:34:09 crc kubenswrapper[4816]: I0311 12:34:09.801934 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7qdkl"] Mar 11 12:34:09 crc kubenswrapper[4816]: 
I0311 12:34:09.811468 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7qdkl"] Mar 11 12:34:09 crc kubenswrapper[4816]: I0311 12:34:09.831705 4816 scope.go:117] "RemoveContainer" containerID="8983ba5b36e1f28d425ff80ec8440f84b6129945aebd6c4f111d84202cb1c00b" Mar 11 12:34:09 crc kubenswrapper[4816]: I0311 12:34:09.863373 4816 scope.go:117] "RemoveContainer" containerID="44551a0ab786e673fc970db35e599ddd8fc182f55a9cb9f0d1b0fb3fd65e486e" Mar 11 12:34:09 crc kubenswrapper[4816]: E0311 12:34:09.864158 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44551a0ab786e673fc970db35e599ddd8fc182f55a9cb9f0d1b0fb3fd65e486e\": container with ID starting with 44551a0ab786e673fc970db35e599ddd8fc182f55a9cb9f0d1b0fb3fd65e486e not found: ID does not exist" containerID="44551a0ab786e673fc970db35e599ddd8fc182f55a9cb9f0d1b0fb3fd65e486e" Mar 11 12:34:09 crc kubenswrapper[4816]: I0311 12:34:09.864301 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44551a0ab786e673fc970db35e599ddd8fc182f55a9cb9f0d1b0fb3fd65e486e"} err="failed to get container status \"44551a0ab786e673fc970db35e599ddd8fc182f55a9cb9f0d1b0fb3fd65e486e\": rpc error: code = NotFound desc = could not find container \"44551a0ab786e673fc970db35e599ddd8fc182f55a9cb9f0d1b0fb3fd65e486e\": container with ID starting with 44551a0ab786e673fc970db35e599ddd8fc182f55a9cb9f0d1b0fb3fd65e486e not found: ID does not exist" Mar 11 12:34:09 crc kubenswrapper[4816]: I0311 12:34:09.864372 4816 scope.go:117] "RemoveContainer" containerID="43a1fea604ae9d7a01a28b9305a4970702dcbf2f2d004b5e131f9b54bd525155" Mar 11 12:34:09 crc kubenswrapper[4816]: E0311 12:34:09.865229 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43a1fea604ae9d7a01a28b9305a4970702dcbf2f2d004b5e131f9b54bd525155\": container 
with ID starting with 43a1fea604ae9d7a01a28b9305a4970702dcbf2f2d004b5e131f9b54bd525155 not found: ID does not exist" containerID="43a1fea604ae9d7a01a28b9305a4970702dcbf2f2d004b5e131f9b54bd525155" Mar 11 12:34:09 crc kubenswrapper[4816]: I0311 12:34:09.865326 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43a1fea604ae9d7a01a28b9305a4970702dcbf2f2d004b5e131f9b54bd525155"} err="failed to get container status \"43a1fea604ae9d7a01a28b9305a4970702dcbf2f2d004b5e131f9b54bd525155\": rpc error: code = NotFound desc = could not find container \"43a1fea604ae9d7a01a28b9305a4970702dcbf2f2d004b5e131f9b54bd525155\": container with ID starting with 43a1fea604ae9d7a01a28b9305a4970702dcbf2f2d004b5e131f9b54bd525155 not found: ID does not exist" Mar 11 12:34:09 crc kubenswrapper[4816]: I0311 12:34:09.865374 4816 scope.go:117] "RemoveContainer" containerID="8983ba5b36e1f28d425ff80ec8440f84b6129945aebd6c4f111d84202cb1c00b" Mar 11 12:34:09 crc kubenswrapper[4816]: E0311 12:34:09.865793 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8983ba5b36e1f28d425ff80ec8440f84b6129945aebd6c4f111d84202cb1c00b\": container with ID starting with 8983ba5b36e1f28d425ff80ec8440f84b6129945aebd6c4f111d84202cb1c00b not found: ID does not exist" containerID="8983ba5b36e1f28d425ff80ec8440f84b6129945aebd6c4f111d84202cb1c00b" Mar 11 12:34:09 crc kubenswrapper[4816]: I0311 12:34:09.865865 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8983ba5b36e1f28d425ff80ec8440f84b6129945aebd6c4f111d84202cb1c00b"} err="failed to get container status \"8983ba5b36e1f28d425ff80ec8440f84b6129945aebd6c4f111d84202cb1c00b\": rpc error: code = NotFound desc = could not find container \"8983ba5b36e1f28d425ff80ec8440f84b6129945aebd6c4f111d84202cb1c00b\": container with ID starting with 8983ba5b36e1f28d425ff80ec8440f84b6129945aebd6c4f111d84202cb1c00b not 
found: ID does not exist" Mar 11 12:34:10 crc kubenswrapper[4816]: I0311 12:34:10.140189 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42226e5d-abc0-4f65-a104-31582febe5fb" path="/var/lib/kubelet/pods/42226e5d-abc0-4f65-a104-31582febe5fb/volumes" Mar 11 12:34:47 crc kubenswrapper[4816]: I0311 12:34:47.562819 4816 scope.go:117] "RemoveContainer" containerID="6f896b214f33da369f143727ecfdb3b64f134749ec6e50337a2f62ef03d15c62" Mar 11 12:35:39 crc kubenswrapper[4816]: I0311 12:35:39.518764 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:35:39 crc kubenswrapper[4816]: I0311 12:35:39.519830 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:35:57 crc kubenswrapper[4816]: I0311 12:35:57.535433 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hrg8t"] Mar 11 12:35:57 crc kubenswrapper[4816]: E0311 12:35:57.536465 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37fd63f9-b7c1-4900-a6c1-269f771958b1" containerName="oc" Mar 11 12:35:57 crc kubenswrapper[4816]: I0311 12:35:57.536486 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="37fd63f9-b7c1-4900-a6c1-269f771958b1" containerName="oc" Mar 11 12:35:57 crc kubenswrapper[4816]: E0311 12:35:57.536518 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42226e5d-abc0-4f65-a104-31582febe5fb" containerName="extract-content" Mar 11 12:35:57 crc kubenswrapper[4816]: I0311 
12:35:57.536529 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="42226e5d-abc0-4f65-a104-31582febe5fb" containerName="extract-content" Mar 11 12:35:57 crc kubenswrapper[4816]: E0311 12:35:57.536544 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42226e5d-abc0-4f65-a104-31582febe5fb" containerName="registry-server" Mar 11 12:35:57 crc kubenswrapper[4816]: I0311 12:35:57.536553 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="42226e5d-abc0-4f65-a104-31582febe5fb" containerName="registry-server" Mar 11 12:35:57 crc kubenswrapper[4816]: E0311 12:35:57.536564 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42226e5d-abc0-4f65-a104-31582febe5fb" containerName="extract-utilities" Mar 11 12:35:57 crc kubenswrapper[4816]: I0311 12:35:57.536573 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="42226e5d-abc0-4f65-a104-31582febe5fb" containerName="extract-utilities" Mar 11 12:35:57 crc kubenswrapper[4816]: I0311 12:35:57.536735 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="42226e5d-abc0-4f65-a104-31582febe5fb" containerName="registry-server" Mar 11 12:35:57 crc kubenswrapper[4816]: I0311 12:35:57.536758 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="37fd63f9-b7c1-4900-a6c1-269f771958b1" containerName="oc" Mar 11 12:35:57 crc kubenswrapper[4816]: I0311 12:35:57.538336 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hrg8t" Mar 11 12:35:57 crc kubenswrapper[4816]: I0311 12:35:57.553013 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hrg8t"] Mar 11 12:35:57 crc kubenswrapper[4816]: I0311 12:35:57.676628 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkhds\" (UniqueName: \"kubernetes.io/projected/16cec869-9798-4a51-b950-59a57dfa3c37-kube-api-access-kkhds\") pod \"redhat-marketplace-hrg8t\" (UID: \"16cec869-9798-4a51-b950-59a57dfa3c37\") " pod="openshift-marketplace/redhat-marketplace-hrg8t" Mar 11 12:35:57 crc kubenswrapper[4816]: I0311 12:35:57.676679 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16cec869-9798-4a51-b950-59a57dfa3c37-catalog-content\") pod \"redhat-marketplace-hrg8t\" (UID: \"16cec869-9798-4a51-b950-59a57dfa3c37\") " pod="openshift-marketplace/redhat-marketplace-hrg8t" Mar 11 12:35:57 crc kubenswrapper[4816]: I0311 12:35:57.676797 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16cec869-9798-4a51-b950-59a57dfa3c37-utilities\") pod \"redhat-marketplace-hrg8t\" (UID: \"16cec869-9798-4a51-b950-59a57dfa3c37\") " pod="openshift-marketplace/redhat-marketplace-hrg8t" Mar 11 12:35:57 crc kubenswrapper[4816]: I0311 12:35:57.778042 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16cec869-9798-4a51-b950-59a57dfa3c37-utilities\") pod \"redhat-marketplace-hrg8t\" (UID: \"16cec869-9798-4a51-b950-59a57dfa3c37\") " pod="openshift-marketplace/redhat-marketplace-hrg8t" Mar 11 12:35:57 crc kubenswrapper[4816]: I0311 12:35:57.778186 4816 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-kkhds\" (UniqueName: \"kubernetes.io/projected/16cec869-9798-4a51-b950-59a57dfa3c37-kube-api-access-kkhds\") pod \"redhat-marketplace-hrg8t\" (UID: \"16cec869-9798-4a51-b950-59a57dfa3c37\") " pod="openshift-marketplace/redhat-marketplace-hrg8t" Mar 11 12:35:57 crc kubenswrapper[4816]: I0311 12:35:57.778215 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16cec869-9798-4a51-b950-59a57dfa3c37-catalog-content\") pod \"redhat-marketplace-hrg8t\" (UID: \"16cec869-9798-4a51-b950-59a57dfa3c37\") " pod="openshift-marketplace/redhat-marketplace-hrg8t" Mar 11 12:35:57 crc kubenswrapper[4816]: I0311 12:35:57.778702 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16cec869-9798-4a51-b950-59a57dfa3c37-utilities\") pod \"redhat-marketplace-hrg8t\" (UID: \"16cec869-9798-4a51-b950-59a57dfa3c37\") " pod="openshift-marketplace/redhat-marketplace-hrg8t" Mar 11 12:35:57 crc kubenswrapper[4816]: I0311 12:35:57.778753 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16cec869-9798-4a51-b950-59a57dfa3c37-catalog-content\") pod \"redhat-marketplace-hrg8t\" (UID: \"16cec869-9798-4a51-b950-59a57dfa3c37\") " pod="openshift-marketplace/redhat-marketplace-hrg8t" Mar 11 12:35:57 crc kubenswrapper[4816]: I0311 12:35:57.802664 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkhds\" (UniqueName: \"kubernetes.io/projected/16cec869-9798-4a51-b950-59a57dfa3c37-kube-api-access-kkhds\") pod \"redhat-marketplace-hrg8t\" (UID: \"16cec869-9798-4a51-b950-59a57dfa3c37\") " pod="openshift-marketplace/redhat-marketplace-hrg8t" Mar 11 12:35:57 crc kubenswrapper[4816]: I0311 12:35:57.875453 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hrg8t" Mar 11 12:35:58 crc kubenswrapper[4816]: I0311 12:35:58.307193 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hrg8t"] Mar 11 12:35:58 crc kubenswrapper[4816]: I0311 12:35:58.673098 4816 generic.go:334] "Generic (PLEG): container finished" podID="16cec869-9798-4a51-b950-59a57dfa3c37" containerID="f832c5c127e51cab2092c0ee970fb07c751c81b54b216bfcc2b98bc4378703d0" exitCode=0 Mar 11 12:35:58 crc kubenswrapper[4816]: I0311 12:35:58.673562 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hrg8t" event={"ID":"16cec869-9798-4a51-b950-59a57dfa3c37","Type":"ContainerDied","Data":"f832c5c127e51cab2092c0ee970fb07c751c81b54b216bfcc2b98bc4378703d0"} Mar 11 12:35:58 crc kubenswrapper[4816]: I0311 12:35:58.673618 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hrg8t" event={"ID":"16cec869-9798-4a51-b950-59a57dfa3c37","Type":"ContainerStarted","Data":"a088adcbed3f980d1c239b585dfbe0befdd15b0a6eaae124e57dfe197c46e993"} Mar 11 12:36:00 crc kubenswrapper[4816]: I0311 12:36:00.150818 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553876-55nmf"] Mar 11 12:36:00 crc kubenswrapper[4816]: I0311 12:36:00.152822 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553876-55nmf" Mar 11 12:36:00 crc kubenswrapper[4816]: I0311 12:36:00.157191 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553876-55nmf"] Mar 11 12:36:00 crc kubenswrapper[4816]: I0311 12:36:00.176526 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 12:36:00 crc kubenswrapper[4816]: I0311 12:36:00.176883 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 12:36:00 crc kubenswrapper[4816]: I0311 12:36:00.177000 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 12:36:00 crc kubenswrapper[4816]: I0311 12:36:00.315758 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l44dc\" (UniqueName: \"kubernetes.io/projected/feb8451d-15b3-4ed2-815e-8aa5c1d8bc7f-kube-api-access-l44dc\") pod \"auto-csr-approver-29553876-55nmf\" (UID: \"feb8451d-15b3-4ed2-815e-8aa5c1d8bc7f\") " pod="openshift-infra/auto-csr-approver-29553876-55nmf" Mar 11 12:36:00 crc kubenswrapper[4816]: I0311 12:36:00.418951 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l44dc\" (UniqueName: \"kubernetes.io/projected/feb8451d-15b3-4ed2-815e-8aa5c1d8bc7f-kube-api-access-l44dc\") pod \"auto-csr-approver-29553876-55nmf\" (UID: \"feb8451d-15b3-4ed2-815e-8aa5c1d8bc7f\") " pod="openshift-infra/auto-csr-approver-29553876-55nmf" Mar 11 12:36:00 crc kubenswrapper[4816]: I0311 12:36:00.445169 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l44dc\" (UniqueName: \"kubernetes.io/projected/feb8451d-15b3-4ed2-815e-8aa5c1d8bc7f-kube-api-access-l44dc\") pod \"auto-csr-approver-29553876-55nmf\" (UID: \"feb8451d-15b3-4ed2-815e-8aa5c1d8bc7f\") " 
pod="openshift-infra/auto-csr-approver-29553876-55nmf" Mar 11 12:36:00 crc kubenswrapper[4816]: I0311 12:36:00.536999 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553876-55nmf" Mar 11 12:36:00 crc kubenswrapper[4816]: I0311 12:36:00.698840 4816 generic.go:334] "Generic (PLEG): container finished" podID="16cec869-9798-4a51-b950-59a57dfa3c37" containerID="0d48faa8a89c7695f1cc4aa05889c97c7e77ae5de899328361c4bed02fe9144c" exitCode=0 Mar 11 12:36:00 crc kubenswrapper[4816]: I0311 12:36:00.699596 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hrg8t" event={"ID":"16cec869-9798-4a51-b950-59a57dfa3c37","Type":"ContainerDied","Data":"0d48faa8a89c7695f1cc4aa05889c97c7e77ae5de899328361c4bed02fe9144c"} Mar 11 12:36:00 crc kubenswrapper[4816]: I0311 12:36:00.792033 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553876-55nmf"] Mar 11 12:36:01 crc kubenswrapper[4816]: I0311 12:36:01.709915 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hrg8t" event={"ID":"16cec869-9798-4a51-b950-59a57dfa3c37","Type":"ContainerStarted","Data":"16d24ba0c5d748973ba12af27011a40d6d8c1a614a1d7b1d91fd2e2b2b62d5aa"} Mar 11 12:36:01 crc kubenswrapper[4816]: I0311 12:36:01.715438 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553876-55nmf" event={"ID":"feb8451d-15b3-4ed2-815e-8aa5c1d8bc7f","Type":"ContainerStarted","Data":"4e35de0190418478e45846aa2c23f6f00d64b785f24078cec25aba7b5c721f35"} Mar 11 12:36:01 crc kubenswrapper[4816]: I0311 12:36:01.743144 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hrg8t" podStartSLOduration=2.214650745 podStartE2EDuration="4.743116696s" podCreationTimestamp="2026-03-11 12:35:57 +0000 UTC" firstStartedPulling="2026-03-11 
12:35:58.675167204 +0000 UTC m=+2245.266431171" lastFinishedPulling="2026-03-11 12:36:01.203633155 +0000 UTC m=+2247.794897122" observedRunningTime="2026-03-11 12:36:01.734008501 +0000 UTC m=+2248.325272488" watchObservedRunningTime="2026-03-11 12:36:01.743116696 +0000 UTC m=+2248.334380683" Mar 11 12:36:02 crc kubenswrapper[4816]: I0311 12:36:02.725158 4816 generic.go:334] "Generic (PLEG): container finished" podID="feb8451d-15b3-4ed2-815e-8aa5c1d8bc7f" containerID="fcc96a304b12ffc267c89c3d4b3f056b4d2e01821a0ebbb16c1bdcf350072143" exitCode=0 Mar 11 12:36:02 crc kubenswrapper[4816]: I0311 12:36:02.725670 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553876-55nmf" event={"ID":"feb8451d-15b3-4ed2-815e-8aa5c1d8bc7f","Type":"ContainerDied","Data":"fcc96a304b12ffc267c89c3d4b3f056b4d2e01821a0ebbb16c1bdcf350072143"} Mar 11 12:36:04 crc kubenswrapper[4816]: I0311 12:36:04.010140 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553876-55nmf" Mar 11 12:36:04 crc kubenswrapper[4816]: I0311 12:36:04.182329 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l44dc\" (UniqueName: \"kubernetes.io/projected/feb8451d-15b3-4ed2-815e-8aa5c1d8bc7f-kube-api-access-l44dc\") pod \"feb8451d-15b3-4ed2-815e-8aa5c1d8bc7f\" (UID: \"feb8451d-15b3-4ed2-815e-8aa5c1d8bc7f\") " Mar 11 12:36:04 crc kubenswrapper[4816]: I0311 12:36:04.212845 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/feb8451d-15b3-4ed2-815e-8aa5c1d8bc7f-kube-api-access-l44dc" (OuterVolumeSpecName: "kube-api-access-l44dc") pod "feb8451d-15b3-4ed2-815e-8aa5c1d8bc7f" (UID: "feb8451d-15b3-4ed2-815e-8aa5c1d8bc7f"). InnerVolumeSpecName "kube-api-access-l44dc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:36:04 crc kubenswrapper[4816]: I0311 12:36:04.284713 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l44dc\" (UniqueName: \"kubernetes.io/projected/feb8451d-15b3-4ed2-815e-8aa5c1d8bc7f-kube-api-access-l44dc\") on node \"crc\" DevicePath \"\"" Mar 11 12:36:04 crc kubenswrapper[4816]: I0311 12:36:04.749927 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553876-55nmf" event={"ID":"feb8451d-15b3-4ed2-815e-8aa5c1d8bc7f","Type":"ContainerDied","Data":"4e35de0190418478e45846aa2c23f6f00d64b785f24078cec25aba7b5c721f35"} Mar 11 12:36:04 crc kubenswrapper[4816]: I0311 12:36:04.749991 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e35de0190418478e45846aa2c23f6f00d64b785f24078cec25aba7b5c721f35" Mar 11 12:36:04 crc kubenswrapper[4816]: I0311 12:36:04.750137 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553876-55nmf" Mar 11 12:36:05 crc kubenswrapper[4816]: I0311 12:36:05.098819 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553870-5ndrc"] Mar 11 12:36:05 crc kubenswrapper[4816]: I0311 12:36:05.107602 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553870-5ndrc"] Mar 11 12:36:06 crc kubenswrapper[4816]: I0311 12:36:06.143216 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2920f68a-c5bb-474c-929b-09ced109bcc0" path="/var/lib/kubelet/pods/2920f68a-c5bb-474c-929b-09ced109bcc0/volumes" Mar 11 12:36:07 crc kubenswrapper[4816]: I0311 12:36:07.875607 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hrg8t" Mar 11 12:36:07 crc kubenswrapper[4816]: I0311 12:36:07.875826 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-hrg8t" Mar 11 12:36:07 crc kubenswrapper[4816]: I0311 12:36:07.956061 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hrg8t" Mar 11 12:36:08 crc kubenswrapper[4816]: I0311 12:36:08.846730 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hrg8t" Mar 11 12:36:08 crc kubenswrapper[4816]: I0311 12:36:08.905735 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hrg8t"] Mar 11 12:36:09 crc kubenswrapper[4816]: I0311 12:36:09.515852 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:36:09 crc kubenswrapper[4816]: I0311 12:36:09.516417 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:36:10 crc kubenswrapper[4816]: I0311 12:36:10.811371 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hrg8t" podUID="16cec869-9798-4a51-b950-59a57dfa3c37" containerName="registry-server" containerID="cri-o://16d24ba0c5d748973ba12af27011a40d6d8c1a614a1d7b1d91fd2e2b2b62d5aa" gracePeriod=2 Mar 11 12:36:11 crc kubenswrapper[4816]: I0311 12:36:11.303898 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hrg8t" Mar 11 12:36:11 crc kubenswrapper[4816]: I0311 12:36:11.419651 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16cec869-9798-4a51-b950-59a57dfa3c37-utilities\") pod \"16cec869-9798-4a51-b950-59a57dfa3c37\" (UID: \"16cec869-9798-4a51-b950-59a57dfa3c37\") " Mar 11 12:36:11 crc kubenswrapper[4816]: I0311 12:36:11.419921 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkhds\" (UniqueName: \"kubernetes.io/projected/16cec869-9798-4a51-b950-59a57dfa3c37-kube-api-access-kkhds\") pod \"16cec869-9798-4a51-b950-59a57dfa3c37\" (UID: \"16cec869-9798-4a51-b950-59a57dfa3c37\") " Mar 11 12:36:11 crc kubenswrapper[4816]: I0311 12:36:11.419975 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16cec869-9798-4a51-b950-59a57dfa3c37-catalog-content\") pod \"16cec869-9798-4a51-b950-59a57dfa3c37\" (UID: \"16cec869-9798-4a51-b950-59a57dfa3c37\") " Mar 11 12:36:11 crc kubenswrapper[4816]: I0311 12:36:11.420612 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16cec869-9798-4a51-b950-59a57dfa3c37-utilities" (OuterVolumeSpecName: "utilities") pod "16cec869-9798-4a51-b950-59a57dfa3c37" (UID: "16cec869-9798-4a51-b950-59a57dfa3c37"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:36:11 crc kubenswrapper[4816]: I0311 12:36:11.427670 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16cec869-9798-4a51-b950-59a57dfa3c37-kube-api-access-kkhds" (OuterVolumeSpecName: "kube-api-access-kkhds") pod "16cec869-9798-4a51-b950-59a57dfa3c37" (UID: "16cec869-9798-4a51-b950-59a57dfa3c37"). InnerVolumeSpecName "kube-api-access-kkhds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:36:11 crc kubenswrapper[4816]: I0311 12:36:11.461374 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16cec869-9798-4a51-b950-59a57dfa3c37-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "16cec869-9798-4a51-b950-59a57dfa3c37" (UID: "16cec869-9798-4a51-b950-59a57dfa3c37"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:36:11 crc kubenswrapper[4816]: I0311 12:36:11.521862 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkhds\" (UniqueName: \"kubernetes.io/projected/16cec869-9798-4a51-b950-59a57dfa3c37-kube-api-access-kkhds\") on node \"crc\" DevicePath \"\"" Mar 11 12:36:11 crc kubenswrapper[4816]: I0311 12:36:11.521906 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16cec869-9798-4a51-b950-59a57dfa3c37-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 12:36:11 crc kubenswrapper[4816]: I0311 12:36:11.521919 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16cec869-9798-4a51-b950-59a57dfa3c37-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 12:36:11 crc kubenswrapper[4816]: I0311 12:36:11.822504 4816 generic.go:334] "Generic (PLEG): container finished" podID="16cec869-9798-4a51-b950-59a57dfa3c37" containerID="16d24ba0c5d748973ba12af27011a40d6d8c1a614a1d7b1d91fd2e2b2b62d5aa" exitCode=0 Mar 11 12:36:11 crc kubenswrapper[4816]: I0311 12:36:11.822578 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hrg8t" event={"ID":"16cec869-9798-4a51-b950-59a57dfa3c37","Type":"ContainerDied","Data":"16d24ba0c5d748973ba12af27011a40d6d8c1a614a1d7b1d91fd2e2b2b62d5aa"} Mar 11 12:36:11 crc kubenswrapper[4816]: I0311 12:36:11.822606 4816 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hrg8t" Mar 11 12:36:11 crc kubenswrapper[4816]: I0311 12:36:11.822639 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hrg8t" event={"ID":"16cec869-9798-4a51-b950-59a57dfa3c37","Type":"ContainerDied","Data":"a088adcbed3f980d1c239b585dfbe0befdd15b0a6eaae124e57dfe197c46e993"} Mar 11 12:36:11 crc kubenswrapper[4816]: I0311 12:36:11.822677 4816 scope.go:117] "RemoveContainer" containerID="16d24ba0c5d748973ba12af27011a40d6d8c1a614a1d7b1d91fd2e2b2b62d5aa" Mar 11 12:36:11 crc kubenswrapper[4816]: I0311 12:36:11.849676 4816 scope.go:117] "RemoveContainer" containerID="0d48faa8a89c7695f1cc4aa05889c97c7e77ae5de899328361c4bed02fe9144c" Mar 11 12:36:11 crc kubenswrapper[4816]: I0311 12:36:11.883798 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hrg8t"] Mar 11 12:36:11 crc kubenswrapper[4816]: I0311 12:36:11.884486 4816 scope.go:117] "RemoveContainer" containerID="f832c5c127e51cab2092c0ee970fb07c751c81b54b216bfcc2b98bc4378703d0" Mar 11 12:36:11 crc kubenswrapper[4816]: I0311 12:36:11.896175 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hrg8t"] Mar 11 12:36:11 crc kubenswrapper[4816]: I0311 12:36:11.926944 4816 scope.go:117] "RemoveContainer" containerID="16d24ba0c5d748973ba12af27011a40d6d8c1a614a1d7b1d91fd2e2b2b62d5aa" Mar 11 12:36:11 crc kubenswrapper[4816]: E0311 12:36:11.927628 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16d24ba0c5d748973ba12af27011a40d6d8c1a614a1d7b1d91fd2e2b2b62d5aa\": container with ID starting with 16d24ba0c5d748973ba12af27011a40d6d8c1a614a1d7b1d91fd2e2b2b62d5aa not found: ID does not exist" containerID="16d24ba0c5d748973ba12af27011a40d6d8c1a614a1d7b1d91fd2e2b2b62d5aa" Mar 11 12:36:11 crc kubenswrapper[4816]: I0311 12:36:11.927671 4816 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16d24ba0c5d748973ba12af27011a40d6d8c1a614a1d7b1d91fd2e2b2b62d5aa"} err="failed to get container status \"16d24ba0c5d748973ba12af27011a40d6d8c1a614a1d7b1d91fd2e2b2b62d5aa\": rpc error: code = NotFound desc = could not find container \"16d24ba0c5d748973ba12af27011a40d6d8c1a614a1d7b1d91fd2e2b2b62d5aa\": container with ID starting with 16d24ba0c5d748973ba12af27011a40d6d8c1a614a1d7b1d91fd2e2b2b62d5aa not found: ID does not exist" Mar 11 12:36:11 crc kubenswrapper[4816]: I0311 12:36:11.927710 4816 scope.go:117] "RemoveContainer" containerID="0d48faa8a89c7695f1cc4aa05889c97c7e77ae5de899328361c4bed02fe9144c" Mar 11 12:36:11 crc kubenswrapper[4816]: E0311 12:36:11.928277 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d48faa8a89c7695f1cc4aa05889c97c7e77ae5de899328361c4bed02fe9144c\": container with ID starting with 0d48faa8a89c7695f1cc4aa05889c97c7e77ae5de899328361c4bed02fe9144c not found: ID does not exist" containerID="0d48faa8a89c7695f1cc4aa05889c97c7e77ae5de899328361c4bed02fe9144c" Mar 11 12:36:11 crc kubenswrapper[4816]: I0311 12:36:11.928352 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d48faa8a89c7695f1cc4aa05889c97c7e77ae5de899328361c4bed02fe9144c"} err="failed to get container status \"0d48faa8a89c7695f1cc4aa05889c97c7e77ae5de899328361c4bed02fe9144c\": rpc error: code = NotFound desc = could not find container \"0d48faa8a89c7695f1cc4aa05889c97c7e77ae5de899328361c4bed02fe9144c\": container with ID starting with 0d48faa8a89c7695f1cc4aa05889c97c7e77ae5de899328361c4bed02fe9144c not found: ID does not exist" Mar 11 12:36:11 crc kubenswrapper[4816]: I0311 12:36:11.928399 4816 scope.go:117] "RemoveContainer" containerID="f832c5c127e51cab2092c0ee970fb07c751c81b54b216bfcc2b98bc4378703d0" Mar 11 12:36:11 crc kubenswrapper[4816]: E0311 
12:36:11.929052 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f832c5c127e51cab2092c0ee970fb07c751c81b54b216bfcc2b98bc4378703d0\": container with ID starting with f832c5c127e51cab2092c0ee970fb07c751c81b54b216bfcc2b98bc4378703d0 not found: ID does not exist" containerID="f832c5c127e51cab2092c0ee970fb07c751c81b54b216bfcc2b98bc4378703d0" Mar 11 12:36:11 crc kubenswrapper[4816]: I0311 12:36:11.929087 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f832c5c127e51cab2092c0ee970fb07c751c81b54b216bfcc2b98bc4378703d0"} err="failed to get container status \"f832c5c127e51cab2092c0ee970fb07c751c81b54b216bfcc2b98bc4378703d0\": rpc error: code = NotFound desc = could not find container \"f832c5c127e51cab2092c0ee970fb07c751c81b54b216bfcc2b98bc4378703d0\": container with ID starting with f832c5c127e51cab2092c0ee970fb07c751c81b54b216bfcc2b98bc4378703d0 not found: ID does not exist" Mar 11 12:36:12 crc kubenswrapper[4816]: I0311 12:36:12.145309 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16cec869-9798-4a51-b950-59a57dfa3c37" path="/var/lib/kubelet/pods/16cec869-9798-4a51-b950-59a57dfa3c37/volumes" Mar 11 12:36:39 crc kubenswrapper[4816]: I0311 12:36:39.515902 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:36:39 crc kubenswrapper[4816]: I0311 12:36:39.516832 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 11 12:36:39 crc kubenswrapper[4816]: I0311 12:36:39.516911 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:36:39 crc kubenswrapper[4816]: I0311 12:36:39.517720 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad"} pod="openshift-machine-config-operator/machine-config-daemon-b4v82" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 12:36:39 crc kubenswrapper[4816]: I0311 12:36:39.517821 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" containerID="cri-o://ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" gracePeriod=600 Mar 11 12:36:39 crc kubenswrapper[4816]: E0311 12:36:39.641804 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:36:40 crc kubenswrapper[4816]: I0311 12:36:40.073439 4816 generic.go:334] "Generic (PLEG): container finished" podID="7fdff21c-644f-4443-a268-f98c91ea120a" containerID="ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" exitCode=0 Mar 11 12:36:40 crc kubenswrapper[4816]: I0311 12:36:40.073518 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" 
event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerDied","Data":"ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad"} Mar 11 12:36:40 crc kubenswrapper[4816]: I0311 12:36:40.073590 4816 scope.go:117] "RemoveContainer" containerID="106e2d5f907b914dfe49698bbb91ece73a062b224d2ba46fe31a9e998555b6c9" Mar 11 12:36:40 crc kubenswrapper[4816]: I0311 12:36:40.074775 4816 scope.go:117] "RemoveContainer" containerID="ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" Mar 11 12:36:40 crc kubenswrapper[4816]: E0311 12:36:40.075158 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:36:47 crc kubenswrapper[4816]: I0311 12:36:47.685294 4816 scope.go:117] "RemoveContainer" containerID="921a40cedc2e53aafc505227f80b7caf98ac145dd8c9d234ce2c285d7eb65e19" Mar 11 12:36:54 crc kubenswrapper[4816]: I0311 12:36:54.137122 4816 scope.go:117] "RemoveContainer" containerID="ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" Mar 11 12:36:54 crc kubenswrapper[4816]: E0311 12:36:54.138189 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:37:07 crc kubenswrapper[4816]: I0311 12:37:07.130841 4816 scope.go:117] "RemoveContainer" 
containerID="ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" Mar 11 12:37:07 crc kubenswrapper[4816]: E0311 12:37:07.131996 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:37:21 crc kubenswrapper[4816]: I0311 12:37:21.130455 4816 scope.go:117] "RemoveContainer" containerID="ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" Mar 11 12:37:21 crc kubenswrapper[4816]: E0311 12:37:21.131557 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:37:32 crc kubenswrapper[4816]: I0311 12:37:32.130205 4816 scope.go:117] "RemoveContainer" containerID="ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" Mar 11 12:37:32 crc kubenswrapper[4816]: E0311 12:37:32.131172 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:37:45 crc kubenswrapper[4816]: I0311 12:37:45.130572 4816 scope.go:117] 
"RemoveContainer" containerID="ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" Mar 11 12:37:45 crc kubenswrapper[4816]: E0311 12:37:45.131618 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:37:58 crc kubenswrapper[4816]: I0311 12:37:58.130960 4816 scope.go:117] "RemoveContainer" containerID="ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" Mar 11 12:37:58 crc kubenswrapper[4816]: E0311 12:37:58.131820 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:38:00 crc kubenswrapper[4816]: I0311 12:38:00.157481 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553878-cp29z"] Mar 11 12:38:00 crc kubenswrapper[4816]: E0311 12:38:00.158291 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16cec869-9798-4a51-b950-59a57dfa3c37" containerName="registry-server" Mar 11 12:38:00 crc kubenswrapper[4816]: I0311 12:38:00.158311 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="16cec869-9798-4a51-b950-59a57dfa3c37" containerName="registry-server" Mar 11 12:38:00 crc kubenswrapper[4816]: E0311 12:38:00.158336 4816 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="feb8451d-15b3-4ed2-815e-8aa5c1d8bc7f" containerName="oc" Mar 11 12:38:00 crc kubenswrapper[4816]: I0311 12:38:00.158345 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="feb8451d-15b3-4ed2-815e-8aa5c1d8bc7f" containerName="oc" Mar 11 12:38:00 crc kubenswrapper[4816]: E0311 12:38:00.158373 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16cec869-9798-4a51-b950-59a57dfa3c37" containerName="extract-utilities" Mar 11 12:38:00 crc kubenswrapper[4816]: I0311 12:38:00.158383 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="16cec869-9798-4a51-b950-59a57dfa3c37" containerName="extract-utilities" Mar 11 12:38:00 crc kubenswrapper[4816]: E0311 12:38:00.158405 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16cec869-9798-4a51-b950-59a57dfa3c37" containerName="extract-content" Mar 11 12:38:00 crc kubenswrapper[4816]: I0311 12:38:00.158415 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="16cec869-9798-4a51-b950-59a57dfa3c37" containerName="extract-content" Mar 11 12:38:00 crc kubenswrapper[4816]: I0311 12:38:00.158615 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="16cec869-9798-4a51-b950-59a57dfa3c37" containerName="registry-server" Mar 11 12:38:00 crc kubenswrapper[4816]: I0311 12:38:00.158631 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="feb8451d-15b3-4ed2-815e-8aa5c1d8bc7f" containerName="oc" Mar 11 12:38:00 crc kubenswrapper[4816]: I0311 12:38:00.159229 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553878-cp29z" Mar 11 12:38:00 crc kubenswrapper[4816]: I0311 12:38:00.164304 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 12:38:00 crc kubenswrapper[4816]: I0311 12:38:00.165031 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553878-cp29z"] Mar 11 12:38:00 crc kubenswrapper[4816]: I0311 12:38:00.166897 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 12:38:00 crc kubenswrapper[4816]: I0311 12:38:00.167674 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 12:38:00 crc kubenswrapper[4816]: I0311 12:38:00.341659 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h48lg\" (UniqueName: \"kubernetes.io/projected/080a6096-67c3-41de-97d6-29a3f80c027e-kube-api-access-h48lg\") pod \"auto-csr-approver-29553878-cp29z\" (UID: \"080a6096-67c3-41de-97d6-29a3f80c027e\") " pod="openshift-infra/auto-csr-approver-29553878-cp29z" Mar 11 12:38:00 crc kubenswrapper[4816]: I0311 12:38:00.443117 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h48lg\" (UniqueName: \"kubernetes.io/projected/080a6096-67c3-41de-97d6-29a3f80c027e-kube-api-access-h48lg\") pod \"auto-csr-approver-29553878-cp29z\" (UID: \"080a6096-67c3-41de-97d6-29a3f80c027e\") " pod="openshift-infra/auto-csr-approver-29553878-cp29z" Mar 11 12:38:00 crc kubenswrapper[4816]: I0311 12:38:00.465144 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h48lg\" (UniqueName: \"kubernetes.io/projected/080a6096-67c3-41de-97d6-29a3f80c027e-kube-api-access-h48lg\") pod \"auto-csr-approver-29553878-cp29z\" (UID: \"080a6096-67c3-41de-97d6-29a3f80c027e\") " 
pod="openshift-infra/auto-csr-approver-29553878-cp29z" Mar 11 12:38:00 crc kubenswrapper[4816]: I0311 12:38:00.487403 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553878-cp29z" Mar 11 12:38:00 crc kubenswrapper[4816]: I0311 12:38:00.919524 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553878-cp29z"] Mar 11 12:38:01 crc kubenswrapper[4816]: I0311 12:38:01.761833 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553878-cp29z" event={"ID":"080a6096-67c3-41de-97d6-29a3f80c027e","Type":"ContainerStarted","Data":"9b194b24caa1f4ba114ffb2ee3ef4ae162c9b5c6400e727b6ed6cfc030d0acea"} Mar 11 12:38:02 crc kubenswrapper[4816]: I0311 12:38:02.771140 4816 generic.go:334] "Generic (PLEG): container finished" podID="080a6096-67c3-41de-97d6-29a3f80c027e" containerID="1e58ff6aa0fcfd993b576ab14b5750187bf29f39db71f93177979cba96a1d350" exitCode=0 Mar 11 12:38:02 crc kubenswrapper[4816]: I0311 12:38:02.771353 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553878-cp29z" event={"ID":"080a6096-67c3-41de-97d6-29a3f80c027e","Type":"ContainerDied","Data":"1e58ff6aa0fcfd993b576ab14b5750187bf29f39db71f93177979cba96a1d350"} Mar 11 12:38:04 crc kubenswrapper[4816]: I0311 12:38:04.087878 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553878-cp29z" Mar 11 12:38:04 crc kubenswrapper[4816]: I0311 12:38:04.204100 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h48lg\" (UniqueName: \"kubernetes.io/projected/080a6096-67c3-41de-97d6-29a3f80c027e-kube-api-access-h48lg\") pod \"080a6096-67c3-41de-97d6-29a3f80c027e\" (UID: \"080a6096-67c3-41de-97d6-29a3f80c027e\") " Mar 11 12:38:04 crc kubenswrapper[4816]: I0311 12:38:04.211375 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/080a6096-67c3-41de-97d6-29a3f80c027e-kube-api-access-h48lg" (OuterVolumeSpecName: "kube-api-access-h48lg") pod "080a6096-67c3-41de-97d6-29a3f80c027e" (UID: "080a6096-67c3-41de-97d6-29a3f80c027e"). InnerVolumeSpecName "kube-api-access-h48lg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:38:04 crc kubenswrapper[4816]: I0311 12:38:04.306244 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h48lg\" (UniqueName: \"kubernetes.io/projected/080a6096-67c3-41de-97d6-29a3f80c027e-kube-api-access-h48lg\") on node \"crc\" DevicePath \"\"" Mar 11 12:38:04 crc kubenswrapper[4816]: I0311 12:38:04.792716 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553878-cp29z" event={"ID":"080a6096-67c3-41de-97d6-29a3f80c027e","Type":"ContainerDied","Data":"9b194b24caa1f4ba114ffb2ee3ef4ae162c9b5c6400e727b6ed6cfc030d0acea"} Mar 11 12:38:04 crc kubenswrapper[4816]: I0311 12:38:04.792763 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b194b24caa1f4ba114ffb2ee3ef4ae162c9b5c6400e727b6ed6cfc030d0acea" Mar 11 12:38:04 crc kubenswrapper[4816]: I0311 12:38:04.793522 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553878-cp29z" Mar 11 12:38:05 crc kubenswrapper[4816]: I0311 12:38:05.190477 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553872-r7v8g"] Mar 11 12:38:05 crc kubenswrapper[4816]: I0311 12:38:05.196142 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553872-r7v8g"] Mar 11 12:38:06 crc kubenswrapper[4816]: I0311 12:38:06.140548 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ea5145c-0d08-4c85-984a-84c7e0820999" path="/var/lib/kubelet/pods/9ea5145c-0d08-4c85-984a-84c7e0820999/volumes" Mar 11 12:38:13 crc kubenswrapper[4816]: I0311 12:38:13.130699 4816 scope.go:117] "RemoveContainer" containerID="ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" Mar 11 12:38:13 crc kubenswrapper[4816]: E0311 12:38:13.131443 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:38:25 crc kubenswrapper[4816]: I0311 12:38:25.131122 4816 scope.go:117] "RemoveContainer" containerID="ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" Mar 11 12:38:25 crc kubenswrapper[4816]: E0311 12:38:25.132567 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" 
podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:38:40 crc kubenswrapper[4816]: I0311 12:38:40.131638 4816 scope.go:117] "RemoveContainer" containerID="ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" Mar 11 12:38:40 crc kubenswrapper[4816]: E0311 12:38:40.133118 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:38:47 crc kubenswrapper[4816]: I0311 12:38:47.795768 4816 scope.go:117] "RemoveContainer" containerID="2595726d36e0a1d13282e231fa75cc95c6cd459575385b61b5632d40de6eac9f" Mar 11 12:38:53 crc kubenswrapper[4816]: I0311 12:38:53.131684 4816 scope.go:117] "RemoveContainer" containerID="ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" Mar 11 12:38:53 crc kubenswrapper[4816]: E0311 12:38:53.132570 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:39:07 crc kubenswrapper[4816]: I0311 12:39:07.131044 4816 scope.go:117] "RemoveContainer" containerID="ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" Mar 11 12:39:07 crc kubenswrapper[4816]: E0311 12:39:07.133490 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:39:21 crc kubenswrapper[4816]: I0311 12:39:21.132065 4816 scope.go:117] "RemoveContainer" containerID="ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" Mar 11 12:39:21 crc kubenswrapper[4816]: E0311 12:39:21.133445 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:39:36 crc kubenswrapper[4816]: I0311 12:39:36.131542 4816 scope.go:117] "RemoveContainer" containerID="ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" Mar 11 12:39:36 crc kubenswrapper[4816]: E0311 12:39:36.133237 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:39:51 crc kubenswrapper[4816]: I0311 12:39:51.131406 4816 scope.go:117] "RemoveContainer" containerID="ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" Mar 11 12:39:51 crc kubenswrapper[4816]: E0311 12:39:51.132743 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:40:00 crc kubenswrapper[4816]: I0311 12:40:00.144469 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553880-9ks6n"] Mar 11 12:40:00 crc kubenswrapper[4816]: E0311 12:40:00.145502 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="080a6096-67c3-41de-97d6-29a3f80c027e" containerName="oc" Mar 11 12:40:00 crc kubenswrapper[4816]: I0311 12:40:00.145523 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="080a6096-67c3-41de-97d6-29a3f80c027e" containerName="oc" Mar 11 12:40:00 crc kubenswrapper[4816]: I0311 12:40:00.145672 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="080a6096-67c3-41de-97d6-29a3f80c027e" containerName="oc" Mar 11 12:40:00 crc kubenswrapper[4816]: I0311 12:40:00.146291 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553880-9ks6n" Mar 11 12:40:00 crc kubenswrapper[4816]: I0311 12:40:00.151477 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 12:40:00 crc kubenswrapper[4816]: I0311 12:40:00.151762 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 12:40:00 crc kubenswrapper[4816]: I0311 12:40:00.151967 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 12:40:00 crc kubenswrapper[4816]: I0311 12:40:00.162647 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553880-9ks6n"] Mar 11 12:40:00 crc kubenswrapper[4816]: I0311 12:40:00.223714 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cgdd\" (UniqueName: \"kubernetes.io/projected/e47f911d-5bf4-4923-a6ac-95e98217fd25-kube-api-access-9cgdd\") pod \"auto-csr-approver-29553880-9ks6n\" (UID: \"e47f911d-5bf4-4923-a6ac-95e98217fd25\") " pod="openshift-infra/auto-csr-approver-29553880-9ks6n" Mar 11 12:40:00 crc kubenswrapper[4816]: I0311 12:40:00.324862 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cgdd\" (UniqueName: \"kubernetes.io/projected/e47f911d-5bf4-4923-a6ac-95e98217fd25-kube-api-access-9cgdd\") pod \"auto-csr-approver-29553880-9ks6n\" (UID: \"e47f911d-5bf4-4923-a6ac-95e98217fd25\") " pod="openshift-infra/auto-csr-approver-29553880-9ks6n" Mar 11 12:40:00 crc kubenswrapper[4816]: I0311 12:40:00.344347 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cgdd\" (UniqueName: \"kubernetes.io/projected/e47f911d-5bf4-4923-a6ac-95e98217fd25-kube-api-access-9cgdd\") pod \"auto-csr-approver-29553880-9ks6n\" (UID: \"e47f911d-5bf4-4923-a6ac-95e98217fd25\") " 
pod="openshift-infra/auto-csr-approver-29553880-9ks6n" Mar 11 12:40:00 crc kubenswrapper[4816]: I0311 12:40:00.469047 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553880-9ks6n" Mar 11 12:40:00 crc kubenswrapper[4816]: I0311 12:40:00.918518 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553880-9ks6n"] Mar 11 12:40:00 crc kubenswrapper[4816]: I0311 12:40:00.925865 4816 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 12:40:00 crc kubenswrapper[4816]: I0311 12:40:00.983961 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553880-9ks6n" event={"ID":"e47f911d-5bf4-4923-a6ac-95e98217fd25","Type":"ContainerStarted","Data":"2be0847875d3b20f4ebfa981d256af6675af1bf3c93c41747bb16a77a188363f"} Mar 11 12:40:03 crc kubenswrapper[4816]: I0311 12:40:03.000894 4816 generic.go:334] "Generic (PLEG): container finished" podID="e47f911d-5bf4-4923-a6ac-95e98217fd25" containerID="f2c8244acc6c0aed95c31a23f8089006b34d5b7db0dcdad32b1e6365dd4fd124" exitCode=0 Mar 11 12:40:03 crc kubenswrapper[4816]: I0311 12:40:03.000964 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553880-9ks6n" event={"ID":"e47f911d-5bf4-4923-a6ac-95e98217fd25","Type":"ContainerDied","Data":"f2c8244acc6c0aed95c31a23f8089006b34d5b7db0dcdad32b1e6365dd4fd124"} Mar 11 12:40:04 crc kubenswrapper[4816]: I0311 12:40:04.306115 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553880-9ks6n" Mar 11 12:40:04 crc kubenswrapper[4816]: I0311 12:40:04.393218 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cgdd\" (UniqueName: \"kubernetes.io/projected/e47f911d-5bf4-4923-a6ac-95e98217fd25-kube-api-access-9cgdd\") pod \"e47f911d-5bf4-4923-a6ac-95e98217fd25\" (UID: \"e47f911d-5bf4-4923-a6ac-95e98217fd25\") " Mar 11 12:40:04 crc kubenswrapper[4816]: I0311 12:40:04.402585 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e47f911d-5bf4-4923-a6ac-95e98217fd25-kube-api-access-9cgdd" (OuterVolumeSpecName: "kube-api-access-9cgdd") pod "e47f911d-5bf4-4923-a6ac-95e98217fd25" (UID: "e47f911d-5bf4-4923-a6ac-95e98217fd25"). InnerVolumeSpecName "kube-api-access-9cgdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:40:04 crc kubenswrapper[4816]: I0311 12:40:04.495200 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cgdd\" (UniqueName: \"kubernetes.io/projected/e47f911d-5bf4-4923-a6ac-95e98217fd25-kube-api-access-9cgdd\") on node \"crc\" DevicePath \"\"" Mar 11 12:40:05 crc kubenswrapper[4816]: I0311 12:40:05.017266 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553880-9ks6n" event={"ID":"e47f911d-5bf4-4923-a6ac-95e98217fd25","Type":"ContainerDied","Data":"2be0847875d3b20f4ebfa981d256af6675af1bf3c93c41747bb16a77a188363f"} Mar 11 12:40:05 crc kubenswrapper[4816]: I0311 12:40:05.017317 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2be0847875d3b20f4ebfa981d256af6675af1bf3c93c41747bb16a77a188363f" Mar 11 12:40:05 crc kubenswrapper[4816]: I0311 12:40:05.017359 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553880-9ks6n" Mar 11 12:40:05 crc kubenswrapper[4816]: I0311 12:40:05.131074 4816 scope.go:117] "RemoveContainer" containerID="ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" Mar 11 12:40:05 crc kubenswrapper[4816]: E0311 12:40:05.131384 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:40:05 crc kubenswrapper[4816]: I0311 12:40:05.381565 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553874-gxdm8"] Mar 11 12:40:05 crc kubenswrapper[4816]: I0311 12:40:05.387620 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553874-gxdm8"] Mar 11 12:40:06 crc kubenswrapper[4816]: I0311 12:40:06.140334 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37fd63f9-b7c1-4900-a6c1-269f771958b1" path="/var/lib/kubelet/pods/37fd63f9-b7c1-4900-a6c1-269f771958b1/volumes" Mar 11 12:40:18 crc kubenswrapper[4816]: I0311 12:40:18.130372 4816 scope.go:117] "RemoveContainer" containerID="ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" Mar 11 12:40:18 crc kubenswrapper[4816]: E0311 12:40:18.131283 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" 
podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:40:29 crc kubenswrapper[4816]: I0311 12:40:29.131156 4816 scope.go:117] "RemoveContainer" containerID="ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" Mar 11 12:40:29 crc kubenswrapper[4816]: E0311 12:40:29.132292 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:40:42 crc kubenswrapper[4816]: I0311 12:40:42.131800 4816 scope.go:117] "RemoveContainer" containerID="ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" Mar 11 12:40:42 crc kubenswrapper[4816]: E0311 12:40:42.133195 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:40:47 crc kubenswrapper[4816]: I0311 12:40:47.914731 4816 scope.go:117] "RemoveContainer" containerID="372011e32b19c2edb6c819b3e68c03b814d1f5c3bd84d6d127863a08cf21f878" Mar 11 12:40:57 crc kubenswrapper[4816]: I0311 12:40:57.131999 4816 scope.go:117] "RemoveContainer" containerID="ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" Mar 11 12:40:57 crc kubenswrapper[4816]: E0311 12:40:57.133299 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:41:08 crc kubenswrapper[4816]: I0311 12:41:08.130980 4816 scope.go:117] "RemoveContainer" containerID="ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" Mar 11 12:41:08 crc kubenswrapper[4816]: E0311 12:41:08.132380 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:41:23 crc kubenswrapper[4816]: I0311 12:41:23.130282 4816 scope.go:117] "RemoveContainer" containerID="ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" Mar 11 12:41:23 crc kubenswrapper[4816]: E0311 12:41:23.131460 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:41:34 crc kubenswrapper[4816]: I0311 12:41:34.137076 4816 scope.go:117] "RemoveContainer" containerID="ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" Mar 11 12:41:34 crc kubenswrapper[4816]: E0311 12:41:34.138147 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:41:48 crc kubenswrapper[4816]: I0311 12:41:48.130551 4816 scope.go:117] "RemoveContainer" containerID="ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" Mar 11 12:41:48 crc kubenswrapper[4816]: I0311 12:41:48.964072 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerStarted","Data":"e94d54c6dd2b7a4e577e03c8b08cf5eb1a8a362732b731a0d82ddf5cdc9d6211"} Mar 11 12:42:00 crc kubenswrapper[4816]: I0311 12:42:00.171021 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553882-ggqkk"] Mar 11 12:42:00 crc kubenswrapper[4816]: E0311 12:42:00.173502 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e47f911d-5bf4-4923-a6ac-95e98217fd25" containerName="oc" Mar 11 12:42:00 crc kubenswrapper[4816]: I0311 12:42:00.173614 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="e47f911d-5bf4-4923-a6ac-95e98217fd25" containerName="oc" Mar 11 12:42:00 crc kubenswrapper[4816]: I0311 12:42:00.173899 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="e47f911d-5bf4-4923-a6ac-95e98217fd25" containerName="oc" Mar 11 12:42:00 crc kubenswrapper[4816]: I0311 12:42:00.174520 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553882-ggqkk" Mar 11 12:42:00 crc kubenswrapper[4816]: I0311 12:42:00.178303 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 12:42:00 crc kubenswrapper[4816]: I0311 12:42:00.178402 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 12:42:00 crc kubenswrapper[4816]: I0311 12:42:00.178846 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 12:42:00 crc kubenswrapper[4816]: I0311 12:42:00.189920 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553882-ggqkk"] Mar 11 12:42:00 crc kubenswrapper[4816]: I0311 12:42:00.335954 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp7np\" (UniqueName: \"kubernetes.io/projected/5a88e2af-5e7d-4491-a32f-75a670aed689-kube-api-access-fp7np\") pod \"auto-csr-approver-29553882-ggqkk\" (UID: \"5a88e2af-5e7d-4491-a32f-75a670aed689\") " pod="openshift-infra/auto-csr-approver-29553882-ggqkk" Mar 11 12:42:00 crc kubenswrapper[4816]: I0311 12:42:00.437873 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp7np\" (UniqueName: \"kubernetes.io/projected/5a88e2af-5e7d-4491-a32f-75a670aed689-kube-api-access-fp7np\") pod \"auto-csr-approver-29553882-ggqkk\" (UID: \"5a88e2af-5e7d-4491-a32f-75a670aed689\") " pod="openshift-infra/auto-csr-approver-29553882-ggqkk" Mar 11 12:42:00 crc kubenswrapper[4816]: I0311 12:42:00.463440 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp7np\" (UniqueName: \"kubernetes.io/projected/5a88e2af-5e7d-4491-a32f-75a670aed689-kube-api-access-fp7np\") pod \"auto-csr-approver-29553882-ggqkk\" (UID: \"5a88e2af-5e7d-4491-a32f-75a670aed689\") " 
pod="openshift-infra/auto-csr-approver-29553882-ggqkk" Mar 11 12:42:00 crc kubenswrapper[4816]: I0311 12:42:00.498507 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553882-ggqkk" Mar 11 12:42:00 crc kubenswrapper[4816]: I0311 12:42:00.971177 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553882-ggqkk"] Mar 11 12:42:01 crc kubenswrapper[4816]: I0311 12:42:01.076143 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553882-ggqkk" event={"ID":"5a88e2af-5e7d-4491-a32f-75a670aed689","Type":"ContainerStarted","Data":"650e6171e13f02dde6c15ca8c999cea145dc5871686afe0ccb3bed8c4e70a0a3"} Mar 11 12:42:03 crc kubenswrapper[4816]: I0311 12:42:03.093189 4816 generic.go:334] "Generic (PLEG): container finished" podID="5a88e2af-5e7d-4491-a32f-75a670aed689" containerID="c588ba0a9276d85151be0b86106d7b0f7a77bf5bc78e6ea0213f1a19b8ad671f" exitCode=0 Mar 11 12:42:03 crc kubenswrapper[4816]: I0311 12:42:03.093292 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553882-ggqkk" event={"ID":"5a88e2af-5e7d-4491-a32f-75a670aed689","Type":"ContainerDied","Data":"c588ba0a9276d85151be0b86106d7b0f7a77bf5bc78e6ea0213f1a19b8ad671f"} Mar 11 12:42:04 crc kubenswrapper[4816]: I0311 12:42:04.425310 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553882-ggqkk" Mar 11 12:42:04 crc kubenswrapper[4816]: I0311 12:42:04.543415 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fp7np\" (UniqueName: \"kubernetes.io/projected/5a88e2af-5e7d-4491-a32f-75a670aed689-kube-api-access-fp7np\") pod \"5a88e2af-5e7d-4491-a32f-75a670aed689\" (UID: \"5a88e2af-5e7d-4491-a32f-75a670aed689\") " Mar 11 12:42:04 crc kubenswrapper[4816]: I0311 12:42:04.549639 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a88e2af-5e7d-4491-a32f-75a670aed689-kube-api-access-fp7np" (OuterVolumeSpecName: "kube-api-access-fp7np") pod "5a88e2af-5e7d-4491-a32f-75a670aed689" (UID: "5a88e2af-5e7d-4491-a32f-75a670aed689"). InnerVolumeSpecName "kube-api-access-fp7np". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:42:04 crc kubenswrapper[4816]: I0311 12:42:04.645112 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fp7np\" (UniqueName: \"kubernetes.io/projected/5a88e2af-5e7d-4491-a32f-75a670aed689-kube-api-access-fp7np\") on node \"crc\" DevicePath \"\"" Mar 11 12:42:05 crc kubenswrapper[4816]: I0311 12:42:05.111469 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553882-ggqkk" event={"ID":"5a88e2af-5e7d-4491-a32f-75a670aed689","Type":"ContainerDied","Data":"650e6171e13f02dde6c15ca8c999cea145dc5871686afe0ccb3bed8c4e70a0a3"} Mar 11 12:42:05 crc kubenswrapper[4816]: I0311 12:42:05.111524 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="650e6171e13f02dde6c15ca8c999cea145dc5871686afe0ccb3bed8c4e70a0a3" Mar 11 12:42:05 crc kubenswrapper[4816]: I0311 12:42:05.111558 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553882-ggqkk" Mar 11 12:42:05 crc kubenswrapper[4816]: I0311 12:42:05.507344 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553876-55nmf"] Mar 11 12:42:05 crc kubenswrapper[4816]: I0311 12:42:05.515062 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553876-55nmf"] Mar 11 12:42:06 crc kubenswrapper[4816]: I0311 12:42:06.144870 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="feb8451d-15b3-4ed2-815e-8aa5c1d8bc7f" path="/var/lib/kubelet/pods/feb8451d-15b3-4ed2-815e-8aa5c1d8bc7f/volumes" Mar 11 12:42:48 crc kubenswrapper[4816]: I0311 12:42:48.057109 4816 scope.go:117] "RemoveContainer" containerID="fcc96a304b12ffc267c89c3d4b3f056b4d2e01821a0ebbb16c1bdcf350072143" Mar 11 12:43:04 crc kubenswrapper[4816]: I0311 12:43:04.511701 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9wx28"] Mar 11 12:43:04 crc kubenswrapper[4816]: E0311 12:43:04.519668 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a88e2af-5e7d-4491-a32f-75a670aed689" containerName="oc" Mar 11 12:43:04 crc kubenswrapper[4816]: I0311 12:43:04.519860 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a88e2af-5e7d-4491-a32f-75a670aed689" containerName="oc" Mar 11 12:43:04 crc kubenswrapper[4816]: I0311 12:43:04.520841 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a88e2af-5e7d-4491-a32f-75a670aed689" containerName="oc" Mar 11 12:43:04 crc kubenswrapper[4816]: I0311 12:43:04.523665 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9wx28" Mar 11 12:43:04 crc kubenswrapper[4816]: I0311 12:43:04.530834 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9wx28"] Mar 11 12:43:04 crc kubenswrapper[4816]: I0311 12:43:04.620443 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63f54d4d-3bee-42aa-82f6-3149d37d9358-catalog-content\") pod \"redhat-operators-9wx28\" (UID: \"63f54d4d-3bee-42aa-82f6-3149d37d9358\") " pod="openshift-marketplace/redhat-operators-9wx28" Mar 11 12:43:04 crc kubenswrapper[4816]: I0311 12:43:04.620485 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63f54d4d-3bee-42aa-82f6-3149d37d9358-utilities\") pod \"redhat-operators-9wx28\" (UID: \"63f54d4d-3bee-42aa-82f6-3149d37d9358\") " pod="openshift-marketplace/redhat-operators-9wx28" Mar 11 12:43:04 crc kubenswrapper[4816]: I0311 12:43:04.620543 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8nrx\" (UniqueName: \"kubernetes.io/projected/63f54d4d-3bee-42aa-82f6-3149d37d9358-kube-api-access-k8nrx\") pod \"redhat-operators-9wx28\" (UID: \"63f54d4d-3bee-42aa-82f6-3149d37d9358\") " pod="openshift-marketplace/redhat-operators-9wx28" Mar 11 12:43:04 crc kubenswrapper[4816]: I0311 12:43:04.722195 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63f54d4d-3bee-42aa-82f6-3149d37d9358-catalog-content\") pod \"redhat-operators-9wx28\" (UID: \"63f54d4d-3bee-42aa-82f6-3149d37d9358\") " pod="openshift-marketplace/redhat-operators-9wx28" Mar 11 12:43:04 crc kubenswrapper[4816]: I0311 12:43:04.722300 4816 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63f54d4d-3bee-42aa-82f6-3149d37d9358-utilities\") pod \"redhat-operators-9wx28\" (UID: \"63f54d4d-3bee-42aa-82f6-3149d37d9358\") " pod="openshift-marketplace/redhat-operators-9wx28" Mar 11 12:43:04 crc kubenswrapper[4816]: I0311 12:43:04.722402 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8nrx\" (UniqueName: \"kubernetes.io/projected/63f54d4d-3bee-42aa-82f6-3149d37d9358-kube-api-access-k8nrx\") pod \"redhat-operators-9wx28\" (UID: \"63f54d4d-3bee-42aa-82f6-3149d37d9358\") " pod="openshift-marketplace/redhat-operators-9wx28" Mar 11 12:43:04 crc kubenswrapper[4816]: I0311 12:43:04.723035 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63f54d4d-3bee-42aa-82f6-3149d37d9358-catalog-content\") pod \"redhat-operators-9wx28\" (UID: \"63f54d4d-3bee-42aa-82f6-3149d37d9358\") " pod="openshift-marketplace/redhat-operators-9wx28" Mar 11 12:43:04 crc kubenswrapper[4816]: I0311 12:43:04.723092 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63f54d4d-3bee-42aa-82f6-3149d37d9358-utilities\") pod \"redhat-operators-9wx28\" (UID: \"63f54d4d-3bee-42aa-82f6-3149d37d9358\") " pod="openshift-marketplace/redhat-operators-9wx28" Mar 11 12:43:04 crc kubenswrapper[4816]: I0311 12:43:04.748631 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8nrx\" (UniqueName: \"kubernetes.io/projected/63f54d4d-3bee-42aa-82f6-3149d37d9358-kube-api-access-k8nrx\") pod \"redhat-operators-9wx28\" (UID: \"63f54d4d-3bee-42aa-82f6-3149d37d9358\") " pod="openshift-marketplace/redhat-operators-9wx28" Mar 11 12:43:04 crc kubenswrapper[4816]: I0311 12:43:04.856782 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9wx28" Mar 11 12:43:05 crc kubenswrapper[4816]: I0311 12:43:05.365790 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9wx28"] Mar 11 12:43:05 crc kubenswrapper[4816]: I0311 12:43:05.574972 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9wx28" event={"ID":"63f54d4d-3bee-42aa-82f6-3149d37d9358","Type":"ContainerStarted","Data":"fc8d0a8c3b36be2e38db9a330d65c8910022292f8f5383e4d2bf40ab5bbc1f7a"} Mar 11 12:43:06 crc kubenswrapper[4816]: I0311 12:43:06.584741 4816 generic.go:334] "Generic (PLEG): container finished" podID="63f54d4d-3bee-42aa-82f6-3149d37d9358" containerID="04b560f6ae79cb9e858339c4b90ffb7f6b2c5580871e4c7defac7d254bcfcf7b" exitCode=0 Mar 11 12:43:06 crc kubenswrapper[4816]: I0311 12:43:06.585680 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9wx28" event={"ID":"63f54d4d-3bee-42aa-82f6-3149d37d9358","Type":"ContainerDied","Data":"04b560f6ae79cb9e858339c4b90ffb7f6b2c5580871e4c7defac7d254bcfcf7b"} Mar 11 12:43:07 crc kubenswrapper[4816]: I0311 12:43:07.593800 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9wx28" event={"ID":"63f54d4d-3bee-42aa-82f6-3149d37d9358","Type":"ContainerStarted","Data":"268c82d42295b8241fbb9270a5b043fad5a8eb2cc3c87fd2faf6be0289be1a23"} Mar 11 12:43:08 crc kubenswrapper[4816]: I0311 12:43:08.606397 4816 generic.go:334] "Generic (PLEG): container finished" podID="63f54d4d-3bee-42aa-82f6-3149d37d9358" containerID="268c82d42295b8241fbb9270a5b043fad5a8eb2cc3c87fd2faf6be0289be1a23" exitCode=0 Mar 11 12:43:08 crc kubenswrapper[4816]: I0311 12:43:08.606471 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9wx28" 
event={"ID":"63f54d4d-3bee-42aa-82f6-3149d37d9358","Type":"ContainerDied","Data":"268c82d42295b8241fbb9270a5b043fad5a8eb2cc3c87fd2faf6be0289be1a23"} Mar 11 12:43:09 crc kubenswrapper[4816]: I0311 12:43:09.616221 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9wx28" event={"ID":"63f54d4d-3bee-42aa-82f6-3149d37d9358","Type":"ContainerStarted","Data":"dbf0373774db82b350f7c00f6be9ded2d015a35927d6c7b9bc0efdbc4f15dbdc"} Mar 11 12:43:09 crc kubenswrapper[4816]: I0311 12:43:09.638930 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9wx28" podStartSLOduration=3.171722704 podStartE2EDuration="5.638905089s" podCreationTimestamp="2026-03-11 12:43:04 +0000 UTC" firstStartedPulling="2026-03-11 12:43:06.587015289 +0000 UTC m=+2673.178279296" lastFinishedPulling="2026-03-11 12:43:09.054197714 +0000 UTC m=+2675.645461681" observedRunningTime="2026-03-11 12:43:09.633606818 +0000 UTC m=+2676.224870785" watchObservedRunningTime="2026-03-11 12:43:09.638905089 +0000 UTC m=+2676.230169056" Mar 11 12:43:14 crc kubenswrapper[4816]: I0311 12:43:14.857752 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9wx28" Mar 11 12:43:14 crc kubenswrapper[4816]: I0311 12:43:14.858691 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9wx28" Mar 11 12:43:15 crc kubenswrapper[4816]: I0311 12:43:15.907235 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9wx28" podUID="63f54d4d-3bee-42aa-82f6-3149d37d9358" containerName="registry-server" probeResult="failure" output=< Mar 11 12:43:15 crc kubenswrapper[4816]: timeout: failed to connect service ":50051" within 1s Mar 11 12:43:15 crc kubenswrapper[4816]: > Mar 11 12:43:24 crc kubenswrapper[4816]: I0311 12:43:24.911234 4816 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9wx28" Mar 11 12:43:24 crc kubenswrapper[4816]: I0311 12:43:24.961428 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9wx28" Mar 11 12:43:25 crc kubenswrapper[4816]: I0311 12:43:25.162761 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9wx28"] Mar 11 12:43:26 crc kubenswrapper[4816]: I0311 12:43:26.768870 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9wx28" podUID="63f54d4d-3bee-42aa-82f6-3149d37d9358" containerName="registry-server" containerID="cri-o://dbf0373774db82b350f7c00f6be9ded2d015a35927d6c7b9bc0efdbc4f15dbdc" gracePeriod=2 Mar 11 12:43:27 crc kubenswrapper[4816]: I0311 12:43:27.180778 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9wx28" Mar 11 12:43:27 crc kubenswrapper[4816]: I0311 12:43:27.211862 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63f54d4d-3bee-42aa-82f6-3149d37d9358-utilities\") pod \"63f54d4d-3bee-42aa-82f6-3149d37d9358\" (UID: \"63f54d4d-3bee-42aa-82f6-3149d37d9358\") " Mar 11 12:43:27 crc kubenswrapper[4816]: I0311 12:43:27.211951 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8nrx\" (UniqueName: \"kubernetes.io/projected/63f54d4d-3bee-42aa-82f6-3149d37d9358-kube-api-access-k8nrx\") pod \"63f54d4d-3bee-42aa-82f6-3149d37d9358\" (UID: \"63f54d4d-3bee-42aa-82f6-3149d37d9358\") " Mar 11 12:43:27 crc kubenswrapper[4816]: I0311 12:43:27.212020 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63f54d4d-3bee-42aa-82f6-3149d37d9358-catalog-content\") pod 
\"63f54d4d-3bee-42aa-82f6-3149d37d9358\" (UID: \"63f54d4d-3bee-42aa-82f6-3149d37d9358\") " Mar 11 12:43:27 crc kubenswrapper[4816]: I0311 12:43:27.213308 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63f54d4d-3bee-42aa-82f6-3149d37d9358-utilities" (OuterVolumeSpecName: "utilities") pod "63f54d4d-3bee-42aa-82f6-3149d37d9358" (UID: "63f54d4d-3bee-42aa-82f6-3149d37d9358"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:43:27 crc kubenswrapper[4816]: I0311 12:43:27.213832 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63f54d4d-3bee-42aa-82f6-3149d37d9358-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 12:43:27 crc kubenswrapper[4816]: I0311 12:43:27.223183 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63f54d4d-3bee-42aa-82f6-3149d37d9358-kube-api-access-k8nrx" (OuterVolumeSpecName: "kube-api-access-k8nrx") pod "63f54d4d-3bee-42aa-82f6-3149d37d9358" (UID: "63f54d4d-3bee-42aa-82f6-3149d37d9358"). InnerVolumeSpecName "kube-api-access-k8nrx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:43:27 crc kubenswrapper[4816]: I0311 12:43:27.314865 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8nrx\" (UniqueName: \"kubernetes.io/projected/63f54d4d-3bee-42aa-82f6-3149d37d9358-kube-api-access-k8nrx\") on node \"crc\" DevicePath \"\"" Mar 11 12:43:27 crc kubenswrapper[4816]: I0311 12:43:27.387072 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63f54d4d-3bee-42aa-82f6-3149d37d9358-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "63f54d4d-3bee-42aa-82f6-3149d37d9358" (UID: "63f54d4d-3bee-42aa-82f6-3149d37d9358"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:43:27 crc kubenswrapper[4816]: I0311 12:43:27.417080 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63f54d4d-3bee-42aa-82f6-3149d37d9358-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 12:43:27 crc kubenswrapper[4816]: I0311 12:43:27.780876 4816 generic.go:334] "Generic (PLEG): container finished" podID="63f54d4d-3bee-42aa-82f6-3149d37d9358" containerID="dbf0373774db82b350f7c00f6be9ded2d015a35927d6c7b9bc0efdbc4f15dbdc" exitCode=0 Mar 11 12:43:27 crc kubenswrapper[4816]: I0311 12:43:27.780932 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9wx28" event={"ID":"63f54d4d-3bee-42aa-82f6-3149d37d9358","Type":"ContainerDied","Data":"dbf0373774db82b350f7c00f6be9ded2d015a35927d6c7b9bc0efdbc4f15dbdc"} Mar 11 12:43:27 crc kubenswrapper[4816]: I0311 12:43:27.780977 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9wx28" event={"ID":"63f54d4d-3bee-42aa-82f6-3149d37d9358","Type":"ContainerDied","Data":"fc8d0a8c3b36be2e38db9a330d65c8910022292f8f5383e4d2bf40ab5bbc1f7a"} Mar 11 12:43:27 crc kubenswrapper[4816]: I0311 12:43:27.781001 4816 scope.go:117] "RemoveContainer" containerID="dbf0373774db82b350f7c00f6be9ded2d015a35927d6c7b9bc0efdbc4f15dbdc" Mar 11 12:43:27 crc kubenswrapper[4816]: I0311 12:43:27.780992 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9wx28" Mar 11 12:43:27 crc kubenswrapper[4816]: I0311 12:43:27.804772 4816 scope.go:117] "RemoveContainer" containerID="268c82d42295b8241fbb9270a5b043fad5a8eb2cc3c87fd2faf6be0289be1a23" Mar 11 12:43:27 crc kubenswrapper[4816]: I0311 12:43:27.826709 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9wx28"] Mar 11 12:43:27 crc kubenswrapper[4816]: I0311 12:43:27.838848 4816 scope.go:117] "RemoveContainer" containerID="04b560f6ae79cb9e858339c4b90ffb7f6b2c5580871e4c7defac7d254bcfcf7b" Mar 11 12:43:27 crc kubenswrapper[4816]: I0311 12:43:27.840343 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9wx28"] Mar 11 12:43:27 crc kubenswrapper[4816]: I0311 12:43:27.866865 4816 scope.go:117] "RemoveContainer" containerID="dbf0373774db82b350f7c00f6be9ded2d015a35927d6c7b9bc0efdbc4f15dbdc" Mar 11 12:43:27 crc kubenswrapper[4816]: E0311 12:43:27.867643 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbf0373774db82b350f7c00f6be9ded2d015a35927d6c7b9bc0efdbc4f15dbdc\": container with ID starting with dbf0373774db82b350f7c00f6be9ded2d015a35927d6c7b9bc0efdbc4f15dbdc not found: ID does not exist" containerID="dbf0373774db82b350f7c00f6be9ded2d015a35927d6c7b9bc0efdbc4f15dbdc" Mar 11 12:43:27 crc kubenswrapper[4816]: I0311 12:43:27.867737 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbf0373774db82b350f7c00f6be9ded2d015a35927d6c7b9bc0efdbc4f15dbdc"} err="failed to get container status \"dbf0373774db82b350f7c00f6be9ded2d015a35927d6c7b9bc0efdbc4f15dbdc\": rpc error: code = NotFound desc = could not find container \"dbf0373774db82b350f7c00f6be9ded2d015a35927d6c7b9bc0efdbc4f15dbdc\": container with ID starting with dbf0373774db82b350f7c00f6be9ded2d015a35927d6c7b9bc0efdbc4f15dbdc not found: ID does 
not exist" Mar 11 12:43:27 crc kubenswrapper[4816]: I0311 12:43:27.867826 4816 scope.go:117] "RemoveContainer" containerID="268c82d42295b8241fbb9270a5b043fad5a8eb2cc3c87fd2faf6be0289be1a23" Mar 11 12:43:27 crc kubenswrapper[4816]: E0311 12:43:27.868396 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"268c82d42295b8241fbb9270a5b043fad5a8eb2cc3c87fd2faf6be0289be1a23\": container with ID starting with 268c82d42295b8241fbb9270a5b043fad5a8eb2cc3c87fd2faf6be0289be1a23 not found: ID does not exist" containerID="268c82d42295b8241fbb9270a5b043fad5a8eb2cc3c87fd2faf6be0289be1a23" Mar 11 12:43:27 crc kubenswrapper[4816]: I0311 12:43:27.868443 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"268c82d42295b8241fbb9270a5b043fad5a8eb2cc3c87fd2faf6be0289be1a23"} err="failed to get container status \"268c82d42295b8241fbb9270a5b043fad5a8eb2cc3c87fd2faf6be0289be1a23\": rpc error: code = NotFound desc = could not find container \"268c82d42295b8241fbb9270a5b043fad5a8eb2cc3c87fd2faf6be0289be1a23\": container with ID starting with 268c82d42295b8241fbb9270a5b043fad5a8eb2cc3c87fd2faf6be0289be1a23 not found: ID does not exist" Mar 11 12:43:27 crc kubenswrapper[4816]: I0311 12:43:27.868476 4816 scope.go:117] "RemoveContainer" containerID="04b560f6ae79cb9e858339c4b90ffb7f6b2c5580871e4c7defac7d254bcfcf7b" Mar 11 12:43:27 crc kubenswrapper[4816]: E0311 12:43:27.868858 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04b560f6ae79cb9e858339c4b90ffb7f6b2c5580871e4c7defac7d254bcfcf7b\": container with ID starting with 04b560f6ae79cb9e858339c4b90ffb7f6b2c5580871e4c7defac7d254bcfcf7b not found: ID does not exist" containerID="04b560f6ae79cb9e858339c4b90ffb7f6b2c5580871e4c7defac7d254bcfcf7b" Mar 11 12:43:27 crc kubenswrapper[4816]: I0311 12:43:27.868905 4816 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04b560f6ae79cb9e858339c4b90ffb7f6b2c5580871e4c7defac7d254bcfcf7b"} err="failed to get container status \"04b560f6ae79cb9e858339c4b90ffb7f6b2c5580871e4c7defac7d254bcfcf7b\": rpc error: code = NotFound desc = could not find container \"04b560f6ae79cb9e858339c4b90ffb7f6b2c5580871e4c7defac7d254bcfcf7b\": container with ID starting with 04b560f6ae79cb9e858339c4b90ffb7f6b2c5580871e4c7defac7d254bcfcf7b not found: ID does not exist" Mar 11 12:43:28 crc kubenswrapper[4816]: I0311 12:43:28.149868 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63f54d4d-3bee-42aa-82f6-3149d37d9358" path="/var/lib/kubelet/pods/63f54d4d-3bee-42aa-82f6-3149d37d9358/volumes" Mar 11 12:44:00 crc kubenswrapper[4816]: I0311 12:44:00.147905 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553884-t7cv9"] Mar 11 12:44:00 crc kubenswrapper[4816]: E0311 12:44:00.148895 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63f54d4d-3bee-42aa-82f6-3149d37d9358" containerName="registry-server" Mar 11 12:44:00 crc kubenswrapper[4816]: I0311 12:44:00.148914 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="63f54d4d-3bee-42aa-82f6-3149d37d9358" containerName="registry-server" Mar 11 12:44:00 crc kubenswrapper[4816]: E0311 12:44:00.148936 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63f54d4d-3bee-42aa-82f6-3149d37d9358" containerName="extract-content" Mar 11 12:44:00 crc kubenswrapper[4816]: I0311 12:44:00.148946 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="63f54d4d-3bee-42aa-82f6-3149d37d9358" containerName="extract-content" Mar 11 12:44:00 crc kubenswrapper[4816]: E0311 12:44:00.148954 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63f54d4d-3bee-42aa-82f6-3149d37d9358" containerName="extract-utilities" Mar 11 12:44:00 crc kubenswrapper[4816]: I0311 12:44:00.148962 4816 
state_mem.go:107] "Deleted CPUSet assignment" podUID="63f54d4d-3bee-42aa-82f6-3149d37d9358" containerName="extract-utilities" Mar 11 12:44:00 crc kubenswrapper[4816]: I0311 12:44:00.149165 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="63f54d4d-3bee-42aa-82f6-3149d37d9358" containerName="registry-server" Mar 11 12:44:00 crc kubenswrapper[4816]: I0311 12:44:00.149778 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553884-t7cv9" Mar 11 12:44:00 crc kubenswrapper[4816]: I0311 12:44:00.151710 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553884-t7cv9"] Mar 11 12:44:00 crc kubenswrapper[4816]: I0311 12:44:00.152062 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 12:44:00 crc kubenswrapper[4816]: I0311 12:44:00.152314 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 12:44:00 crc kubenswrapper[4816]: I0311 12:44:00.152799 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 12:44:00 crc kubenswrapper[4816]: I0311 12:44:00.325589 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wl6k9\" (UniqueName: \"kubernetes.io/projected/a8c87e3f-c7ea-4b63-bcbc-dee0d07ed0f7-kube-api-access-wl6k9\") pod \"auto-csr-approver-29553884-t7cv9\" (UID: \"a8c87e3f-c7ea-4b63-bcbc-dee0d07ed0f7\") " pod="openshift-infra/auto-csr-approver-29553884-t7cv9" Mar 11 12:44:00 crc kubenswrapper[4816]: I0311 12:44:00.427139 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wl6k9\" (UniqueName: \"kubernetes.io/projected/a8c87e3f-c7ea-4b63-bcbc-dee0d07ed0f7-kube-api-access-wl6k9\") pod \"auto-csr-approver-29553884-t7cv9\" (UID: 
\"a8c87e3f-c7ea-4b63-bcbc-dee0d07ed0f7\") " pod="openshift-infra/auto-csr-approver-29553884-t7cv9" Mar 11 12:44:00 crc kubenswrapper[4816]: I0311 12:44:00.446860 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wl6k9\" (UniqueName: \"kubernetes.io/projected/a8c87e3f-c7ea-4b63-bcbc-dee0d07ed0f7-kube-api-access-wl6k9\") pod \"auto-csr-approver-29553884-t7cv9\" (UID: \"a8c87e3f-c7ea-4b63-bcbc-dee0d07ed0f7\") " pod="openshift-infra/auto-csr-approver-29553884-t7cv9" Mar 11 12:44:00 crc kubenswrapper[4816]: I0311 12:44:00.475692 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553884-t7cv9" Mar 11 12:44:00 crc kubenswrapper[4816]: I0311 12:44:00.884026 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553884-t7cv9"] Mar 11 12:44:01 crc kubenswrapper[4816]: I0311 12:44:01.048591 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553884-t7cv9" event={"ID":"a8c87e3f-c7ea-4b63-bcbc-dee0d07ed0f7","Type":"ContainerStarted","Data":"e98daec571cc58b471b8d4238f5502acfec915e41235f877dcf0710ca15f8da9"} Mar 11 12:44:03 crc kubenswrapper[4816]: I0311 12:44:03.070884 4816 generic.go:334] "Generic (PLEG): container finished" podID="a8c87e3f-c7ea-4b63-bcbc-dee0d07ed0f7" containerID="da680962b6fbbd0e75bc32153dd7114d5c7dd1b60db6d2fbbedf1eb60245a10a" exitCode=0 Mar 11 12:44:03 crc kubenswrapper[4816]: I0311 12:44:03.071013 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553884-t7cv9" event={"ID":"a8c87e3f-c7ea-4b63-bcbc-dee0d07ed0f7","Type":"ContainerDied","Data":"da680962b6fbbd0e75bc32153dd7114d5c7dd1b60db6d2fbbedf1eb60245a10a"} Mar 11 12:44:04 crc kubenswrapper[4816]: I0311 12:44:04.394933 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553884-t7cv9" Mar 11 12:44:04 crc kubenswrapper[4816]: I0311 12:44:04.486459 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wl6k9\" (UniqueName: \"kubernetes.io/projected/a8c87e3f-c7ea-4b63-bcbc-dee0d07ed0f7-kube-api-access-wl6k9\") pod \"a8c87e3f-c7ea-4b63-bcbc-dee0d07ed0f7\" (UID: \"a8c87e3f-c7ea-4b63-bcbc-dee0d07ed0f7\") " Mar 11 12:44:04 crc kubenswrapper[4816]: I0311 12:44:04.495241 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8c87e3f-c7ea-4b63-bcbc-dee0d07ed0f7-kube-api-access-wl6k9" (OuterVolumeSpecName: "kube-api-access-wl6k9") pod "a8c87e3f-c7ea-4b63-bcbc-dee0d07ed0f7" (UID: "a8c87e3f-c7ea-4b63-bcbc-dee0d07ed0f7"). InnerVolumeSpecName "kube-api-access-wl6k9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:44:04 crc kubenswrapper[4816]: I0311 12:44:04.588282 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wl6k9\" (UniqueName: \"kubernetes.io/projected/a8c87e3f-c7ea-4b63-bcbc-dee0d07ed0f7-kube-api-access-wl6k9\") on node \"crc\" DevicePath \"\"" Mar 11 12:44:05 crc kubenswrapper[4816]: I0311 12:44:05.092213 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553884-t7cv9" event={"ID":"a8c87e3f-c7ea-4b63-bcbc-dee0d07ed0f7","Type":"ContainerDied","Data":"e98daec571cc58b471b8d4238f5502acfec915e41235f877dcf0710ca15f8da9"} Mar 11 12:44:05 crc kubenswrapper[4816]: I0311 12:44:05.092289 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e98daec571cc58b471b8d4238f5502acfec915e41235f877dcf0710ca15f8da9" Mar 11 12:44:05 crc kubenswrapper[4816]: I0311 12:44:05.092361 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553884-t7cv9" Mar 11 12:44:05 crc kubenswrapper[4816]: I0311 12:44:05.479293 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553878-cp29z"] Mar 11 12:44:05 crc kubenswrapper[4816]: I0311 12:44:05.489956 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553878-cp29z"] Mar 11 12:44:06 crc kubenswrapper[4816]: I0311 12:44:06.142877 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="080a6096-67c3-41de-97d6-29a3f80c027e" path="/var/lib/kubelet/pods/080a6096-67c3-41de-97d6-29a3f80c027e/volumes" Mar 11 12:44:09 crc kubenswrapper[4816]: I0311 12:44:09.514726 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:44:09 crc kubenswrapper[4816]: I0311 12:44:09.515207 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:44:39 crc kubenswrapper[4816]: I0311 12:44:39.515130 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:44:39 crc kubenswrapper[4816]: I0311 12:44:39.515849 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" 
podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:44:48 crc kubenswrapper[4816]: I0311 12:44:48.158968 4816 scope.go:117] "RemoveContainer" containerID="1e58ff6aa0fcfd993b576ab14b5750187bf29f39db71f93177979cba96a1d350" Mar 11 12:45:00 crc kubenswrapper[4816]: I0311 12:45:00.147837 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553885-k5rs9"] Mar 11 12:45:00 crc kubenswrapper[4816]: E0311 12:45:00.151030 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8c87e3f-c7ea-4b63-bcbc-dee0d07ed0f7" containerName="oc" Mar 11 12:45:00 crc kubenswrapper[4816]: I0311 12:45:00.151140 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8c87e3f-c7ea-4b63-bcbc-dee0d07ed0f7" containerName="oc" Mar 11 12:45:00 crc kubenswrapper[4816]: I0311 12:45:00.151449 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8c87e3f-c7ea-4b63-bcbc-dee0d07ed0f7" containerName="oc" Mar 11 12:45:00 crc kubenswrapper[4816]: I0311 12:45:00.152172 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553885-k5rs9" Mar 11 12:45:00 crc kubenswrapper[4816]: I0311 12:45:00.157106 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553885-k5rs9"] Mar 11 12:45:00 crc kubenswrapper[4816]: I0311 12:45:00.157449 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 11 12:45:00 crc kubenswrapper[4816]: I0311 12:45:00.157656 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 11 12:45:00 crc kubenswrapper[4816]: I0311 12:45:00.339325 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b052f4a1-1fdd-4c5e-ba2d-0806387d6058-config-volume\") pod \"collect-profiles-29553885-k5rs9\" (UID: \"b052f4a1-1fdd-4c5e-ba2d-0806387d6058\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553885-k5rs9" Mar 11 12:45:00 crc kubenswrapper[4816]: I0311 12:45:00.339708 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b052f4a1-1fdd-4c5e-ba2d-0806387d6058-secret-volume\") pod \"collect-profiles-29553885-k5rs9\" (UID: \"b052f4a1-1fdd-4c5e-ba2d-0806387d6058\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553885-k5rs9" Mar 11 12:45:00 crc kubenswrapper[4816]: I0311 12:45:00.339797 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th9lt\" (UniqueName: \"kubernetes.io/projected/b052f4a1-1fdd-4c5e-ba2d-0806387d6058-kube-api-access-th9lt\") pod \"collect-profiles-29553885-k5rs9\" (UID: \"b052f4a1-1fdd-4c5e-ba2d-0806387d6058\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29553885-k5rs9" Mar 11 12:45:00 crc kubenswrapper[4816]: I0311 12:45:00.441609 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b052f4a1-1fdd-4c5e-ba2d-0806387d6058-config-volume\") pod \"collect-profiles-29553885-k5rs9\" (UID: \"b052f4a1-1fdd-4c5e-ba2d-0806387d6058\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553885-k5rs9" Mar 11 12:45:00 crc kubenswrapper[4816]: I0311 12:45:00.441663 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b052f4a1-1fdd-4c5e-ba2d-0806387d6058-secret-volume\") pod \"collect-profiles-29553885-k5rs9\" (UID: \"b052f4a1-1fdd-4c5e-ba2d-0806387d6058\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553885-k5rs9" Mar 11 12:45:00 crc kubenswrapper[4816]: I0311 12:45:00.441735 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-th9lt\" (UniqueName: \"kubernetes.io/projected/b052f4a1-1fdd-4c5e-ba2d-0806387d6058-kube-api-access-th9lt\") pod \"collect-profiles-29553885-k5rs9\" (UID: \"b052f4a1-1fdd-4c5e-ba2d-0806387d6058\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553885-k5rs9" Mar 11 12:45:00 crc kubenswrapper[4816]: I0311 12:45:00.443207 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b052f4a1-1fdd-4c5e-ba2d-0806387d6058-config-volume\") pod \"collect-profiles-29553885-k5rs9\" (UID: \"b052f4a1-1fdd-4c5e-ba2d-0806387d6058\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553885-k5rs9" Mar 11 12:45:00 crc kubenswrapper[4816]: I0311 12:45:00.456851 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/b052f4a1-1fdd-4c5e-ba2d-0806387d6058-secret-volume\") pod \"collect-profiles-29553885-k5rs9\" (UID: \"b052f4a1-1fdd-4c5e-ba2d-0806387d6058\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553885-k5rs9" Mar 11 12:45:00 crc kubenswrapper[4816]: I0311 12:45:00.459042 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-th9lt\" (UniqueName: \"kubernetes.io/projected/b052f4a1-1fdd-4c5e-ba2d-0806387d6058-kube-api-access-th9lt\") pod \"collect-profiles-29553885-k5rs9\" (UID: \"b052f4a1-1fdd-4c5e-ba2d-0806387d6058\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553885-k5rs9" Mar 11 12:45:00 crc kubenswrapper[4816]: I0311 12:45:00.474055 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553885-k5rs9" Mar 11 12:45:00 crc kubenswrapper[4816]: I0311 12:45:00.678309 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553885-k5rs9"] Mar 11 12:45:01 crc kubenswrapper[4816]: I0311 12:45:01.582671 4816 generic.go:334] "Generic (PLEG): container finished" podID="b052f4a1-1fdd-4c5e-ba2d-0806387d6058" containerID="c9818458ccc0df3dcfe2c0eba17ad47fa9cb27149aa729ec7d3b86ff29db078f" exitCode=0 Mar 11 12:45:01 crc kubenswrapper[4816]: I0311 12:45:01.582848 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553885-k5rs9" event={"ID":"b052f4a1-1fdd-4c5e-ba2d-0806387d6058","Type":"ContainerDied","Data":"c9818458ccc0df3dcfe2c0eba17ad47fa9cb27149aa729ec7d3b86ff29db078f"} Mar 11 12:45:01 crc kubenswrapper[4816]: I0311 12:45:01.582897 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553885-k5rs9" 
event={"ID":"b052f4a1-1fdd-4c5e-ba2d-0806387d6058","Type":"ContainerStarted","Data":"2c1af77f22809780fc23335dd2c2c3a4aeb0504ebb4cd7f236c74c8f7fa7f060"} Mar 11 12:45:02 crc kubenswrapper[4816]: I0311 12:45:02.883244 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553885-k5rs9" Mar 11 12:45:03 crc kubenswrapper[4816]: I0311 12:45:03.084194 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b052f4a1-1fdd-4c5e-ba2d-0806387d6058-config-volume\") pod \"b052f4a1-1fdd-4c5e-ba2d-0806387d6058\" (UID: \"b052f4a1-1fdd-4c5e-ba2d-0806387d6058\") " Mar 11 12:45:03 crc kubenswrapper[4816]: I0311 12:45:03.084425 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b052f4a1-1fdd-4c5e-ba2d-0806387d6058-secret-volume\") pod \"b052f4a1-1fdd-4c5e-ba2d-0806387d6058\" (UID: \"b052f4a1-1fdd-4c5e-ba2d-0806387d6058\") " Mar 11 12:45:03 crc kubenswrapper[4816]: I0311 12:45:03.084601 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-th9lt\" (UniqueName: \"kubernetes.io/projected/b052f4a1-1fdd-4c5e-ba2d-0806387d6058-kube-api-access-th9lt\") pod \"b052f4a1-1fdd-4c5e-ba2d-0806387d6058\" (UID: \"b052f4a1-1fdd-4c5e-ba2d-0806387d6058\") " Mar 11 12:45:03 crc kubenswrapper[4816]: I0311 12:45:03.086305 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b052f4a1-1fdd-4c5e-ba2d-0806387d6058-config-volume" (OuterVolumeSpecName: "config-volume") pod "b052f4a1-1fdd-4c5e-ba2d-0806387d6058" (UID: "b052f4a1-1fdd-4c5e-ba2d-0806387d6058"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:45:03 crc kubenswrapper[4816]: I0311 12:45:03.091563 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b052f4a1-1fdd-4c5e-ba2d-0806387d6058-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b052f4a1-1fdd-4c5e-ba2d-0806387d6058" (UID: "b052f4a1-1fdd-4c5e-ba2d-0806387d6058"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:45:03 crc kubenswrapper[4816]: I0311 12:45:03.091583 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b052f4a1-1fdd-4c5e-ba2d-0806387d6058-kube-api-access-th9lt" (OuterVolumeSpecName: "kube-api-access-th9lt") pod "b052f4a1-1fdd-4c5e-ba2d-0806387d6058" (UID: "b052f4a1-1fdd-4c5e-ba2d-0806387d6058"). InnerVolumeSpecName "kube-api-access-th9lt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:45:03 crc kubenswrapper[4816]: I0311 12:45:03.187842 4816 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b052f4a1-1fdd-4c5e-ba2d-0806387d6058-config-volume\") on node \"crc\" DevicePath \"\"" Mar 11 12:45:03 crc kubenswrapper[4816]: I0311 12:45:03.187887 4816 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b052f4a1-1fdd-4c5e-ba2d-0806387d6058-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 11 12:45:03 crc kubenswrapper[4816]: I0311 12:45:03.187908 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-th9lt\" (UniqueName: \"kubernetes.io/projected/b052f4a1-1fdd-4c5e-ba2d-0806387d6058-kube-api-access-th9lt\") on node \"crc\" DevicePath \"\"" Mar 11 12:45:03 crc kubenswrapper[4816]: I0311 12:45:03.603083 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553885-k5rs9" 
event={"ID":"b052f4a1-1fdd-4c5e-ba2d-0806387d6058","Type":"ContainerDied","Data":"2c1af77f22809780fc23335dd2c2c3a4aeb0504ebb4cd7f236c74c8f7fa7f060"} Mar 11 12:45:03 crc kubenswrapper[4816]: I0311 12:45:03.603588 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c1af77f22809780fc23335dd2c2c3a4aeb0504ebb4cd7f236c74c8f7fa7f060" Mar 11 12:45:03 crc kubenswrapper[4816]: I0311 12:45:03.603187 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553885-k5rs9" Mar 11 12:45:03 crc kubenswrapper[4816]: I0311 12:45:03.959621 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553840-xpb52"] Mar 11 12:45:03 crc kubenswrapper[4816]: I0311 12:45:03.964532 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553840-xpb52"] Mar 11 12:45:04 crc kubenswrapper[4816]: I0311 12:45:04.144792 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c040a86-9614-48cb-9df7-14c83b046dce" path="/var/lib/kubelet/pods/3c040a86-9614-48cb-9df7-14c83b046dce/volumes" Mar 11 12:45:09 crc kubenswrapper[4816]: I0311 12:45:09.515208 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:45:09 crc kubenswrapper[4816]: I0311 12:45:09.515648 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:45:09 crc 
kubenswrapper[4816]: I0311 12:45:09.515705 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:45:09 crc kubenswrapper[4816]: I0311 12:45:09.516307 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e94d54c6dd2b7a4e577e03c8b08cf5eb1a8a362732b731a0d82ddf5cdc9d6211"} pod="openshift-machine-config-operator/machine-config-daemon-b4v82" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 12:45:09 crc kubenswrapper[4816]: I0311 12:45:09.516389 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" containerID="cri-o://e94d54c6dd2b7a4e577e03c8b08cf5eb1a8a362732b731a0d82ddf5cdc9d6211" gracePeriod=600 Mar 11 12:45:09 crc kubenswrapper[4816]: I0311 12:45:09.663729 4816 generic.go:334] "Generic (PLEG): container finished" podID="7fdff21c-644f-4443-a268-f98c91ea120a" containerID="e94d54c6dd2b7a4e577e03c8b08cf5eb1a8a362732b731a0d82ddf5cdc9d6211" exitCode=0 Mar 11 12:45:09 crc kubenswrapper[4816]: I0311 12:45:09.664172 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerDied","Data":"e94d54c6dd2b7a4e577e03c8b08cf5eb1a8a362732b731a0d82ddf5cdc9d6211"} Mar 11 12:45:09 crc kubenswrapper[4816]: I0311 12:45:09.664226 4816 scope.go:117] "RemoveContainer" containerID="ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" Mar 11 12:45:10 crc kubenswrapper[4816]: I0311 12:45:10.674213 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" 
event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerStarted","Data":"64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334"} Mar 11 12:45:12 crc kubenswrapper[4816]: I0311 12:45:12.813888 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lbw7s"] Mar 11 12:45:12 crc kubenswrapper[4816]: E0311 12:45:12.814864 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b052f4a1-1fdd-4c5e-ba2d-0806387d6058" containerName="collect-profiles" Mar 11 12:45:12 crc kubenswrapper[4816]: I0311 12:45:12.814883 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="b052f4a1-1fdd-4c5e-ba2d-0806387d6058" containerName="collect-profiles" Mar 11 12:45:12 crc kubenswrapper[4816]: I0311 12:45:12.815094 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="b052f4a1-1fdd-4c5e-ba2d-0806387d6058" containerName="collect-profiles" Mar 11 12:45:12 crc kubenswrapper[4816]: I0311 12:45:12.816210 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lbw7s" Mar 11 12:45:12 crc kubenswrapper[4816]: I0311 12:45:12.825587 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lbw7s"] Mar 11 12:45:12 crc kubenswrapper[4816]: I0311 12:45:12.852332 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff-catalog-content\") pod \"community-operators-lbw7s\" (UID: \"ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff\") " pod="openshift-marketplace/community-operators-lbw7s" Mar 11 12:45:12 crc kubenswrapper[4816]: I0311 12:45:12.852419 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nprs\" (UniqueName: \"kubernetes.io/projected/ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff-kube-api-access-9nprs\") pod \"community-operators-lbw7s\" (UID: \"ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff\") " pod="openshift-marketplace/community-operators-lbw7s" Mar 11 12:45:12 crc kubenswrapper[4816]: I0311 12:45:12.852481 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff-utilities\") pod \"community-operators-lbw7s\" (UID: \"ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff\") " pod="openshift-marketplace/community-operators-lbw7s" Mar 11 12:45:12 crc kubenswrapper[4816]: I0311 12:45:12.954901 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff-utilities\") pod \"community-operators-lbw7s\" (UID: \"ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff\") " pod="openshift-marketplace/community-operators-lbw7s" Mar 11 12:45:12 crc kubenswrapper[4816]: I0311 12:45:12.955062 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff-catalog-content\") pod \"community-operators-lbw7s\" (UID: \"ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff\") " pod="openshift-marketplace/community-operators-lbw7s" Mar 11 12:45:12 crc kubenswrapper[4816]: I0311 12:45:12.955090 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nprs\" (UniqueName: \"kubernetes.io/projected/ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff-kube-api-access-9nprs\") pod \"community-operators-lbw7s\" (UID: \"ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff\") " pod="openshift-marketplace/community-operators-lbw7s" Mar 11 12:45:12 crc kubenswrapper[4816]: I0311 12:45:12.955661 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff-utilities\") pod \"community-operators-lbw7s\" (UID: \"ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff\") " pod="openshift-marketplace/community-operators-lbw7s" Mar 11 12:45:12 crc kubenswrapper[4816]: I0311 12:45:12.955933 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff-catalog-content\") pod \"community-operators-lbw7s\" (UID: \"ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff\") " pod="openshift-marketplace/community-operators-lbw7s" Mar 11 12:45:12 crc kubenswrapper[4816]: I0311 12:45:12.983294 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nprs\" (UniqueName: \"kubernetes.io/projected/ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff-kube-api-access-9nprs\") pod \"community-operators-lbw7s\" (UID: \"ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff\") " pod="openshift-marketplace/community-operators-lbw7s" Mar 11 12:45:13 crc kubenswrapper[4816]: I0311 12:45:13.149838 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lbw7s" Mar 11 12:45:13 crc kubenswrapper[4816]: I0311 12:45:13.459564 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lbw7s"] Mar 11 12:45:13 crc kubenswrapper[4816]: I0311 12:45:13.704065 4816 generic.go:334] "Generic (PLEG): container finished" podID="ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff" containerID="cec52d6249bd59cc5a2ddeff07252de962b58d52a1caf5d7b7a503e8bb8c59d2" exitCode=0 Mar 11 12:45:13 crc kubenswrapper[4816]: I0311 12:45:13.704160 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lbw7s" event={"ID":"ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff","Type":"ContainerDied","Data":"cec52d6249bd59cc5a2ddeff07252de962b58d52a1caf5d7b7a503e8bb8c59d2"} Mar 11 12:45:13 crc kubenswrapper[4816]: I0311 12:45:13.704512 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lbw7s" event={"ID":"ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff","Type":"ContainerStarted","Data":"77560a17df15c39d5c9ac619eeb26bfb46708b2b82cc785833190cee1e64b1cb"} Mar 11 12:45:13 crc kubenswrapper[4816]: I0311 12:45:13.706304 4816 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 12:45:14 crc kubenswrapper[4816]: I0311 12:45:14.718554 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lbw7s" event={"ID":"ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff","Type":"ContainerStarted","Data":"798a71db4dec171296cbd4c600d9d77fe39747501f2ac21333eaf9a918397d8f"} Mar 11 12:45:15 crc kubenswrapper[4816]: I0311 12:45:15.729239 4816 generic.go:334] "Generic (PLEG): container finished" podID="ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff" containerID="798a71db4dec171296cbd4c600d9d77fe39747501f2ac21333eaf9a918397d8f" exitCode=0 Mar 11 12:45:15 crc kubenswrapper[4816]: I0311 12:45:15.729376 4816 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-lbw7s" event={"ID":"ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff","Type":"ContainerDied","Data":"798a71db4dec171296cbd4c600d9d77fe39747501f2ac21333eaf9a918397d8f"} Mar 11 12:45:16 crc kubenswrapper[4816]: I0311 12:45:16.745865 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lbw7s" event={"ID":"ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff","Type":"ContainerStarted","Data":"598fcc6fc72c2e1642ebf254197361aa5fd99c2cb40b199afa6c0d2b53561620"} Mar 11 12:45:16 crc kubenswrapper[4816]: I0311 12:45:16.770578 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lbw7s" podStartSLOduration=2.178939252 podStartE2EDuration="4.770551701s" podCreationTimestamp="2026-03-11 12:45:12 +0000 UTC" firstStartedPulling="2026-03-11 12:45:13.706007308 +0000 UTC m=+2800.297271275" lastFinishedPulling="2026-03-11 12:45:16.297619707 +0000 UTC m=+2802.888883724" observedRunningTime="2026-03-11 12:45:16.765546798 +0000 UTC m=+2803.356810805" watchObservedRunningTime="2026-03-11 12:45:16.770551701 +0000 UTC m=+2803.361815678" Mar 11 12:45:23 crc kubenswrapper[4816]: I0311 12:45:23.150794 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lbw7s" Mar 11 12:45:23 crc kubenswrapper[4816]: I0311 12:45:23.151633 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lbw7s" Mar 11 12:45:23 crc kubenswrapper[4816]: I0311 12:45:23.209545 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lbw7s" Mar 11 12:45:23 crc kubenswrapper[4816]: I0311 12:45:23.886736 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lbw7s" Mar 11 12:45:23 crc kubenswrapper[4816]: I0311 
12:45:23.979536 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lbw7s"] Mar 11 12:45:25 crc kubenswrapper[4816]: I0311 12:45:25.836144 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lbw7s" podUID="ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff" containerName="registry-server" containerID="cri-o://598fcc6fc72c2e1642ebf254197361aa5fd99c2cb40b199afa6c0d2b53561620" gracePeriod=2 Mar 11 12:45:26 crc kubenswrapper[4816]: I0311 12:45:26.290766 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lbw7s" Mar 11 12:45:26 crc kubenswrapper[4816]: I0311 12:45:26.389690 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff-utilities\") pod \"ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff\" (UID: \"ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff\") " Mar 11 12:45:26 crc kubenswrapper[4816]: I0311 12:45:26.389829 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff-catalog-content\") pod \"ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff\" (UID: \"ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff\") " Mar 11 12:45:26 crc kubenswrapper[4816]: I0311 12:45:26.389882 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nprs\" (UniqueName: \"kubernetes.io/projected/ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff-kube-api-access-9nprs\") pod \"ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff\" (UID: \"ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff\") " Mar 11 12:45:26 crc kubenswrapper[4816]: I0311 12:45:26.390920 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff-utilities" (OuterVolumeSpecName: 
"utilities") pod "ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff" (UID: "ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:45:26 crc kubenswrapper[4816]: I0311 12:45:26.398554 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff-kube-api-access-9nprs" (OuterVolumeSpecName: "kube-api-access-9nprs") pod "ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff" (UID: "ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff"). InnerVolumeSpecName "kube-api-access-9nprs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:45:26 crc kubenswrapper[4816]: I0311 12:45:26.466371 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff" (UID: "ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:45:26 crc kubenswrapper[4816]: I0311 12:45:26.491657 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 12:45:26 crc kubenswrapper[4816]: I0311 12:45:26.491698 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 12:45:26 crc kubenswrapper[4816]: I0311 12:45:26.491716 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nprs\" (UniqueName: \"kubernetes.io/projected/ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff-kube-api-access-9nprs\") on node \"crc\" DevicePath \"\"" Mar 11 12:45:26 crc kubenswrapper[4816]: I0311 12:45:26.851510 4816 generic.go:334] "Generic (PLEG): container finished" podID="ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff" containerID="598fcc6fc72c2e1642ebf254197361aa5fd99c2cb40b199afa6c0d2b53561620" exitCode=0 Mar 11 12:45:26 crc kubenswrapper[4816]: I0311 12:45:26.851565 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lbw7s" event={"ID":"ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff","Type":"ContainerDied","Data":"598fcc6fc72c2e1642ebf254197361aa5fd99c2cb40b199afa6c0d2b53561620"} Mar 11 12:45:26 crc kubenswrapper[4816]: I0311 12:45:26.851609 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lbw7s" event={"ID":"ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff","Type":"ContainerDied","Data":"77560a17df15c39d5c9ac619eeb26bfb46708b2b82cc785833190cee1e64b1cb"} Mar 11 12:45:26 crc kubenswrapper[4816]: I0311 12:45:26.851633 4816 scope.go:117] "RemoveContainer" containerID="598fcc6fc72c2e1642ebf254197361aa5fd99c2cb40b199afa6c0d2b53561620" Mar 11 12:45:26 crc kubenswrapper[4816]: I0311 
12:45:26.851631 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lbw7s" Mar 11 12:45:26 crc kubenswrapper[4816]: I0311 12:45:26.873949 4816 scope.go:117] "RemoveContainer" containerID="798a71db4dec171296cbd4c600d9d77fe39747501f2ac21333eaf9a918397d8f" Mar 11 12:45:26 crc kubenswrapper[4816]: I0311 12:45:26.892301 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lbw7s"] Mar 11 12:45:26 crc kubenswrapper[4816]: I0311 12:45:26.899098 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lbw7s"] Mar 11 12:45:26 crc kubenswrapper[4816]: I0311 12:45:26.906923 4816 scope.go:117] "RemoveContainer" containerID="cec52d6249bd59cc5a2ddeff07252de962b58d52a1caf5d7b7a503e8bb8c59d2" Mar 11 12:45:26 crc kubenswrapper[4816]: I0311 12:45:26.942328 4816 scope.go:117] "RemoveContainer" containerID="598fcc6fc72c2e1642ebf254197361aa5fd99c2cb40b199afa6c0d2b53561620" Mar 11 12:45:26 crc kubenswrapper[4816]: E0311 12:45:26.942894 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"598fcc6fc72c2e1642ebf254197361aa5fd99c2cb40b199afa6c0d2b53561620\": container with ID starting with 598fcc6fc72c2e1642ebf254197361aa5fd99c2cb40b199afa6c0d2b53561620 not found: ID does not exist" containerID="598fcc6fc72c2e1642ebf254197361aa5fd99c2cb40b199afa6c0d2b53561620" Mar 11 12:45:26 crc kubenswrapper[4816]: I0311 12:45:26.942943 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"598fcc6fc72c2e1642ebf254197361aa5fd99c2cb40b199afa6c0d2b53561620"} err="failed to get container status \"598fcc6fc72c2e1642ebf254197361aa5fd99c2cb40b199afa6c0d2b53561620\": rpc error: code = NotFound desc = could not find container \"598fcc6fc72c2e1642ebf254197361aa5fd99c2cb40b199afa6c0d2b53561620\": container with ID starting with 
598fcc6fc72c2e1642ebf254197361aa5fd99c2cb40b199afa6c0d2b53561620 not found: ID does not exist" Mar 11 12:45:26 crc kubenswrapper[4816]: I0311 12:45:26.942976 4816 scope.go:117] "RemoveContainer" containerID="798a71db4dec171296cbd4c600d9d77fe39747501f2ac21333eaf9a918397d8f" Mar 11 12:45:26 crc kubenswrapper[4816]: E0311 12:45:26.943312 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"798a71db4dec171296cbd4c600d9d77fe39747501f2ac21333eaf9a918397d8f\": container with ID starting with 798a71db4dec171296cbd4c600d9d77fe39747501f2ac21333eaf9a918397d8f not found: ID does not exist" containerID="798a71db4dec171296cbd4c600d9d77fe39747501f2ac21333eaf9a918397d8f" Mar 11 12:45:26 crc kubenswrapper[4816]: I0311 12:45:26.943348 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"798a71db4dec171296cbd4c600d9d77fe39747501f2ac21333eaf9a918397d8f"} err="failed to get container status \"798a71db4dec171296cbd4c600d9d77fe39747501f2ac21333eaf9a918397d8f\": rpc error: code = NotFound desc = could not find container \"798a71db4dec171296cbd4c600d9d77fe39747501f2ac21333eaf9a918397d8f\": container with ID starting with 798a71db4dec171296cbd4c600d9d77fe39747501f2ac21333eaf9a918397d8f not found: ID does not exist" Mar 11 12:45:26 crc kubenswrapper[4816]: I0311 12:45:26.943365 4816 scope.go:117] "RemoveContainer" containerID="cec52d6249bd59cc5a2ddeff07252de962b58d52a1caf5d7b7a503e8bb8c59d2" Mar 11 12:45:26 crc kubenswrapper[4816]: E0311 12:45:26.943631 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cec52d6249bd59cc5a2ddeff07252de962b58d52a1caf5d7b7a503e8bb8c59d2\": container with ID starting with cec52d6249bd59cc5a2ddeff07252de962b58d52a1caf5d7b7a503e8bb8c59d2 not found: ID does not exist" containerID="cec52d6249bd59cc5a2ddeff07252de962b58d52a1caf5d7b7a503e8bb8c59d2" Mar 11 12:45:26 crc 
kubenswrapper[4816]: I0311 12:45:26.943661 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cec52d6249bd59cc5a2ddeff07252de962b58d52a1caf5d7b7a503e8bb8c59d2"} err="failed to get container status \"cec52d6249bd59cc5a2ddeff07252de962b58d52a1caf5d7b7a503e8bb8c59d2\": rpc error: code = NotFound desc = could not find container \"cec52d6249bd59cc5a2ddeff07252de962b58d52a1caf5d7b7a503e8bb8c59d2\": container with ID starting with cec52d6249bd59cc5a2ddeff07252de962b58d52a1caf5d7b7a503e8bb8c59d2 not found: ID does not exist" Mar 11 12:45:28 crc kubenswrapper[4816]: I0311 12:45:28.144588 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff" path="/var/lib/kubelet/pods/ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff/volumes" Mar 11 12:45:48 crc kubenswrapper[4816]: I0311 12:45:48.247427 4816 scope.go:117] "RemoveContainer" containerID="f3bda5d4e49a815a926b2f32c60f3932a76a7181a017078bc20f79926bfbf6a6" Mar 11 12:46:00 crc kubenswrapper[4816]: I0311 12:46:00.153836 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553886-xjtpg"] Mar 11 12:46:00 crc kubenswrapper[4816]: E0311 12:46:00.155521 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff" containerName="extract-utilities" Mar 11 12:46:00 crc kubenswrapper[4816]: I0311 12:46:00.155539 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff" containerName="extract-utilities" Mar 11 12:46:00 crc kubenswrapper[4816]: E0311 12:46:00.155554 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff" containerName="extract-content" Mar 11 12:46:00 crc kubenswrapper[4816]: I0311 12:46:00.155564 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff" containerName="extract-content" Mar 11 12:46:00 crc 
kubenswrapper[4816]: E0311 12:46:00.155576 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff" containerName="registry-server" Mar 11 12:46:00 crc kubenswrapper[4816]: I0311 12:46:00.155582 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff" containerName="registry-server" Mar 11 12:46:00 crc kubenswrapper[4816]: I0311 12:46:00.155734 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff" containerName="registry-server" Mar 11 12:46:00 crc kubenswrapper[4816]: I0311 12:46:00.156266 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553886-xjtpg" Mar 11 12:46:00 crc kubenswrapper[4816]: I0311 12:46:00.158438 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 12:46:00 crc kubenswrapper[4816]: I0311 12:46:00.160775 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 12:46:00 crc kubenswrapper[4816]: I0311 12:46:00.163323 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553886-xjtpg"] Mar 11 12:46:00 crc kubenswrapper[4816]: I0311 12:46:00.164066 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 12:46:00 crc kubenswrapper[4816]: I0311 12:46:00.305947 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx444\" (UniqueName: \"kubernetes.io/projected/10793112-fba0-46e4-a3a5-201255a72221-kube-api-access-sx444\") pod \"auto-csr-approver-29553886-xjtpg\" (UID: \"10793112-fba0-46e4-a3a5-201255a72221\") " pod="openshift-infra/auto-csr-approver-29553886-xjtpg" Mar 11 12:46:00 crc kubenswrapper[4816]: I0311 12:46:00.409182 4816 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sx444\" (UniqueName: \"kubernetes.io/projected/10793112-fba0-46e4-a3a5-201255a72221-kube-api-access-sx444\") pod \"auto-csr-approver-29553886-xjtpg\" (UID: \"10793112-fba0-46e4-a3a5-201255a72221\") " pod="openshift-infra/auto-csr-approver-29553886-xjtpg" Mar 11 12:46:00 crc kubenswrapper[4816]: I0311 12:46:00.444076 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx444\" (UniqueName: \"kubernetes.io/projected/10793112-fba0-46e4-a3a5-201255a72221-kube-api-access-sx444\") pod \"auto-csr-approver-29553886-xjtpg\" (UID: \"10793112-fba0-46e4-a3a5-201255a72221\") " pod="openshift-infra/auto-csr-approver-29553886-xjtpg" Mar 11 12:46:00 crc kubenswrapper[4816]: I0311 12:46:00.481617 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553886-xjtpg" Mar 11 12:46:00 crc kubenswrapper[4816]: I0311 12:46:00.735780 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553886-xjtpg"] Mar 11 12:46:01 crc kubenswrapper[4816]: I0311 12:46:01.155684 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553886-xjtpg" event={"ID":"10793112-fba0-46e4-a3a5-201255a72221","Type":"ContainerStarted","Data":"9a1ea2f5c8b9c9d2168bc35813243f641fb2fd4e1b93f51a50c3ed629bd728cd"} Mar 11 12:46:02 crc kubenswrapper[4816]: I0311 12:46:02.170046 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553886-xjtpg" event={"ID":"10793112-fba0-46e4-a3a5-201255a72221","Type":"ContainerStarted","Data":"015da5072b60f8b74ef45c6695076cb6f089d3139f01fe3ef7f7d86d8236f381"} Mar 11 12:46:02 crc kubenswrapper[4816]: I0311 12:46:02.193202 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553886-xjtpg" 
podStartSLOduration=1.2703352749999999 podStartE2EDuration="2.193172663s" podCreationTimestamp="2026-03-11 12:46:00 +0000 UTC" firstStartedPulling="2026-03-11 12:46:00.744325721 +0000 UTC m=+2847.335589688" lastFinishedPulling="2026-03-11 12:46:01.667163109 +0000 UTC m=+2848.258427076" observedRunningTime="2026-03-11 12:46:02.186568114 +0000 UTC m=+2848.777832101" watchObservedRunningTime="2026-03-11 12:46:02.193172663 +0000 UTC m=+2848.784436650" Mar 11 12:46:03 crc kubenswrapper[4816]: I0311 12:46:03.182040 4816 generic.go:334] "Generic (PLEG): container finished" podID="10793112-fba0-46e4-a3a5-201255a72221" containerID="015da5072b60f8b74ef45c6695076cb6f089d3139f01fe3ef7f7d86d8236f381" exitCode=0 Mar 11 12:46:03 crc kubenswrapper[4816]: I0311 12:46:03.182128 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553886-xjtpg" event={"ID":"10793112-fba0-46e4-a3a5-201255a72221","Type":"ContainerDied","Data":"015da5072b60f8b74ef45c6695076cb6f089d3139f01fe3ef7f7d86d8236f381"} Mar 11 12:46:04 crc kubenswrapper[4816]: I0311 12:46:04.524708 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553886-xjtpg" Mar 11 12:46:04 crc kubenswrapper[4816]: I0311 12:46:04.683641 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sx444\" (UniqueName: \"kubernetes.io/projected/10793112-fba0-46e4-a3a5-201255a72221-kube-api-access-sx444\") pod \"10793112-fba0-46e4-a3a5-201255a72221\" (UID: \"10793112-fba0-46e4-a3a5-201255a72221\") " Mar 11 12:46:04 crc kubenswrapper[4816]: I0311 12:46:04.693793 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10793112-fba0-46e4-a3a5-201255a72221-kube-api-access-sx444" (OuterVolumeSpecName: "kube-api-access-sx444") pod "10793112-fba0-46e4-a3a5-201255a72221" (UID: "10793112-fba0-46e4-a3a5-201255a72221"). 
InnerVolumeSpecName "kube-api-access-sx444". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:46:04 crc kubenswrapper[4816]: I0311 12:46:04.785478 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sx444\" (UniqueName: \"kubernetes.io/projected/10793112-fba0-46e4-a3a5-201255a72221-kube-api-access-sx444\") on node \"crc\" DevicePath \"\"" Mar 11 12:46:05 crc kubenswrapper[4816]: I0311 12:46:05.202163 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553886-xjtpg" event={"ID":"10793112-fba0-46e4-a3a5-201255a72221","Type":"ContainerDied","Data":"9a1ea2f5c8b9c9d2168bc35813243f641fb2fd4e1b93f51a50c3ed629bd728cd"} Mar 11 12:46:05 crc kubenswrapper[4816]: I0311 12:46:05.202215 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a1ea2f5c8b9c9d2168bc35813243f641fb2fd4e1b93f51a50c3ed629bd728cd" Mar 11 12:46:05 crc kubenswrapper[4816]: I0311 12:46:05.202332 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553886-xjtpg" Mar 11 12:46:05 crc kubenswrapper[4816]: I0311 12:46:05.275112 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553880-9ks6n"] Mar 11 12:46:05 crc kubenswrapper[4816]: I0311 12:46:05.282387 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553880-9ks6n"] Mar 11 12:46:06 crc kubenswrapper[4816]: I0311 12:46:06.143686 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e47f911d-5bf4-4923-a6ac-95e98217fd25" path="/var/lib/kubelet/pods/e47f911d-5bf4-4923-a6ac-95e98217fd25/volumes" Mar 11 12:46:35 crc kubenswrapper[4816]: I0311 12:46:35.476941 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4zzg5"] Mar 11 12:46:35 crc kubenswrapper[4816]: E0311 12:46:35.477875 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10793112-fba0-46e4-a3a5-201255a72221" containerName="oc" Mar 11 12:46:35 crc kubenswrapper[4816]: I0311 12:46:35.477887 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="10793112-fba0-46e4-a3a5-201255a72221" containerName="oc" Mar 11 12:46:35 crc kubenswrapper[4816]: I0311 12:46:35.478090 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="10793112-fba0-46e4-a3a5-201255a72221" containerName="oc" Mar 11 12:46:35 crc kubenswrapper[4816]: I0311 12:46:35.479126 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4zzg5" Mar 11 12:46:35 crc kubenswrapper[4816]: I0311 12:46:35.498492 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4zzg5"] Mar 11 12:46:35 crc kubenswrapper[4816]: I0311 12:46:35.564221 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d271092-56ec-48d0-91cc-8aab4b87d282-utilities\") pod \"redhat-marketplace-4zzg5\" (UID: \"0d271092-56ec-48d0-91cc-8aab4b87d282\") " pod="openshift-marketplace/redhat-marketplace-4zzg5" Mar 11 12:46:35 crc kubenswrapper[4816]: I0311 12:46:35.564322 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d271092-56ec-48d0-91cc-8aab4b87d282-catalog-content\") pod \"redhat-marketplace-4zzg5\" (UID: \"0d271092-56ec-48d0-91cc-8aab4b87d282\") " pod="openshift-marketplace/redhat-marketplace-4zzg5" Mar 11 12:46:35 crc kubenswrapper[4816]: I0311 12:46:35.564382 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjbkf\" (UniqueName: \"kubernetes.io/projected/0d271092-56ec-48d0-91cc-8aab4b87d282-kube-api-access-pjbkf\") pod \"redhat-marketplace-4zzg5\" (UID: \"0d271092-56ec-48d0-91cc-8aab4b87d282\") " pod="openshift-marketplace/redhat-marketplace-4zzg5" Mar 11 12:46:35 crc kubenswrapper[4816]: I0311 12:46:35.665320 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d271092-56ec-48d0-91cc-8aab4b87d282-catalog-content\") pod \"redhat-marketplace-4zzg5\" (UID: \"0d271092-56ec-48d0-91cc-8aab4b87d282\") " pod="openshift-marketplace/redhat-marketplace-4zzg5" Mar 11 12:46:35 crc kubenswrapper[4816]: I0311 12:46:35.665418 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-pjbkf\" (UniqueName: \"kubernetes.io/projected/0d271092-56ec-48d0-91cc-8aab4b87d282-kube-api-access-pjbkf\") pod \"redhat-marketplace-4zzg5\" (UID: \"0d271092-56ec-48d0-91cc-8aab4b87d282\") " pod="openshift-marketplace/redhat-marketplace-4zzg5" Mar 11 12:46:35 crc kubenswrapper[4816]: I0311 12:46:35.665470 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d271092-56ec-48d0-91cc-8aab4b87d282-utilities\") pod \"redhat-marketplace-4zzg5\" (UID: \"0d271092-56ec-48d0-91cc-8aab4b87d282\") " pod="openshift-marketplace/redhat-marketplace-4zzg5" Mar 11 12:46:35 crc kubenswrapper[4816]: I0311 12:46:35.666371 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d271092-56ec-48d0-91cc-8aab4b87d282-utilities\") pod \"redhat-marketplace-4zzg5\" (UID: \"0d271092-56ec-48d0-91cc-8aab4b87d282\") " pod="openshift-marketplace/redhat-marketplace-4zzg5" Mar 11 12:46:35 crc kubenswrapper[4816]: I0311 12:46:35.666427 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d271092-56ec-48d0-91cc-8aab4b87d282-catalog-content\") pod \"redhat-marketplace-4zzg5\" (UID: \"0d271092-56ec-48d0-91cc-8aab4b87d282\") " pod="openshift-marketplace/redhat-marketplace-4zzg5" Mar 11 12:46:35 crc kubenswrapper[4816]: I0311 12:46:35.689808 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjbkf\" (UniqueName: \"kubernetes.io/projected/0d271092-56ec-48d0-91cc-8aab4b87d282-kube-api-access-pjbkf\") pod \"redhat-marketplace-4zzg5\" (UID: \"0d271092-56ec-48d0-91cc-8aab4b87d282\") " pod="openshift-marketplace/redhat-marketplace-4zzg5" Mar 11 12:46:35 crc kubenswrapper[4816]: I0311 12:46:35.797435 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4zzg5" Mar 11 12:46:36 crc kubenswrapper[4816]: I0311 12:46:36.241929 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4zzg5"] Mar 11 12:46:36 crc kubenswrapper[4816]: I0311 12:46:36.454258 4816 generic.go:334] "Generic (PLEG): container finished" podID="0d271092-56ec-48d0-91cc-8aab4b87d282" containerID="f6b2d977c9a8f96c182b5aea52589da6a3ff9423af84bdd7258a6894677a528b" exitCode=0 Mar 11 12:46:36 crc kubenswrapper[4816]: I0311 12:46:36.454534 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4zzg5" event={"ID":"0d271092-56ec-48d0-91cc-8aab4b87d282","Type":"ContainerDied","Data":"f6b2d977c9a8f96c182b5aea52589da6a3ff9423af84bdd7258a6894677a528b"} Mar 11 12:46:36 crc kubenswrapper[4816]: I0311 12:46:36.454575 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4zzg5" event={"ID":"0d271092-56ec-48d0-91cc-8aab4b87d282","Type":"ContainerStarted","Data":"d12981d070039e7ac78396e563a248bc2a041f1a7cbd0cdc1c1468d7a68e045e"} Mar 11 12:46:38 crc kubenswrapper[4816]: I0311 12:46:38.473069 4816 generic.go:334] "Generic (PLEG): container finished" podID="0d271092-56ec-48d0-91cc-8aab4b87d282" containerID="45ccac66208f152eef9f7de56486330f3b49767793c8f3e6f8573af0a43235b6" exitCode=0 Mar 11 12:46:38 crc kubenswrapper[4816]: I0311 12:46:38.473133 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4zzg5" event={"ID":"0d271092-56ec-48d0-91cc-8aab4b87d282","Type":"ContainerDied","Data":"45ccac66208f152eef9f7de56486330f3b49767793c8f3e6f8573af0a43235b6"} Mar 11 12:46:39 crc kubenswrapper[4816]: I0311 12:46:39.483849 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4zzg5" 
event={"ID":"0d271092-56ec-48d0-91cc-8aab4b87d282","Type":"ContainerStarted","Data":"e6418086a14d782aacd990a6d4c75501a1835fefb3809c2d1c6ed5f684d82f37"} Mar 11 12:46:39 crc kubenswrapper[4816]: I0311 12:46:39.506896 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4zzg5" podStartSLOduration=1.973589633 podStartE2EDuration="4.506861611s" podCreationTimestamp="2026-03-11 12:46:35 +0000 UTC" firstStartedPulling="2026-03-11 12:46:36.47103144 +0000 UTC m=+2883.062295397" lastFinishedPulling="2026-03-11 12:46:39.004303408 +0000 UTC m=+2885.595567375" observedRunningTime="2026-03-11 12:46:39.500676774 +0000 UTC m=+2886.091940741" watchObservedRunningTime="2026-03-11 12:46:39.506861611 +0000 UTC m=+2886.098125578" Mar 11 12:46:45 crc kubenswrapper[4816]: I0311 12:46:45.798463 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4zzg5" Mar 11 12:46:45 crc kubenswrapper[4816]: I0311 12:46:45.799144 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4zzg5" Mar 11 12:46:45 crc kubenswrapper[4816]: I0311 12:46:45.841087 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4zzg5" Mar 11 12:46:46 crc kubenswrapper[4816]: I0311 12:46:46.610574 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4zzg5" Mar 11 12:46:46 crc kubenswrapper[4816]: I0311 12:46:46.670628 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4zzg5"] Mar 11 12:46:48 crc kubenswrapper[4816]: I0311 12:46:48.322748 4816 scope.go:117] "RemoveContainer" containerID="f2c8244acc6c0aed95c31a23f8089006b34d5b7db0dcdad32b1e6365dd4fd124" Mar 11 12:46:48 crc kubenswrapper[4816]: I0311 12:46:48.557390 4816 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-marketplace/redhat-marketplace-4zzg5" podUID="0d271092-56ec-48d0-91cc-8aab4b87d282" containerName="registry-server" containerID="cri-o://e6418086a14d782aacd990a6d4c75501a1835fefb3809c2d1c6ed5f684d82f37" gracePeriod=2 Mar 11 12:46:48 crc kubenswrapper[4816]: I0311 12:46:48.956367 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4zzg5" Mar 11 12:46:49 crc kubenswrapper[4816]: I0311 12:46:49.064809 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d271092-56ec-48d0-91cc-8aab4b87d282-catalog-content\") pod \"0d271092-56ec-48d0-91cc-8aab4b87d282\" (UID: \"0d271092-56ec-48d0-91cc-8aab4b87d282\") " Mar 11 12:46:49 crc kubenswrapper[4816]: I0311 12:46:49.064933 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjbkf\" (UniqueName: \"kubernetes.io/projected/0d271092-56ec-48d0-91cc-8aab4b87d282-kube-api-access-pjbkf\") pod \"0d271092-56ec-48d0-91cc-8aab4b87d282\" (UID: \"0d271092-56ec-48d0-91cc-8aab4b87d282\") " Mar 11 12:46:49 crc kubenswrapper[4816]: I0311 12:46:49.065076 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d271092-56ec-48d0-91cc-8aab4b87d282-utilities\") pod \"0d271092-56ec-48d0-91cc-8aab4b87d282\" (UID: \"0d271092-56ec-48d0-91cc-8aab4b87d282\") " Mar 11 12:46:49 crc kubenswrapper[4816]: I0311 12:46:49.066213 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d271092-56ec-48d0-91cc-8aab4b87d282-utilities" (OuterVolumeSpecName: "utilities") pod "0d271092-56ec-48d0-91cc-8aab4b87d282" (UID: "0d271092-56ec-48d0-91cc-8aab4b87d282"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:46:49 crc kubenswrapper[4816]: I0311 12:46:49.075705 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d271092-56ec-48d0-91cc-8aab4b87d282-kube-api-access-pjbkf" (OuterVolumeSpecName: "kube-api-access-pjbkf") pod "0d271092-56ec-48d0-91cc-8aab4b87d282" (UID: "0d271092-56ec-48d0-91cc-8aab4b87d282"). InnerVolumeSpecName "kube-api-access-pjbkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:46:49 crc kubenswrapper[4816]: I0311 12:46:49.093014 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d271092-56ec-48d0-91cc-8aab4b87d282-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0d271092-56ec-48d0-91cc-8aab4b87d282" (UID: "0d271092-56ec-48d0-91cc-8aab4b87d282"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:46:49 crc kubenswrapper[4816]: I0311 12:46:49.166837 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjbkf\" (UniqueName: \"kubernetes.io/projected/0d271092-56ec-48d0-91cc-8aab4b87d282-kube-api-access-pjbkf\") on node \"crc\" DevicePath \"\"" Mar 11 12:46:49 crc kubenswrapper[4816]: I0311 12:46:49.167186 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d271092-56ec-48d0-91cc-8aab4b87d282-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 12:46:49 crc kubenswrapper[4816]: I0311 12:46:49.167274 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d271092-56ec-48d0-91cc-8aab4b87d282-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 12:46:49 crc kubenswrapper[4816]: I0311 12:46:49.566975 4816 generic.go:334] "Generic (PLEG): container finished" podID="0d271092-56ec-48d0-91cc-8aab4b87d282" 
containerID="e6418086a14d782aacd990a6d4c75501a1835fefb3809c2d1c6ed5f684d82f37" exitCode=0 Mar 11 12:46:49 crc kubenswrapper[4816]: I0311 12:46:49.567039 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4zzg5" event={"ID":"0d271092-56ec-48d0-91cc-8aab4b87d282","Type":"ContainerDied","Data":"e6418086a14d782aacd990a6d4c75501a1835fefb3809c2d1c6ed5f684d82f37"} Mar 11 12:46:49 crc kubenswrapper[4816]: I0311 12:46:49.567072 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4zzg5" event={"ID":"0d271092-56ec-48d0-91cc-8aab4b87d282","Type":"ContainerDied","Data":"d12981d070039e7ac78396e563a248bc2a041f1a7cbd0cdc1c1468d7a68e045e"} Mar 11 12:46:49 crc kubenswrapper[4816]: I0311 12:46:49.567095 4816 scope.go:117] "RemoveContainer" containerID="e6418086a14d782aacd990a6d4c75501a1835fefb3809c2d1c6ed5f684d82f37" Mar 11 12:46:49 crc kubenswrapper[4816]: I0311 12:46:49.567299 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4zzg5" Mar 11 12:46:49 crc kubenswrapper[4816]: I0311 12:46:49.585290 4816 scope.go:117] "RemoveContainer" containerID="45ccac66208f152eef9f7de56486330f3b49767793c8f3e6f8573af0a43235b6" Mar 11 12:46:49 crc kubenswrapper[4816]: I0311 12:46:49.598951 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4zzg5"] Mar 11 12:46:49 crc kubenswrapper[4816]: I0311 12:46:49.606127 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4zzg5"] Mar 11 12:46:49 crc kubenswrapper[4816]: I0311 12:46:49.626067 4816 scope.go:117] "RemoveContainer" containerID="f6b2d977c9a8f96c182b5aea52589da6a3ff9423af84bdd7258a6894677a528b" Mar 11 12:46:49 crc kubenswrapper[4816]: I0311 12:46:49.647530 4816 scope.go:117] "RemoveContainer" containerID="e6418086a14d782aacd990a6d4c75501a1835fefb3809c2d1c6ed5f684d82f37" Mar 11 12:46:49 crc kubenswrapper[4816]: E0311 12:46:49.648053 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6418086a14d782aacd990a6d4c75501a1835fefb3809c2d1c6ed5f684d82f37\": container with ID starting with e6418086a14d782aacd990a6d4c75501a1835fefb3809c2d1c6ed5f684d82f37 not found: ID does not exist" containerID="e6418086a14d782aacd990a6d4c75501a1835fefb3809c2d1c6ed5f684d82f37" Mar 11 12:46:49 crc kubenswrapper[4816]: I0311 12:46:49.648082 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6418086a14d782aacd990a6d4c75501a1835fefb3809c2d1c6ed5f684d82f37"} err="failed to get container status \"e6418086a14d782aacd990a6d4c75501a1835fefb3809c2d1c6ed5f684d82f37\": rpc error: code = NotFound desc = could not find container \"e6418086a14d782aacd990a6d4c75501a1835fefb3809c2d1c6ed5f684d82f37\": container with ID starting with e6418086a14d782aacd990a6d4c75501a1835fefb3809c2d1c6ed5f684d82f37 not found: 
ID does not exist" Mar 11 12:46:49 crc kubenswrapper[4816]: I0311 12:46:49.648118 4816 scope.go:117] "RemoveContainer" containerID="45ccac66208f152eef9f7de56486330f3b49767793c8f3e6f8573af0a43235b6" Mar 11 12:46:49 crc kubenswrapper[4816]: E0311 12:46:49.648530 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45ccac66208f152eef9f7de56486330f3b49767793c8f3e6f8573af0a43235b6\": container with ID starting with 45ccac66208f152eef9f7de56486330f3b49767793c8f3e6f8573af0a43235b6 not found: ID does not exist" containerID="45ccac66208f152eef9f7de56486330f3b49767793c8f3e6f8573af0a43235b6" Mar 11 12:46:49 crc kubenswrapper[4816]: I0311 12:46:49.648547 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45ccac66208f152eef9f7de56486330f3b49767793c8f3e6f8573af0a43235b6"} err="failed to get container status \"45ccac66208f152eef9f7de56486330f3b49767793c8f3e6f8573af0a43235b6\": rpc error: code = NotFound desc = could not find container \"45ccac66208f152eef9f7de56486330f3b49767793c8f3e6f8573af0a43235b6\": container with ID starting with 45ccac66208f152eef9f7de56486330f3b49767793c8f3e6f8573af0a43235b6 not found: ID does not exist" Mar 11 12:46:49 crc kubenswrapper[4816]: I0311 12:46:49.648560 4816 scope.go:117] "RemoveContainer" containerID="f6b2d977c9a8f96c182b5aea52589da6a3ff9423af84bdd7258a6894677a528b" Mar 11 12:46:49 crc kubenswrapper[4816]: E0311 12:46:49.648821 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6b2d977c9a8f96c182b5aea52589da6a3ff9423af84bdd7258a6894677a528b\": container with ID starting with f6b2d977c9a8f96c182b5aea52589da6a3ff9423af84bdd7258a6894677a528b not found: ID does not exist" containerID="f6b2d977c9a8f96c182b5aea52589da6a3ff9423af84bdd7258a6894677a528b" Mar 11 12:46:49 crc kubenswrapper[4816]: I0311 12:46:49.648839 4816 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6b2d977c9a8f96c182b5aea52589da6a3ff9423af84bdd7258a6894677a528b"} err="failed to get container status \"f6b2d977c9a8f96c182b5aea52589da6a3ff9423af84bdd7258a6894677a528b\": rpc error: code = NotFound desc = could not find container \"f6b2d977c9a8f96c182b5aea52589da6a3ff9423af84bdd7258a6894677a528b\": container with ID starting with f6b2d977c9a8f96c182b5aea52589da6a3ff9423af84bdd7258a6894677a528b not found: ID does not exist" Mar 11 12:46:50 crc kubenswrapper[4816]: I0311 12:46:50.146503 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d271092-56ec-48d0-91cc-8aab4b87d282" path="/var/lib/kubelet/pods/0d271092-56ec-48d0-91cc-8aab4b87d282/volumes" Mar 11 12:47:09 crc kubenswrapper[4816]: I0311 12:47:09.515582 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:47:09 crc kubenswrapper[4816]: I0311 12:47:09.517621 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:47:39 crc kubenswrapper[4816]: I0311 12:47:39.514717 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:47:39 crc kubenswrapper[4816]: I0311 12:47:39.515382 4816 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:48:00 crc kubenswrapper[4816]: I0311 12:48:00.153235 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553888-7sw8h"] Mar 11 12:48:00 crc kubenswrapper[4816]: E0311 12:48:00.154223 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d271092-56ec-48d0-91cc-8aab4b87d282" containerName="extract-content" Mar 11 12:48:00 crc kubenswrapper[4816]: I0311 12:48:00.154270 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d271092-56ec-48d0-91cc-8aab4b87d282" containerName="extract-content" Mar 11 12:48:00 crc kubenswrapper[4816]: E0311 12:48:00.154305 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d271092-56ec-48d0-91cc-8aab4b87d282" containerName="extract-utilities" Mar 11 12:48:00 crc kubenswrapper[4816]: I0311 12:48:00.154318 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d271092-56ec-48d0-91cc-8aab4b87d282" containerName="extract-utilities" Mar 11 12:48:00 crc kubenswrapper[4816]: E0311 12:48:00.154358 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d271092-56ec-48d0-91cc-8aab4b87d282" containerName="registry-server" Mar 11 12:48:00 crc kubenswrapper[4816]: I0311 12:48:00.154372 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d271092-56ec-48d0-91cc-8aab4b87d282" containerName="registry-server" Mar 11 12:48:00 crc kubenswrapper[4816]: I0311 12:48:00.154582 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d271092-56ec-48d0-91cc-8aab4b87d282" containerName="registry-server" Mar 11 12:48:00 crc kubenswrapper[4816]: I0311 12:48:00.155379 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553888-7sw8h" Mar 11 12:48:00 crc kubenswrapper[4816]: I0311 12:48:00.160858 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 12:48:00 crc kubenswrapper[4816]: I0311 12:48:00.160995 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 12:48:00 crc kubenswrapper[4816]: I0311 12:48:00.161147 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 12:48:00 crc kubenswrapper[4816]: I0311 12:48:00.161995 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553888-7sw8h"] Mar 11 12:48:00 crc kubenswrapper[4816]: I0311 12:48:00.289401 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7fcn\" (UniqueName: \"kubernetes.io/projected/f3a5e0fe-c52b-4b6f-ab13-ba73fce64177-kube-api-access-h7fcn\") pod \"auto-csr-approver-29553888-7sw8h\" (UID: \"f3a5e0fe-c52b-4b6f-ab13-ba73fce64177\") " pod="openshift-infra/auto-csr-approver-29553888-7sw8h" Mar 11 12:48:00 crc kubenswrapper[4816]: I0311 12:48:00.391540 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7fcn\" (UniqueName: \"kubernetes.io/projected/f3a5e0fe-c52b-4b6f-ab13-ba73fce64177-kube-api-access-h7fcn\") pod \"auto-csr-approver-29553888-7sw8h\" (UID: \"f3a5e0fe-c52b-4b6f-ab13-ba73fce64177\") " pod="openshift-infra/auto-csr-approver-29553888-7sw8h" Mar 11 12:48:00 crc kubenswrapper[4816]: I0311 12:48:00.410589 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7fcn\" (UniqueName: \"kubernetes.io/projected/f3a5e0fe-c52b-4b6f-ab13-ba73fce64177-kube-api-access-h7fcn\") pod \"auto-csr-approver-29553888-7sw8h\" (UID: \"f3a5e0fe-c52b-4b6f-ab13-ba73fce64177\") " 
pod="openshift-infra/auto-csr-approver-29553888-7sw8h" Mar 11 12:48:00 crc kubenswrapper[4816]: I0311 12:48:00.484572 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553888-7sw8h" Mar 11 12:48:00 crc kubenswrapper[4816]: I0311 12:48:00.954395 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553888-7sw8h"] Mar 11 12:48:01 crc kubenswrapper[4816]: I0311 12:48:01.225508 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553888-7sw8h" event={"ID":"f3a5e0fe-c52b-4b6f-ab13-ba73fce64177","Type":"ContainerStarted","Data":"37e212f532f9285ddc8c1a39daa606e585f77369c6470ebdc810b0fab9c9bb7c"} Mar 11 12:48:03 crc kubenswrapper[4816]: I0311 12:48:03.246901 4816 generic.go:334] "Generic (PLEG): container finished" podID="f3a5e0fe-c52b-4b6f-ab13-ba73fce64177" containerID="9ed1a5e43552ff0476bd301f6f56de7c0e4f936f582bd894ea6e5569ba2db74d" exitCode=0 Mar 11 12:48:03 crc kubenswrapper[4816]: I0311 12:48:03.246996 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553888-7sw8h" event={"ID":"f3a5e0fe-c52b-4b6f-ab13-ba73fce64177","Type":"ContainerDied","Data":"9ed1a5e43552ff0476bd301f6f56de7c0e4f936f582bd894ea6e5569ba2db74d"} Mar 11 12:48:04 crc kubenswrapper[4816]: I0311 12:48:04.635336 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553888-7sw8h" Mar 11 12:48:04 crc kubenswrapper[4816]: I0311 12:48:04.698809 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7fcn\" (UniqueName: \"kubernetes.io/projected/f3a5e0fe-c52b-4b6f-ab13-ba73fce64177-kube-api-access-h7fcn\") pod \"f3a5e0fe-c52b-4b6f-ab13-ba73fce64177\" (UID: \"f3a5e0fe-c52b-4b6f-ab13-ba73fce64177\") " Mar 11 12:48:04 crc kubenswrapper[4816]: I0311 12:48:04.707870 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3a5e0fe-c52b-4b6f-ab13-ba73fce64177-kube-api-access-h7fcn" (OuterVolumeSpecName: "kube-api-access-h7fcn") pod "f3a5e0fe-c52b-4b6f-ab13-ba73fce64177" (UID: "f3a5e0fe-c52b-4b6f-ab13-ba73fce64177"). InnerVolumeSpecName "kube-api-access-h7fcn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:48:04 crc kubenswrapper[4816]: I0311 12:48:04.801230 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7fcn\" (UniqueName: \"kubernetes.io/projected/f3a5e0fe-c52b-4b6f-ab13-ba73fce64177-kube-api-access-h7fcn\") on node \"crc\" DevicePath \"\"" Mar 11 12:48:05 crc kubenswrapper[4816]: I0311 12:48:05.272746 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553888-7sw8h" Mar 11 12:48:05 crc kubenswrapper[4816]: I0311 12:48:05.272686 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553888-7sw8h" event={"ID":"f3a5e0fe-c52b-4b6f-ab13-ba73fce64177","Type":"ContainerDied","Data":"37e212f532f9285ddc8c1a39daa606e585f77369c6470ebdc810b0fab9c9bb7c"} Mar 11 12:48:05 crc kubenswrapper[4816]: I0311 12:48:05.273031 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37e212f532f9285ddc8c1a39daa606e585f77369c6470ebdc810b0fab9c9bb7c" Mar 11 12:48:05 crc kubenswrapper[4816]: I0311 12:48:05.728740 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553882-ggqkk"] Mar 11 12:48:05 crc kubenswrapper[4816]: I0311 12:48:05.737078 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553882-ggqkk"] Mar 11 12:48:06 crc kubenswrapper[4816]: I0311 12:48:06.140727 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a88e2af-5e7d-4491-a32f-75a670aed689" path="/var/lib/kubelet/pods/5a88e2af-5e7d-4491-a32f-75a670aed689/volumes" Mar 11 12:48:09 crc kubenswrapper[4816]: I0311 12:48:09.515895 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:48:09 crc kubenswrapper[4816]: I0311 12:48:09.516587 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:48:09 crc 
kubenswrapper[4816]: I0311 12:48:09.516656 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:48:09 crc kubenswrapper[4816]: I0311 12:48:09.517567 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334"} pod="openshift-machine-config-operator/machine-config-daemon-b4v82" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 12:48:09 crc kubenswrapper[4816]: I0311 12:48:09.517649 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" containerID="cri-o://64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334" gracePeriod=600 Mar 11 12:48:09 crc kubenswrapper[4816]: E0311 12:48:09.643213 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:48:10 crc kubenswrapper[4816]: I0311 12:48:10.331642 4816 generic.go:334] "Generic (PLEG): container finished" podID="7fdff21c-644f-4443-a268-f98c91ea120a" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334" exitCode=0 Mar 11 12:48:10 crc kubenswrapper[4816]: I0311 12:48:10.331723 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" 
event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerDied","Data":"64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334"} Mar 11 12:48:10 crc kubenswrapper[4816]: I0311 12:48:10.331814 4816 scope.go:117] "RemoveContainer" containerID="e94d54c6dd2b7a4e577e03c8b08cf5eb1a8a362732b731a0d82ddf5cdc9d6211" Mar 11 12:48:10 crc kubenswrapper[4816]: I0311 12:48:10.332724 4816 scope.go:117] "RemoveContainer" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334" Mar 11 12:48:10 crc kubenswrapper[4816]: E0311 12:48:10.333297 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:48:21 crc kubenswrapper[4816]: I0311 12:48:21.131174 4816 scope.go:117] "RemoveContainer" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334" Mar 11 12:48:21 crc kubenswrapper[4816]: E0311 12:48:21.132349 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:48:32 crc kubenswrapper[4816]: I0311 12:48:32.131224 4816 scope.go:117] "RemoveContainer" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334" Mar 11 12:48:32 crc kubenswrapper[4816]: E0311 12:48:32.132290 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:48:44 crc kubenswrapper[4816]: I0311 12:48:44.143330 4816 scope.go:117] "RemoveContainer" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334" Mar 11 12:48:44 crc kubenswrapper[4816]: E0311 12:48:44.144369 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:48:46 crc kubenswrapper[4816]: I0311 12:48:46.650354 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-82vz9"] Mar 11 12:48:46 crc kubenswrapper[4816]: E0311 12:48:46.650969 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3a5e0fe-c52b-4b6f-ab13-ba73fce64177" containerName="oc" Mar 11 12:48:46 crc kubenswrapper[4816]: I0311 12:48:46.651006 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3a5e0fe-c52b-4b6f-ab13-ba73fce64177" containerName="oc" Mar 11 12:48:46 crc kubenswrapper[4816]: I0311 12:48:46.651400 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3a5e0fe-c52b-4b6f-ab13-ba73fce64177" containerName="oc" Mar 11 12:48:46 crc kubenswrapper[4816]: I0311 12:48:46.654655 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-82vz9" Mar 11 12:48:46 crc kubenswrapper[4816]: I0311 12:48:46.667583 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-82vz9"] Mar 11 12:48:46 crc kubenswrapper[4816]: I0311 12:48:46.833084 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b81c3bf-499d-48bd-869b-671fefa1ba81-utilities\") pod \"certified-operators-82vz9\" (UID: \"2b81c3bf-499d-48bd-869b-671fefa1ba81\") " pod="openshift-marketplace/certified-operators-82vz9" Mar 11 12:48:46 crc kubenswrapper[4816]: I0311 12:48:46.833188 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b81c3bf-499d-48bd-869b-671fefa1ba81-catalog-content\") pod \"certified-operators-82vz9\" (UID: \"2b81c3bf-499d-48bd-869b-671fefa1ba81\") " pod="openshift-marketplace/certified-operators-82vz9" Mar 11 12:48:46 crc kubenswrapper[4816]: I0311 12:48:46.833219 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgptm\" (UniqueName: \"kubernetes.io/projected/2b81c3bf-499d-48bd-869b-671fefa1ba81-kube-api-access-dgptm\") pod \"certified-operators-82vz9\" (UID: \"2b81c3bf-499d-48bd-869b-671fefa1ba81\") " pod="openshift-marketplace/certified-operators-82vz9" Mar 11 12:48:46 crc kubenswrapper[4816]: I0311 12:48:46.934497 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b81c3bf-499d-48bd-869b-671fefa1ba81-catalog-content\") pod \"certified-operators-82vz9\" (UID: \"2b81c3bf-499d-48bd-869b-671fefa1ba81\") " pod="openshift-marketplace/certified-operators-82vz9" Mar 11 12:48:46 crc kubenswrapper[4816]: I0311 12:48:46.934877 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dgptm\" (UniqueName: \"kubernetes.io/projected/2b81c3bf-499d-48bd-869b-671fefa1ba81-kube-api-access-dgptm\") pod \"certified-operators-82vz9\" (UID: \"2b81c3bf-499d-48bd-869b-671fefa1ba81\") " pod="openshift-marketplace/certified-operators-82vz9" Mar 11 12:48:46 crc kubenswrapper[4816]: I0311 12:48:46.934965 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b81c3bf-499d-48bd-869b-671fefa1ba81-utilities\") pod \"certified-operators-82vz9\" (UID: \"2b81c3bf-499d-48bd-869b-671fefa1ba81\") " pod="openshift-marketplace/certified-operators-82vz9" Mar 11 12:48:46 crc kubenswrapper[4816]: I0311 12:48:46.935916 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b81c3bf-499d-48bd-869b-671fefa1ba81-catalog-content\") pod \"certified-operators-82vz9\" (UID: \"2b81c3bf-499d-48bd-869b-671fefa1ba81\") " pod="openshift-marketplace/certified-operators-82vz9" Mar 11 12:48:46 crc kubenswrapper[4816]: I0311 12:48:46.935972 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b81c3bf-499d-48bd-869b-671fefa1ba81-utilities\") pod \"certified-operators-82vz9\" (UID: \"2b81c3bf-499d-48bd-869b-671fefa1ba81\") " pod="openshift-marketplace/certified-operators-82vz9" Mar 11 12:48:46 crc kubenswrapper[4816]: I0311 12:48:46.962649 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgptm\" (UniqueName: \"kubernetes.io/projected/2b81c3bf-499d-48bd-869b-671fefa1ba81-kube-api-access-dgptm\") pod \"certified-operators-82vz9\" (UID: \"2b81c3bf-499d-48bd-869b-671fefa1ba81\") " pod="openshift-marketplace/certified-operators-82vz9" Mar 11 12:48:46 crc kubenswrapper[4816]: I0311 12:48:46.986186 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-82vz9" Mar 11 12:48:47 crc kubenswrapper[4816]: I0311 12:48:47.280354 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-82vz9"] Mar 11 12:48:47 crc kubenswrapper[4816]: I0311 12:48:47.664901 4816 generic.go:334] "Generic (PLEG): container finished" podID="2b81c3bf-499d-48bd-869b-671fefa1ba81" containerID="ed8a3257cbd0ffba17d5ca43261f294047c9f3f8158d95c91ccd21053221242c" exitCode=0 Mar 11 12:48:47 crc kubenswrapper[4816]: I0311 12:48:47.664974 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-82vz9" event={"ID":"2b81c3bf-499d-48bd-869b-671fefa1ba81","Type":"ContainerDied","Data":"ed8a3257cbd0ffba17d5ca43261f294047c9f3f8158d95c91ccd21053221242c"} Mar 11 12:48:47 crc kubenswrapper[4816]: I0311 12:48:47.665053 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-82vz9" event={"ID":"2b81c3bf-499d-48bd-869b-671fefa1ba81","Type":"ContainerStarted","Data":"60a4e46c53d3c2b51ed6586a5baf5e215ae1a345c51ceb946a1a2ab500544aa7"} Mar 11 12:48:48 crc kubenswrapper[4816]: I0311 12:48:48.424662 4816 scope.go:117] "RemoveContainer" containerID="c588ba0a9276d85151be0b86106d7b0f7a77bf5bc78e6ea0213f1a19b8ad671f" Mar 11 12:48:52 crc kubenswrapper[4816]: I0311 12:48:52.704469 4816 generic.go:334] "Generic (PLEG): container finished" podID="2b81c3bf-499d-48bd-869b-671fefa1ba81" containerID="607956a270a77817bcccd9307e1598f9d1114a7f70ea08d76b9c9c5dbadf188e" exitCode=0 Mar 11 12:48:52 crc kubenswrapper[4816]: I0311 12:48:52.704604 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-82vz9" event={"ID":"2b81c3bf-499d-48bd-869b-671fefa1ba81","Type":"ContainerDied","Data":"607956a270a77817bcccd9307e1598f9d1114a7f70ea08d76b9c9c5dbadf188e"} Mar 11 12:48:53 crc kubenswrapper[4816]: I0311 12:48:53.718170 4816 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-82vz9" event={"ID":"2b81c3bf-499d-48bd-869b-671fefa1ba81","Type":"ContainerStarted","Data":"487c087a8e1036366e59815f08b96a73c3dd59e78d6a2028fce0c092499692a1"} Mar 11 12:48:53 crc kubenswrapper[4816]: I0311 12:48:53.750477 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-82vz9" podStartSLOduration=2.222062042 podStartE2EDuration="7.750448805s" podCreationTimestamp="2026-03-11 12:48:46 +0000 UTC" firstStartedPulling="2026-03-11 12:48:47.66645674 +0000 UTC m=+3014.257720707" lastFinishedPulling="2026-03-11 12:48:53.194843463 +0000 UTC m=+3019.786107470" observedRunningTime="2026-03-11 12:48:53.744089483 +0000 UTC m=+3020.335353470" watchObservedRunningTime="2026-03-11 12:48:53.750448805 +0000 UTC m=+3020.341712772" Mar 11 12:48:56 crc kubenswrapper[4816]: I0311 12:48:56.987197 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-82vz9" Mar 11 12:48:56 crc kubenswrapper[4816]: I0311 12:48:56.987683 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-82vz9" Mar 11 12:48:57 crc kubenswrapper[4816]: I0311 12:48:57.057812 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-82vz9" Mar 11 12:48:58 crc kubenswrapper[4816]: I0311 12:48:58.131882 4816 scope.go:117] "RemoveContainer" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334" Mar 11 12:48:58 crc kubenswrapper[4816]: E0311 12:48:58.132983 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.026928 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-82vz9" Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.103538 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-82vz9"] Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.204168 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wlx2d"] Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.204518 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wlx2d" podUID="d456b988-0480-49fc-9667-03c56b871abe" containerName="registry-server" containerID="cri-o://034af007a8b56505d39ecedb921ade4c270666769f440b8289cd57067df8758f" gracePeriod=2 Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.591597 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wlx2d" Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.774435 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d456b988-0480-49fc-9667-03c56b871abe-utilities\") pod \"d456b988-0480-49fc-9667-03c56b871abe\" (UID: \"d456b988-0480-49fc-9667-03c56b871abe\") " Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.774615 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfctf\" (UniqueName: \"kubernetes.io/projected/d456b988-0480-49fc-9667-03c56b871abe-kube-api-access-kfctf\") pod \"d456b988-0480-49fc-9667-03c56b871abe\" (UID: \"d456b988-0480-49fc-9667-03c56b871abe\") " Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.774723 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d456b988-0480-49fc-9667-03c56b871abe-catalog-content\") pod \"d456b988-0480-49fc-9667-03c56b871abe\" (UID: \"d456b988-0480-49fc-9667-03c56b871abe\") " Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.774947 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d456b988-0480-49fc-9667-03c56b871abe-utilities" (OuterVolumeSpecName: "utilities") pod "d456b988-0480-49fc-9667-03c56b871abe" (UID: "d456b988-0480-49fc-9667-03c56b871abe"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.775103 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d456b988-0480-49fc-9667-03c56b871abe-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.781985 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d456b988-0480-49fc-9667-03c56b871abe-kube-api-access-kfctf" (OuterVolumeSpecName: "kube-api-access-kfctf") pod "d456b988-0480-49fc-9667-03c56b871abe" (UID: "d456b988-0480-49fc-9667-03c56b871abe"). InnerVolumeSpecName "kube-api-access-kfctf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.830627 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d456b988-0480-49fc-9667-03c56b871abe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d456b988-0480-49fc-9667-03c56b871abe" (UID: "d456b988-0480-49fc-9667-03c56b871abe"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.842005 4816 generic.go:334] "Generic (PLEG): container finished" podID="d456b988-0480-49fc-9667-03c56b871abe" containerID="034af007a8b56505d39ecedb921ade4c270666769f440b8289cd57067df8758f" exitCode=0 Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.842041 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wlx2d" event={"ID":"d456b988-0480-49fc-9667-03c56b871abe","Type":"ContainerDied","Data":"034af007a8b56505d39ecedb921ade4c270666769f440b8289cd57067df8758f"} Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.842095 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wlx2d" event={"ID":"d456b988-0480-49fc-9667-03c56b871abe","Type":"ContainerDied","Data":"e45e20ab411467d564ca8cae5d6389cdfcd6bd45b4eceaf7b963fe5e1fca9258"} Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.842100 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wlx2d" Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.842121 4816 scope.go:117] "RemoveContainer" containerID="034af007a8b56505d39ecedb921ade4c270666769f440b8289cd57067df8758f" Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.867506 4816 scope.go:117] "RemoveContainer" containerID="6a05710b8cde6897fca0c639bd670ffbfb46d506d87933dfb04ba42f513fcb60" Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.876355 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfctf\" (UniqueName: \"kubernetes.io/projected/d456b988-0480-49fc-9667-03c56b871abe-kube-api-access-kfctf\") on node \"crc\" DevicePath \"\"" Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.876398 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d456b988-0480-49fc-9667-03c56b871abe-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.880364 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wlx2d"] Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.887072 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wlx2d"] Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.913692 4816 scope.go:117] "RemoveContainer" containerID="a3fc9fd4ff3edc6c5e4c1aac00b48ef8939da87b568fc529a3373bc8dbd6d7bf" Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.933203 4816 scope.go:117] "RemoveContainer" containerID="034af007a8b56505d39ecedb921ade4c270666769f440b8289cd57067df8758f" Mar 11 12:49:07 crc kubenswrapper[4816]: E0311 12:49:07.933791 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"034af007a8b56505d39ecedb921ade4c270666769f440b8289cd57067df8758f\": container with ID starting with 
034af007a8b56505d39ecedb921ade4c270666769f440b8289cd57067df8758f not found: ID does not exist" containerID="034af007a8b56505d39ecedb921ade4c270666769f440b8289cd57067df8758f" Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.933828 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"034af007a8b56505d39ecedb921ade4c270666769f440b8289cd57067df8758f"} err="failed to get container status \"034af007a8b56505d39ecedb921ade4c270666769f440b8289cd57067df8758f\": rpc error: code = NotFound desc = could not find container \"034af007a8b56505d39ecedb921ade4c270666769f440b8289cd57067df8758f\": container with ID starting with 034af007a8b56505d39ecedb921ade4c270666769f440b8289cd57067df8758f not found: ID does not exist" Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.933860 4816 scope.go:117] "RemoveContainer" containerID="6a05710b8cde6897fca0c639bd670ffbfb46d506d87933dfb04ba42f513fcb60" Mar 11 12:49:07 crc kubenswrapper[4816]: E0311 12:49:07.934058 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a05710b8cde6897fca0c639bd670ffbfb46d506d87933dfb04ba42f513fcb60\": container with ID starting with 6a05710b8cde6897fca0c639bd670ffbfb46d506d87933dfb04ba42f513fcb60 not found: ID does not exist" containerID="6a05710b8cde6897fca0c639bd670ffbfb46d506d87933dfb04ba42f513fcb60" Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.934081 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a05710b8cde6897fca0c639bd670ffbfb46d506d87933dfb04ba42f513fcb60"} err="failed to get container status \"6a05710b8cde6897fca0c639bd670ffbfb46d506d87933dfb04ba42f513fcb60\": rpc error: code = NotFound desc = could not find container \"6a05710b8cde6897fca0c639bd670ffbfb46d506d87933dfb04ba42f513fcb60\": container with ID starting with 6a05710b8cde6897fca0c639bd670ffbfb46d506d87933dfb04ba42f513fcb60 not found: ID does not 
exist" Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.934096 4816 scope.go:117] "RemoveContainer" containerID="a3fc9fd4ff3edc6c5e4c1aac00b48ef8939da87b568fc529a3373bc8dbd6d7bf" Mar 11 12:49:07 crc kubenswrapper[4816]: E0311 12:49:07.934305 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3fc9fd4ff3edc6c5e4c1aac00b48ef8939da87b568fc529a3373bc8dbd6d7bf\": container with ID starting with a3fc9fd4ff3edc6c5e4c1aac00b48ef8939da87b568fc529a3373bc8dbd6d7bf not found: ID does not exist" containerID="a3fc9fd4ff3edc6c5e4c1aac00b48ef8939da87b568fc529a3373bc8dbd6d7bf" Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.934333 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3fc9fd4ff3edc6c5e4c1aac00b48ef8939da87b568fc529a3373bc8dbd6d7bf"} err="failed to get container status \"a3fc9fd4ff3edc6c5e4c1aac00b48ef8939da87b568fc529a3373bc8dbd6d7bf\": rpc error: code = NotFound desc = could not find container \"a3fc9fd4ff3edc6c5e4c1aac00b48ef8939da87b568fc529a3373bc8dbd6d7bf\": container with ID starting with a3fc9fd4ff3edc6c5e4c1aac00b48ef8939da87b568fc529a3373bc8dbd6d7bf not found: ID does not exist" Mar 11 12:49:08 crc kubenswrapper[4816]: I0311 12:49:08.139284 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d456b988-0480-49fc-9667-03c56b871abe" path="/var/lib/kubelet/pods/d456b988-0480-49fc-9667-03c56b871abe/volumes" Mar 11 12:49:12 crc kubenswrapper[4816]: I0311 12:49:12.130835 4816 scope.go:117] "RemoveContainer" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334" Mar 11 12:49:12 crc kubenswrapper[4816]: E0311 12:49:12.131851 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:49:24 crc kubenswrapper[4816]: I0311 12:49:24.130827 4816 scope.go:117] "RemoveContainer" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334" Mar 11 12:49:24 crc kubenswrapper[4816]: E0311 12:49:24.131585 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:49:37 crc kubenswrapper[4816]: I0311 12:49:37.131684 4816 scope.go:117] "RemoveContainer" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334" Mar 11 12:49:37 crc kubenswrapper[4816]: E0311 12:49:37.133402 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:49:51 crc kubenswrapper[4816]: I0311 12:49:51.131764 4816 scope.go:117] "RemoveContainer" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334" Mar 11 12:49:51 crc kubenswrapper[4816]: E0311 12:49:51.132997 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:50:00 crc kubenswrapper[4816]: I0311 12:50:00.158515 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553890-4jjhs"] Mar 11 12:50:00 crc kubenswrapper[4816]: E0311 12:50:00.159517 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d456b988-0480-49fc-9667-03c56b871abe" containerName="extract-utilities" Mar 11 12:50:00 crc kubenswrapper[4816]: I0311 12:50:00.159540 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="d456b988-0480-49fc-9667-03c56b871abe" containerName="extract-utilities" Mar 11 12:50:00 crc kubenswrapper[4816]: E0311 12:50:00.159591 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d456b988-0480-49fc-9667-03c56b871abe" containerName="extract-content" Mar 11 12:50:00 crc kubenswrapper[4816]: I0311 12:50:00.159603 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="d456b988-0480-49fc-9667-03c56b871abe" containerName="extract-content" Mar 11 12:50:00 crc kubenswrapper[4816]: E0311 12:50:00.159630 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d456b988-0480-49fc-9667-03c56b871abe" containerName="registry-server" Mar 11 12:50:00 crc kubenswrapper[4816]: I0311 12:50:00.159641 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="d456b988-0480-49fc-9667-03c56b871abe" containerName="registry-server" Mar 11 12:50:00 crc kubenswrapper[4816]: I0311 12:50:00.159834 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="d456b988-0480-49fc-9667-03c56b871abe" containerName="registry-server" Mar 11 12:50:00 crc kubenswrapper[4816]: I0311 12:50:00.160492 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553890-4jjhs" Mar 11 12:50:00 crc kubenswrapper[4816]: I0311 12:50:00.163471 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 12:50:00 crc kubenswrapper[4816]: I0311 12:50:00.163587 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 12:50:00 crc kubenswrapper[4816]: I0311 12:50:00.163700 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 12:50:00 crc kubenswrapper[4816]: I0311 12:50:00.167097 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553890-4jjhs"] Mar 11 12:50:00 crc kubenswrapper[4816]: I0311 12:50:00.268060 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47jtb\" (UniqueName: \"kubernetes.io/projected/3ba83d28-3266-48ec-a66b-256e07e427c4-kube-api-access-47jtb\") pod \"auto-csr-approver-29553890-4jjhs\" (UID: \"3ba83d28-3266-48ec-a66b-256e07e427c4\") " pod="openshift-infra/auto-csr-approver-29553890-4jjhs" Mar 11 12:50:00 crc kubenswrapper[4816]: I0311 12:50:00.368885 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47jtb\" (UniqueName: \"kubernetes.io/projected/3ba83d28-3266-48ec-a66b-256e07e427c4-kube-api-access-47jtb\") pod \"auto-csr-approver-29553890-4jjhs\" (UID: \"3ba83d28-3266-48ec-a66b-256e07e427c4\") " pod="openshift-infra/auto-csr-approver-29553890-4jjhs" Mar 11 12:50:00 crc kubenswrapper[4816]: I0311 12:50:00.390109 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47jtb\" (UniqueName: \"kubernetes.io/projected/3ba83d28-3266-48ec-a66b-256e07e427c4-kube-api-access-47jtb\") pod \"auto-csr-approver-29553890-4jjhs\" (UID: \"3ba83d28-3266-48ec-a66b-256e07e427c4\") " 
pod="openshift-infra/auto-csr-approver-29553890-4jjhs" Mar 11 12:50:00 crc kubenswrapper[4816]: I0311 12:50:00.483290 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553890-4jjhs" Mar 11 12:50:00 crc kubenswrapper[4816]: I0311 12:50:00.905355 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553890-4jjhs"] Mar 11 12:50:01 crc kubenswrapper[4816]: I0311 12:50:01.303774 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553890-4jjhs" event={"ID":"3ba83d28-3266-48ec-a66b-256e07e427c4","Type":"ContainerStarted","Data":"276f6ef42121ac14bfdd12c54816b043fcc5d41dc9444b9b53c34e8ce6a3b4d0"} Mar 11 12:50:03 crc kubenswrapper[4816]: I0311 12:50:03.321791 4816 generic.go:334] "Generic (PLEG): container finished" podID="3ba83d28-3266-48ec-a66b-256e07e427c4" containerID="24d57408e6d94ff8c7de8f3b9883efb12d44570ac11551e03845bb25056d71b0" exitCode=0 Mar 11 12:50:03 crc kubenswrapper[4816]: I0311 12:50:03.321908 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553890-4jjhs" event={"ID":"3ba83d28-3266-48ec-a66b-256e07e427c4","Type":"ContainerDied","Data":"24d57408e6d94ff8c7de8f3b9883efb12d44570ac11551e03845bb25056d71b0"} Mar 11 12:50:04 crc kubenswrapper[4816]: I0311 12:50:04.136391 4816 scope.go:117] "RemoveContainer" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334" Mar 11 12:50:04 crc kubenswrapper[4816]: E0311 12:50:04.137075 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" 
Mar 11 12:50:04 crc kubenswrapper[4816]: I0311 12:50:04.647441 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553890-4jjhs" Mar 11 12:50:04 crc kubenswrapper[4816]: I0311 12:50:04.838272 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47jtb\" (UniqueName: \"kubernetes.io/projected/3ba83d28-3266-48ec-a66b-256e07e427c4-kube-api-access-47jtb\") pod \"3ba83d28-3266-48ec-a66b-256e07e427c4\" (UID: \"3ba83d28-3266-48ec-a66b-256e07e427c4\") " Mar 11 12:50:04 crc kubenswrapper[4816]: I0311 12:50:04.848499 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ba83d28-3266-48ec-a66b-256e07e427c4-kube-api-access-47jtb" (OuterVolumeSpecName: "kube-api-access-47jtb") pod "3ba83d28-3266-48ec-a66b-256e07e427c4" (UID: "3ba83d28-3266-48ec-a66b-256e07e427c4"). InnerVolumeSpecName "kube-api-access-47jtb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:50:04 crc kubenswrapper[4816]: I0311 12:50:04.940273 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47jtb\" (UniqueName: \"kubernetes.io/projected/3ba83d28-3266-48ec-a66b-256e07e427c4-kube-api-access-47jtb\") on node \"crc\" DevicePath \"\"" Mar 11 12:50:05 crc kubenswrapper[4816]: I0311 12:50:05.342836 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553890-4jjhs" event={"ID":"3ba83d28-3266-48ec-a66b-256e07e427c4","Type":"ContainerDied","Data":"276f6ef42121ac14bfdd12c54816b043fcc5d41dc9444b9b53c34e8ce6a3b4d0"} Mar 11 12:50:05 crc kubenswrapper[4816]: I0311 12:50:05.342894 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="276f6ef42121ac14bfdd12c54816b043fcc5d41dc9444b9b53c34e8ce6a3b4d0" Mar 11 12:50:05 crc kubenswrapper[4816]: I0311 12:50:05.342910 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553890-4jjhs" Mar 11 12:50:05 crc kubenswrapper[4816]: I0311 12:50:05.727358 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553884-t7cv9"] Mar 11 12:50:05 crc kubenswrapper[4816]: I0311 12:50:05.736756 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553884-t7cv9"] Mar 11 12:50:06 crc kubenswrapper[4816]: I0311 12:50:06.141054 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8c87e3f-c7ea-4b63-bcbc-dee0d07ed0f7" path="/var/lib/kubelet/pods/a8c87e3f-c7ea-4b63-bcbc-dee0d07ed0f7/volumes" Mar 11 12:50:18 crc kubenswrapper[4816]: I0311 12:50:18.130670 4816 scope.go:117] "RemoveContainer" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334" Mar 11 12:50:18 crc kubenswrapper[4816]: E0311 12:50:18.131637 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:50:29 crc kubenswrapper[4816]: I0311 12:50:29.131219 4816 scope.go:117] "RemoveContainer" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334" Mar 11 12:50:29 crc kubenswrapper[4816]: E0311 12:50:29.132130 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" 
podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:50:40 crc kubenswrapper[4816]: I0311 12:50:40.132150 4816 scope.go:117] "RemoveContainer" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334" Mar 11 12:50:40 crc kubenswrapper[4816]: E0311 12:50:40.133022 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:50:48 crc kubenswrapper[4816]: I0311 12:50:48.524286 4816 scope.go:117] "RemoveContainer" containerID="da680962b6fbbd0e75bc32153dd7114d5c7dd1b60db6d2fbbedf1eb60245a10a" Mar 11 12:50:53 crc kubenswrapper[4816]: I0311 12:50:53.132054 4816 scope.go:117] "RemoveContainer" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334" Mar 11 12:50:53 crc kubenswrapper[4816]: E0311 12:50:53.133233 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:51:04 crc kubenswrapper[4816]: I0311 12:51:04.135230 4816 scope.go:117] "RemoveContainer" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334" Mar 11 12:51:04 crc kubenswrapper[4816]: E0311 12:51:04.136174 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:51:17 crc kubenswrapper[4816]: I0311 12:51:17.131796 4816 scope.go:117] "RemoveContainer" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334" Mar 11 12:51:17 crc kubenswrapper[4816]: E0311 12:51:17.133627 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:51:29 crc kubenswrapper[4816]: I0311 12:51:29.130712 4816 scope.go:117] "RemoveContainer" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334" Mar 11 12:51:29 crc kubenswrapper[4816]: E0311 12:51:29.132101 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:51:42 crc kubenswrapper[4816]: I0311 12:51:42.130317 4816 scope.go:117] "RemoveContainer" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334" Mar 11 12:51:42 crc kubenswrapper[4816]: E0311 12:51:42.132946 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:51:57 crc kubenswrapper[4816]: I0311 12:51:57.131598 4816 scope.go:117] "RemoveContainer" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334" Mar 11 12:51:57 crc kubenswrapper[4816]: E0311 12:51:57.132654 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:52:00 crc kubenswrapper[4816]: I0311 12:52:00.184065 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553892-p8pvb"] Mar 11 12:52:00 crc kubenswrapper[4816]: E0311 12:52:00.184923 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ba83d28-3266-48ec-a66b-256e07e427c4" containerName="oc" Mar 11 12:52:00 crc kubenswrapper[4816]: I0311 12:52:00.184939 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ba83d28-3266-48ec-a66b-256e07e427c4" containerName="oc" Mar 11 12:52:00 crc kubenswrapper[4816]: I0311 12:52:00.185170 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ba83d28-3266-48ec-a66b-256e07e427c4" containerName="oc" Mar 11 12:52:00 crc kubenswrapper[4816]: I0311 12:52:00.185808 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553892-p8pvb" Mar 11 12:52:00 crc kubenswrapper[4816]: I0311 12:52:00.193774 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 12:52:00 crc kubenswrapper[4816]: I0311 12:52:00.194075 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 12:52:00 crc kubenswrapper[4816]: I0311 12:52:00.194165 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 12:52:00 crc kubenswrapper[4816]: I0311 12:52:00.196613 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553892-p8pvb"] Mar 11 12:52:00 crc kubenswrapper[4816]: I0311 12:52:00.296867 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxbbh\" (UniqueName: \"kubernetes.io/projected/0930f466-9688-4eee-a82d-54a22a037535-kube-api-access-nxbbh\") pod \"auto-csr-approver-29553892-p8pvb\" (UID: \"0930f466-9688-4eee-a82d-54a22a037535\") " pod="openshift-infra/auto-csr-approver-29553892-p8pvb" Mar 11 12:52:00 crc kubenswrapper[4816]: I0311 12:52:00.399046 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxbbh\" (UniqueName: \"kubernetes.io/projected/0930f466-9688-4eee-a82d-54a22a037535-kube-api-access-nxbbh\") pod \"auto-csr-approver-29553892-p8pvb\" (UID: \"0930f466-9688-4eee-a82d-54a22a037535\") " pod="openshift-infra/auto-csr-approver-29553892-p8pvb" Mar 11 12:52:00 crc kubenswrapper[4816]: I0311 12:52:00.426071 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxbbh\" (UniqueName: \"kubernetes.io/projected/0930f466-9688-4eee-a82d-54a22a037535-kube-api-access-nxbbh\") pod \"auto-csr-approver-29553892-p8pvb\" (UID: \"0930f466-9688-4eee-a82d-54a22a037535\") " 
pod="openshift-infra/auto-csr-approver-29553892-p8pvb" Mar 11 12:52:00 crc kubenswrapper[4816]: I0311 12:52:00.519949 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553892-p8pvb" Mar 11 12:52:01 crc kubenswrapper[4816]: I0311 12:52:01.020648 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553892-p8pvb"] Mar 11 12:52:01 crc kubenswrapper[4816]: I0311 12:52:01.027934 4816 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 12:52:01 crc kubenswrapper[4816]: I0311 12:52:01.889515 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553892-p8pvb" event={"ID":"0930f466-9688-4eee-a82d-54a22a037535","Type":"ContainerStarted","Data":"0e5e84f0f22d19d1a1cc75bdc126647c2800a3edd6da5c83b52ecd79fd137b37"} Mar 11 12:52:02 crc kubenswrapper[4816]: I0311 12:52:02.902105 4816 generic.go:334] "Generic (PLEG): container finished" podID="0930f466-9688-4eee-a82d-54a22a037535" containerID="b23f59795fc03fe5ae5f308d14da26d3250f022f9dd89c94c78eb50bf14fca19" exitCode=0 Mar 11 12:52:02 crc kubenswrapper[4816]: I0311 12:52:02.902186 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553892-p8pvb" event={"ID":"0930f466-9688-4eee-a82d-54a22a037535","Type":"ContainerDied","Data":"b23f59795fc03fe5ae5f308d14da26d3250f022f9dd89c94c78eb50bf14fca19"} Mar 11 12:52:04 crc kubenswrapper[4816]: I0311 12:52:04.322878 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553892-p8pvb" Mar 11 12:52:04 crc kubenswrapper[4816]: I0311 12:52:04.475965 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxbbh\" (UniqueName: \"kubernetes.io/projected/0930f466-9688-4eee-a82d-54a22a037535-kube-api-access-nxbbh\") pod \"0930f466-9688-4eee-a82d-54a22a037535\" (UID: \"0930f466-9688-4eee-a82d-54a22a037535\") " Mar 11 12:52:04 crc kubenswrapper[4816]: I0311 12:52:04.483769 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0930f466-9688-4eee-a82d-54a22a037535-kube-api-access-nxbbh" (OuterVolumeSpecName: "kube-api-access-nxbbh") pod "0930f466-9688-4eee-a82d-54a22a037535" (UID: "0930f466-9688-4eee-a82d-54a22a037535"). InnerVolumeSpecName "kube-api-access-nxbbh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:52:04 crc kubenswrapper[4816]: I0311 12:52:04.577862 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxbbh\" (UniqueName: \"kubernetes.io/projected/0930f466-9688-4eee-a82d-54a22a037535-kube-api-access-nxbbh\") on node \"crc\" DevicePath \"\"" Mar 11 12:52:04 crc kubenswrapper[4816]: I0311 12:52:04.919679 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553892-p8pvb" event={"ID":"0930f466-9688-4eee-a82d-54a22a037535","Type":"ContainerDied","Data":"0e5e84f0f22d19d1a1cc75bdc126647c2800a3edd6da5c83b52ecd79fd137b37"} Mar 11 12:52:04 crc kubenswrapper[4816]: I0311 12:52:04.919772 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e5e84f0f22d19d1a1cc75bdc126647c2800a3edd6da5c83b52ecd79fd137b37" Mar 11 12:52:04 crc kubenswrapper[4816]: I0311 12:52:04.919804 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553892-p8pvb" Mar 11 12:52:05 crc kubenswrapper[4816]: I0311 12:52:05.388483 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553886-xjtpg"] Mar 11 12:52:05 crc kubenswrapper[4816]: I0311 12:52:05.394063 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553886-xjtpg"] Mar 11 12:52:06 crc kubenswrapper[4816]: I0311 12:52:06.140805 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10793112-fba0-46e4-a3a5-201255a72221" path="/var/lib/kubelet/pods/10793112-fba0-46e4-a3a5-201255a72221/volumes" Mar 11 12:52:12 crc kubenswrapper[4816]: I0311 12:52:12.130899 4816 scope.go:117] "RemoveContainer" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334" Mar 11 12:52:12 crc kubenswrapper[4816]: E0311 12:52:12.131470 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:52:26 crc kubenswrapper[4816]: I0311 12:52:26.131227 4816 scope.go:117] "RemoveContainer" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334" Mar 11 12:52:26 crc kubenswrapper[4816]: E0311 12:52:26.132898 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" 
podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:52:38 crc kubenswrapper[4816]: I0311 12:52:38.130637 4816 scope.go:117] "RemoveContainer" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334" Mar 11 12:52:38 crc kubenswrapper[4816]: E0311 12:52:38.131764 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:52:48 crc kubenswrapper[4816]: I0311 12:52:48.638376 4816 scope.go:117] "RemoveContainer" containerID="015da5072b60f8b74ef45c6695076cb6f089d3139f01fe3ef7f7d86d8236f381" Mar 11 12:52:51 crc kubenswrapper[4816]: I0311 12:52:51.130571 4816 scope.go:117] "RemoveContainer" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334" Mar 11 12:52:51 crc kubenswrapper[4816]: E0311 12:52:51.131280 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:53:04 crc kubenswrapper[4816]: I0311 12:53:04.135135 4816 scope.go:117] "RemoveContainer" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334" Mar 11 12:53:04 crc kubenswrapper[4816]: E0311 12:53:04.136233 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:53:16 crc kubenswrapper[4816]: I0311 12:53:16.130858 4816 scope.go:117] "RemoveContainer" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334" Mar 11 12:53:16 crc kubenswrapper[4816]: I0311 12:53:16.504116 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerStarted","Data":"82969b44556ee232154d78ccdc1672ee0b8f8d60f9110d4d7c57547eaa3f598d"} Mar 11 12:53:50 crc kubenswrapper[4816]: I0311 12:53:50.781495 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-47gxf"] Mar 11 12:53:50 crc kubenswrapper[4816]: E0311 12:53:50.782389 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0930f466-9688-4eee-a82d-54a22a037535" containerName="oc" Mar 11 12:53:50 crc kubenswrapper[4816]: I0311 12:53:50.782407 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="0930f466-9688-4eee-a82d-54a22a037535" containerName="oc" Mar 11 12:53:50 crc kubenswrapper[4816]: I0311 12:53:50.782581 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="0930f466-9688-4eee-a82d-54a22a037535" containerName="oc" Mar 11 12:53:50 crc kubenswrapper[4816]: I0311 12:53:50.783707 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-47gxf" Mar 11 12:53:50 crc kubenswrapper[4816]: I0311 12:53:50.806049 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-47gxf"] Mar 11 12:53:50 crc kubenswrapper[4816]: I0311 12:53:50.934171 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46b8010d-316f-425c-9e3a-69f771cc81a5-utilities\") pod \"redhat-operators-47gxf\" (UID: \"46b8010d-316f-425c-9e3a-69f771cc81a5\") " pod="openshift-marketplace/redhat-operators-47gxf" Mar 11 12:53:50 crc kubenswrapper[4816]: I0311 12:53:50.934232 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46b8010d-316f-425c-9e3a-69f771cc81a5-catalog-content\") pod \"redhat-operators-47gxf\" (UID: \"46b8010d-316f-425c-9e3a-69f771cc81a5\") " pod="openshift-marketplace/redhat-operators-47gxf" Mar 11 12:53:50 crc kubenswrapper[4816]: I0311 12:53:50.934306 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dll6v\" (UniqueName: \"kubernetes.io/projected/46b8010d-316f-425c-9e3a-69f771cc81a5-kube-api-access-dll6v\") pod \"redhat-operators-47gxf\" (UID: \"46b8010d-316f-425c-9e3a-69f771cc81a5\") " pod="openshift-marketplace/redhat-operators-47gxf" Mar 11 12:53:51 crc kubenswrapper[4816]: I0311 12:53:51.037149 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dll6v\" (UniqueName: \"kubernetes.io/projected/46b8010d-316f-425c-9e3a-69f771cc81a5-kube-api-access-dll6v\") pod \"redhat-operators-47gxf\" (UID: \"46b8010d-316f-425c-9e3a-69f771cc81a5\") " pod="openshift-marketplace/redhat-operators-47gxf" Mar 11 12:53:51 crc kubenswrapper[4816]: I0311 12:53:51.037342 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46b8010d-316f-425c-9e3a-69f771cc81a5-utilities\") pod \"redhat-operators-47gxf\" (UID: \"46b8010d-316f-425c-9e3a-69f771cc81a5\") " pod="openshift-marketplace/redhat-operators-47gxf" Mar 11 12:53:51 crc kubenswrapper[4816]: I0311 12:53:51.037375 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46b8010d-316f-425c-9e3a-69f771cc81a5-catalog-content\") pod \"redhat-operators-47gxf\" (UID: \"46b8010d-316f-425c-9e3a-69f771cc81a5\") " pod="openshift-marketplace/redhat-operators-47gxf" Mar 11 12:53:51 crc kubenswrapper[4816]: I0311 12:53:51.037993 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46b8010d-316f-425c-9e3a-69f771cc81a5-catalog-content\") pod \"redhat-operators-47gxf\" (UID: \"46b8010d-316f-425c-9e3a-69f771cc81a5\") " pod="openshift-marketplace/redhat-operators-47gxf" Mar 11 12:53:51 crc kubenswrapper[4816]: I0311 12:53:51.038198 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46b8010d-316f-425c-9e3a-69f771cc81a5-utilities\") pod \"redhat-operators-47gxf\" (UID: \"46b8010d-316f-425c-9e3a-69f771cc81a5\") " pod="openshift-marketplace/redhat-operators-47gxf" Mar 11 12:53:51 crc kubenswrapper[4816]: I0311 12:53:51.070270 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dll6v\" (UniqueName: \"kubernetes.io/projected/46b8010d-316f-425c-9e3a-69f771cc81a5-kube-api-access-dll6v\") pod \"redhat-operators-47gxf\" (UID: \"46b8010d-316f-425c-9e3a-69f771cc81a5\") " pod="openshift-marketplace/redhat-operators-47gxf" Mar 11 12:53:51 crc kubenswrapper[4816]: I0311 12:53:51.127690 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-47gxf" Mar 11 12:53:51 crc kubenswrapper[4816]: I0311 12:53:51.591781 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-47gxf"] Mar 11 12:53:51 crc kubenswrapper[4816]: I0311 12:53:51.810992 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-47gxf" event={"ID":"46b8010d-316f-425c-9e3a-69f771cc81a5","Type":"ContainerStarted","Data":"3b189802d6a873408cb8ad3686ac7d0ab5de12010d3d6f7a4a4fbb106d1193a2"} Mar 11 12:53:51 crc kubenswrapper[4816]: I0311 12:53:51.811044 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-47gxf" event={"ID":"46b8010d-316f-425c-9e3a-69f771cc81a5","Type":"ContainerStarted","Data":"418b4f749c0e0bfdb4875242a4075c3d65786f340cf1d80b61c4a416d7f3f702"} Mar 11 12:53:52 crc kubenswrapper[4816]: I0311 12:53:52.818458 4816 generic.go:334] "Generic (PLEG): container finished" podID="46b8010d-316f-425c-9e3a-69f771cc81a5" containerID="3b189802d6a873408cb8ad3686ac7d0ab5de12010d3d6f7a4a4fbb106d1193a2" exitCode=0 Mar 11 12:53:52 crc kubenswrapper[4816]: I0311 12:53:52.818572 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-47gxf" event={"ID":"46b8010d-316f-425c-9e3a-69f771cc81a5","Type":"ContainerDied","Data":"3b189802d6a873408cb8ad3686ac7d0ab5de12010d3d6f7a4a4fbb106d1193a2"} Mar 11 12:53:53 crc kubenswrapper[4816]: I0311 12:53:53.829054 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-47gxf" event={"ID":"46b8010d-316f-425c-9e3a-69f771cc81a5","Type":"ContainerStarted","Data":"0b9be8b59dbf94de7f07db8b17733685c0fb81071870240f778dda4ca448f607"} Mar 11 12:53:54 crc kubenswrapper[4816]: I0311 12:53:54.839231 4816 generic.go:334] "Generic (PLEG): container finished" podID="46b8010d-316f-425c-9e3a-69f771cc81a5" 
containerID="0b9be8b59dbf94de7f07db8b17733685c0fb81071870240f778dda4ca448f607" exitCode=0 Mar 11 12:53:54 crc kubenswrapper[4816]: I0311 12:53:54.839359 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-47gxf" event={"ID":"46b8010d-316f-425c-9e3a-69f771cc81a5","Type":"ContainerDied","Data":"0b9be8b59dbf94de7f07db8b17733685c0fb81071870240f778dda4ca448f607"} Mar 11 12:53:55 crc kubenswrapper[4816]: I0311 12:53:55.855731 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-47gxf" event={"ID":"46b8010d-316f-425c-9e3a-69f771cc81a5","Type":"ContainerStarted","Data":"14ab2555c6f27d768806e6f256aaa29847dcea835676309926e23e0478d97554"} Mar 11 12:53:55 crc kubenswrapper[4816]: I0311 12:53:55.882564 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-47gxf" podStartSLOduration=3.200894044 podStartE2EDuration="5.882531668s" podCreationTimestamp="2026-03-11 12:53:50 +0000 UTC" firstStartedPulling="2026-03-11 12:53:52.820261794 +0000 UTC m=+3319.411525761" lastFinishedPulling="2026-03-11 12:53:55.501899418 +0000 UTC m=+3322.093163385" observedRunningTime="2026-03-11 12:53:55.874229816 +0000 UTC m=+3322.465493783" watchObservedRunningTime="2026-03-11 12:53:55.882531668 +0000 UTC m=+3322.473795635" Mar 11 12:54:00 crc kubenswrapper[4816]: I0311 12:54:00.154682 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553894-48d29"] Mar 11 12:54:00 crc kubenswrapper[4816]: I0311 12:54:00.156534 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553894-48d29" Mar 11 12:54:00 crc kubenswrapper[4816]: I0311 12:54:00.159462 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 12:54:00 crc kubenswrapper[4816]: I0311 12:54:00.159581 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 12:54:00 crc kubenswrapper[4816]: I0311 12:54:00.159880 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 12:54:00 crc kubenswrapper[4816]: I0311 12:54:00.168419 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553894-48d29"] Mar 11 12:54:00 crc kubenswrapper[4816]: I0311 12:54:00.290938 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dglvc\" (UniqueName: \"kubernetes.io/projected/ca38c432-abfd-4e1c-8ea7-a0781390bb1d-kube-api-access-dglvc\") pod \"auto-csr-approver-29553894-48d29\" (UID: \"ca38c432-abfd-4e1c-8ea7-a0781390bb1d\") " pod="openshift-infra/auto-csr-approver-29553894-48d29" Mar 11 12:54:00 crc kubenswrapper[4816]: I0311 12:54:00.392783 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dglvc\" (UniqueName: \"kubernetes.io/projected/ca38c432-abfd-4e1c-8ea7-a0781390bb1d-kube-api-access-dglvc\") pod \"auto-csr-approver-29553894-48d29\" (UID: \"ca38c432-abfd-4e1c-8ea7-a0781390bb1d\") " pod="openshift-infra/auto-csr-approver-29553894-48d29" Mar 11 12:54:00 crc kubenswrapper[4816]: I0311 12:54:00.413348 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dglvc\" (UniqueName: \"kubernetes.io/projected/ca38c432-abfd-4e1c-8ea7-a0781390bb1d-kube-api-access-dglvc\") pod \"auto-csr-approver-29553894-48d29\" (UID: \"ca38c432-abfd-4e1c-8ea7-a0781390bb1d\") " 
pod="openshift-infra/auto-csr-approver-29553894-48d29" Mar 11 12:54:00 crc kubenswrapper[4816]: I0311 12:54:00.476418 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553894-48d29" Mar 11 12:54:00 crc kubenswrapper[4816]: I0311 12:54:00.928224 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553894-48d29"] Mar 11 12:54:00 crc kubenswrapper[4816]: W0311 12:54:00.938078 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca38c432_abfd_4e1c_8ea7_a0781390bb1d.slice/crio-2d455ef0cff5c60692ee00dc059c3e95a0c6daec4f3b1e7c64274d0cd5d2df25 WatchSource:0}: Error finding container 2d455ef0cff5c60692ee00dc059c3e95a0c6daec4f3b1e7c64274d0cd5d2df25: Status 404 returned error can't find the container with id 2d455ef0cff5c60692ee00dc059c3e95a0c6daec4f3b1e7c64274d0cd5d2df25 Mar 11 12:54:01 crc kubenswrapper[4816]: I0311 12:54:01.128672 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-47gxf" Mar 11 12:54:01 crc kubenswrapper[4816]: I0311 12:54:01.128733 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-47gxf" Mar 11 12:54:01 crc kubenswrapper[4816]: I0311 12:54:01.906051 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553894-48d29" event={"ID":"ca38c432-abfd-4e1c-8ea7-a0781390bb1d","Type":"ContainerStarted","Data":"2d455ef0cff5c60692ee00dc059c3e95a0c6daec4f3b1e7c64274d0cd5d2df25"} Mar 11 12:54:02 crc kubenswrapper[4816]: I0311 12:54:02.175868 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-47gxf" podUID="46b8010d-316f-425c-9e3a-69f771cc81a5" containerName="registry-server" probeResult="failure" output=< Mar 11 12:54:02 crc kubenswrapper[4816]: timeout: failed to 
connect service ":50051" within 1s Mar 11 12:54:02 crc kubenswrapper[4816]: > Mar 11 12:54:02 crc kubenswrapper[4816]: I0311 12:54:02.917851 4816 generic.go:334] "Generic (PLEG): container finished" podID="ca38c432-abfd-4e1c-8ea7-a0781390bb1d" containerID="f4e7fa686a33e8ded3e5e43526bdf4db23b8a91e490e0b84842982390eec6764" exitCode=0 Mar 11 12:54:02 crc kubenswrapper[4816]: I0311 12:54:02.918101 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553894-48d29" event={"ID":"ca38c432-abfd-4e1c-8ea7-a0781390bb1d","Type":"ContainerDied","Data":"f4e7fa686a33e8ded3e5e43526bdf4db23b8a91e490e0b84842982390eec6764"} Mar 11 12:54:04 crc kubenswrapper[4816]: I0311 12:54:04.264755 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553894-48d29" Mar 11 12:54:04 crc kubenswrapper[4816]: I0311 12:54:04.372337 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dglvc\" (UniqueName: \"kubernetes.io/projected/ca38c432-abfd-4e1c-8ea7-a0781390bb1d-kube-api-access-dglvc\") pod \"ca38c432-abfd-4e1c-8ea7-a0781390bb1d\" (UID: \"ca38c432-abfd-4e1c-8ea7-a0781390bb1d\") " Mar 11 12:54:04 crc kubenswrapper[4816]: I0311 12:54:04.378713 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca38c432-abfd-4e1c-8ea7-a0781390bb1d-kube-api-access-dglvc" (OuterVolumeSpecName: "kube-api-access-dglvc") pod "ca38c432-abfd-4e1c-8ea7-a0781390bb1d" (UID: "ca38c432-abfd-4e1c-8ea7-a0781390bb1d"). InnerVolumeSpecName "kube-api-access-dglvc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:54:04 crc kubenswrapper[4816]: I0311 12:54:04.475142 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dglvc\" (UniqueName: \"kubernetes.io/projected/ca38c432-abfd-4e1c-8ea7-a0781390bb1d-kube-api-access-dglvc\") on node \"crc\" DevicePath \"\"" Mar 11 12:54:04 crc kubenswrapper[4816]: I0311 12:54:04.940136 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553894-48d29" event={"ID":"ca38c432-abfd-4e1c-8ea7-a0781390bb1d","Type":"ContainerDied","Data":"2d455ef0cff5c60692ee00dc059c3e95a0c6daec4f3b1e7c64274d0cd5d2df25"} Mar 11 12:54:04 crc kubenswrapper[4816]: I0311 12:54:04.940270 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d455ef0cff5c60692ee00dc059c3e95a0c6daec4f3b1e7c64274d0cd5d2df25" Mar 11 12:54:04 crc kubenswrapper[4816]: I0311 12:54:04.940319 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553894-48d29" Mar 11 12:54:05 crc kubenswrapper[4816]: I0311 12:54:05.352939 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553888-7sw8h"] Mar 11 12:54:05 crc kubenswrapper[4816]: I0311 12:54:05.359024 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553888-7sw8h"] Mar 11 12:54:06 crc kubenswrapper[4816]: I0311 12:54:06.147402 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3a5e0fe-c52b-4b6f-ab13-ba73fce64177" path="/var/lib/kubelet/pods/f3a5e0fe-c52b-4b6f-ab13-ba73fce64177/volumes" Mar 11 12:54:11 crc kubenswrapper[4816]: I0311 12:54:11.178680 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-47gxf" Mar 11 12:54:11 crc kubenswrapper[4816]: I0311 12:54:11.225784 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/redhat-operators-47gxf" Mar 11 12:54:11 crc kubenswrapper[4816]: I0311 12:54:11.415547 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-47gxf"] Mar 11 12:54:13 crc kubenswrapper[4816]: I0311 12:54:13.011966 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-47gxf" podUID="46b8010d-316f-425c-9e3a-69f771cc81a5" containerName="registry-server" containerID="cri-o://14ab2555c6f27d768806e6f256aaa29847dcea835676309926e23e0478d97554" gracePeriod=2 Mar 11 12:54:13 crc kubenswrapper[4816]: I0311 12:54:13.457063 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-47gxf" Mar 11 12:54:13 crc kubenswrapper[4816]: I0311 12:54:13.544484 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dll6v\" (UniqueName: \"kubernetes.io/projected/46b8010d-316f-425c-9e3a-69f771cc81a5-kube-api-access-dll6v\") pod \"46b8010d-316f-425c-9e3a-69f771cc81a5\" (UID: \"46b8010d-316f-425c-9e3a-69f771cc81a5\") " Mar 11 12:54:13 crc kubenswrapper[4816]: I0311 12:54:13.544632 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46b8010d-316f-425c-9e3a-69f771cc81a5-catalog-content\") pod \"46b8010d-316f-425c-9e3a-69f771cc81a5\" (UID: \"46b8010d-316f-425c-9e3a-69f771cc81a5\") " Mar 11 12:54:13 crc kubenswrapper[4816]: I0311 12:54:13.544724 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46b8010d-316f-425c-9e3a-69f771cc81a5-utilities\") pod \"46b8010d-316f-425c-9e3a-69f771cc81a5\" (UID: \"46b8010d-316f-425c-9e3a-69f771cc81a5\") " Mar 11 12:54:13 crc kubenswrapper[4816]: I0311 12:54:13.545864 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/empty-dir/46b8010d-316f-425c-9e3a-69f771cc81a5-utilities" (OuterVolumeSpecName: "utilities") pod "46b8010d-316f-425c-9e3a-69f771cc81a5" (UID: "46b8010d-316f-425c-9e3a-69f771cc81a5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:54:13 crc kubenswrapper[4816]: I0311 12:54:13.552986 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46b8010d-316f-425c-9e3a-69f771cc81a5-kube-api-access-dll6v" (OuterVolumeSpecName: "kube-api-access-dll6v") pod "46b8010d-316f-425c-9e3a-69f771cc81a5" (UID: "46b8010d-316f-425c-9e3a-69f771cc81a5"). InnerVolumeSpecName "kube-api-access-dll6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:54:13 crc kubenswrapper[4816]: I0311 12:54:13.646745 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46b8010d-316f-425c-9e3a-69f771cc81a5-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 12:54:13 crc kubenswrapper[4816]: I0311 12:54:13.646790 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dll6v\" (UniqueName: \"kubernetes.io/projected/46b8010d-316f-425c-9e3a-69f771cc81a5-kube-api-access-dll6v\") on node \"crc\" DevicePath \"\"" Mar 11 12:54:13 crc kubenswrapper[4816]: I0311 12:54:13.699426 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46b8010d-316f-425c-9e3a-69f771cc81a5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "46b8010d-316f-425c-9e3a-69f771cc81a5" (UID: "46b8010d-316f-425c-9e3a-69f771cc81a5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:54:13 crc kubenswrapper[4816]: I0311 12:54:13.748678 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46b8010d-316f-425c-9e3a-69f771cc81a5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 12:54:14 crc kubenswrapper[4816]: I0311 12:54:14.028062 4816 generic.go:334] "Generic (PLEG): container finished" podID="46b8010d-316f-425c-9e3a-69f771cc81a5" containerID="14ab2555c6f27d768806e6f256aaa29847dcea835676309926e23e0478d97554" exitCode=0 Mar 11 12:54:14 crc kubenswrapper[4816]: I0311 12:54:14.028131 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-47gxf" event={"ID":"46b8010d-316f-425c-9e3a-69f771cc81a5","Type":"ContainerDied","Data":"14ab2555c6f27d768806e6f256aaa29847dcea835676309926e23e0478d97554"} Mar 11 12:54:14 crc kubenswrapper[4816]: I0311 12:54:14.028213 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-47gxf" event={"ID":"46b8010d-316f-425c-9e3a-69f771cc81a5","Type":"ContainerDied","Data":"418b4f749c0e0bfdb4875242a4075c3d65786f340cf1d80b61c4a416d7f3f702"} Mar 11 12:54:14 crc kubenswrapper[4816]: I0311 12:54:14.028261 4816 scope.go:117] "RemoveContainer" containerID="14ab2555c6f27d768806e6f256aaa29847dcea835676309926e23e0478d97554" Mar 11 12:54:14 crc kubenswrapper[4816]: I0311 12:54:14.029726 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-47gxf" Mar 11 12:54:14 crc kubenswrapper[4816]: I0311 12:54:14.065603 4816 scope.go:117] "RemoveContainer" containerID="0b9be8b59dbf94de7f07db8b17733685c0fb81071870240f778dda4ca448f607" Mar 11 12:54:14 crc kubenswrapper[4816]: I0311 12:54:14.077329 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-47gxf"] Mar 11 12:54:14 crc kubenswrapper[4816]: I0311 12:54:14.086888 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-47gxf"] Mar 11 12:54:14 crc kubenswrapper[4816]: I0311 12:54:14.088923 4816 scope.go:117] "RemoveContainer" containerID="3b189802d6a873408cb8ad3686ac7d0ab5de12010d3d6f7a4a4fbb106d1193a2" Mar 11 12:54:14 crc kubenswrapper[4816]: I0311 12:54:14.109736 4816 scope.go:117] "RemoveContainer" containerID="14ab2555c6f27d768806e6f256aaa29847dcea835676309926e23e0478d97554" Mar 11 12:54:14 crc kubenswrapper[4816]: E0311 12:54:14.110870 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14ab2555c6f27d768806e6f256aaa29847dcea835676309926e23e0478d97554\": container with ID starting with 14ab2555c6f27d768806e6f256aaa29847dcea835676309926e23e0478d97554 not found: ID does not exist" containerID="14ab2555c6f27d768806e6f256aaa29847dcea835676309926e23e0478d97554" Mar 11 12:54:14 crc kubenswrapper[4816]: I0311 12:54:14.110927 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14ab2555c6f27d768806e6f256aaa29847dcea835676309926e23e0478d97554"} err="failed to get container status \"14ab2555c6f27d768806e6f256aaa29847dcea835676309926e23e0478d97554\": rpc error: code = NotFound desc = could not find container \"14ab2555c6f27d768806e6f256aaa29847dcea835676309926e23e0478d97554\": container with ID starting with 14ab2555c6f27d768806e6f256aaa29847dcea835676309926e23e0478d97554 not found: ID does 
not exist" Mar 11 12:54:14 crc kubenswrapper[4816]: I0311 12:54:14.110968 4816 scope.go:117] "RemoveContainer" containerID="0b9be8b59dbf94de7f07db8b17733685c0fb81071870240f778dda4ca448f607" Mar 11 12:54:14 crc kubenswrapper[4816]: E0311 12:54:14.111486 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b9be8b59dbf94de7f07db8b17733685c0fb81071870240f778dda4ca448f607\": container with ID starting with 0b9be8b59dbf94de7f07db8b17733685c0fb81071870240f778dda4ca448f607 not found: ID does not exist" containerID="0b9be8b59dbf94de7f07db8b17733685c0fb81071870240f778dda4ca448f607" Mar 11 12:54:14 crc kubenswrapper[4816]: I0311 12:54:14.111508 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b9be8b59dbf94de7f07db8b17733685c0fb81071870240f778dda4ca448f607"} err="failed to get container status \"0b9be8b59dbf94de7f07db8b17733685c0fb81071870240f778dda4ca448f607\": rpc error: code = NotFound desc = could not find container \"0b9be8b59dbf94de7f07db8b17733685c0fb81071870240f778dda4ca448f607\": container with ID starting with 0b9be8b59dbf94de7f07db8b17733685c0fb81071870240f778dda4ca448f607 not found: ID does not exist" Mar 11 12:54:14 crc kubenswrapper[4816]: I0311 12:54:14.111523 4816 scope.go:117] "RemoveContainer" containerID="3b189802d6a873408cb8ad3686ac7d0ab5de12010d3d6f7a4a4fbb106d1193a2" Mar 11 12:54:14 crc kubenswrapper[4816]: E0311 12:54:14.111760 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b189802d6a873408cb8ad3686ac7d0ab5de12010d3d6f7a4a4fbb106d1193a2\": container with ID starting with 3b189802d6a873408cb8ad3686ac7d0ab5de12010d3d6f7a4a4fbb106d1193a2 not found: ID does not exist" containerID="3b189802d6a873408cb8ad3686ac7d0ab5de12010d3d6f7a4a4fbb106d1193a2" Mar 11 12:54:14 crc kubenswrapper[4816]: I0311 12:54:14.111785 4816 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b189802d6a873408cb8ad3686ac7d0ab5de12010d3d6f7a4a4fbb106d1193a2"} err="failed to get container status \"3b189802d6a873408cb8ad3686ac7d0ab5de12010d3d6f7a4a4fbb106d1193a2\": rpc error: code = NotFound desc = could not find container \"3b189802d6a873408cb8ad3686ac7d0ab5de12010d3d6f7a4a4fbb106d1193a2\": container with ID starting with 3b189802d6a873408cb8ad3686ac7d0ab5de12010d3d6f7a4a4fbb106d1193a2 not found: ID does not exist" Mar 11 12:54:14 crc kubenswrapper[4816]: I0311 12:54:14.141124 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46b8010d-316f-425c-9e3a-69f771cc81a5" path="/var/lib/kubelet/pods/46b8010d-316f-425c-9e3a-69f771cc81a5/volumes" Mar 11 12:54:48 crc kubenswrapper[4816]: I0311 12:54:48.722613 4816 scope.go:117] "RemoveContainer" containerID="9ed1a5e43552ff0476bd301f6f56de7c0e4f936f582bd894ea6e5569ba2db74d" Mar 11 12:55:39 crc kubenswrapper[4816]: I0311 12:55:39.515392 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:55:39 crc kubenswrapper[4816]: I0311 12:55:39.516165 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:56:00 crc kubenswrapper[4816]: I0311 12:56:00.153735 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553896-lxt87"] Mar 11 12:56:00 crc kubenswrapper[4816]: E0311 12:56:00.154965 4816 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="46b8010d-316f-425c-9e3a-69f771cc81a5" containerName="extract-content" Mar 11 12:56:00 crc kubenswrapper[4816]: I0311 12:56:00.154995 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="46b8010d-316f-425c-9e3a-69f771cc81a5" containerName="extract-content" Mar 11 12:56:00 crc kubenswrapper[4816]: E0311 12:56:00.155014 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca38c432-abfd-4e1c-8ea7-a0781390bb1d" containerName="oc" Mar 11 12:56:00 crc kubenswrapper[4816]: I0311 12:56:00.155028 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca38c432-abfd-4e1c-8ea7-a0781390bb1d" containerName="oc" Mar 11 12:56:00 crc kubenswrapper[4816]: E0311 12:56:00.155066 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46b8010d-316f-425c-9e3a-69f771cc81a5" containerName="registry-server" Mar 11 12:56:00 crc kubenswrapper[4816]: I0311 12:56:00.155081 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="46b8010d-316f-425c-9e3a-69f771cc81a5" containerName="registry-server" Mar 11 12:56:00 crc kubenswrapper[4816]: E0311 12:56:00.155114 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46b8010d-316f-425c-9e3a-69f771cc81a5" containerName="extract-utilities" Mar 11 12:56:00 crc kubenswrapper[4816]: I0311 12:56:00.155126 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="46b8010d-316f-425c-9e3a-69f771cc81a5" containerName="extract-utilities" Mar 11 12:56:00 crc kubenswrapper[4816]: I0311 12:56:00.155421 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca38c432-abfd-4e1c-8ea7-a0781390bb1d" containerName="oc" Mar 11 12:56:00 crc kubenswrapper[4816]: I0311 12:56:00.155463 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="46b8010d-316f-425c-9e3a-69f771cc81a5" containerName="registry-server" Mar 11 12:56:00 crc kubenswrapper[4816]: I0311 12:56:00.156341 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553896-lxt87" Mar 11 12:56:00 crc kubenswrapper[4816]: I0311 12:56:00.162038 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553896-lxt87"] Mar 11 12:56:00 crc kubenswrapper[4816]: I0311 12:56:00.173966 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 12:56:00 crc kubenswrapper[4816]: I0311 12:56:00.178865 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 12:56:00 crc kubenswrapper[4816]: I0311 12:56:00.179138 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 12:56:00 crc kubenswrapper[4816]: I0311 12:56:00.293907 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shlvh\" (UniqueName: \"kubernetes.io/projected/2438ebe2-3bab-42fc-9430-8b2600a2efd1-kube-api-access-shlvh\") pod \"auto-csr-approver-29553896-lxt87\" (UID: \"2438ebe2-3bab-42fc-9430-8b2600a2efd1\") " pod="openshift-infra/auto-csr-approver-29553896-lxt87" Mar 11 12:56:00 crc kubenswrapper[4816]: I0311 12:56:00.395087 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shlvh\" (UniqueName: \"kubernetes.io/projected/2438ebe2-3bab-42fc-9430-8b2600a2efd1-kube-api-access-shlvh\") pod \"auto-csr-approver-29553896-lxt87\" (UID: \"2438ebe2-3bab-42fc-9430-8b2600a2efd1\") " pod="openshift-infra/auto-csr-approver-29553896-lxt87" Mar 11 12:56:00 crc kubenswrapper[4816]: I0311 12:56:00.422185 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shlvh\" (UniqueName: \"kubernetes.io/projected/2438ebe2-3bab-42fc-9430-8b2600a2efd1-kube-api-access-shlvh\") pod \"auto-csr-approver-29553896-lxt87\" (UID: \"2438ebe2-3bab-42fc-9430-8b2600a2efd1\") " 
pod="openshift-infra/auto-csr-approver-29553896-lxt87" Mar 11 12:56:00 crc kubenswrapper[4816]: I0311 12:56:00.486802 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553896-lxt87" Mar 11 12:56:00 crc kubenswrapper[4816]: I0311 12:56:00.902090 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553896-lxt87"] Mar 11 12:56:00 crc kubenswrapper[4816]: I0311 12:56:00.954554 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553896-lxt87" event={"ID":"2438ebe2-3bab-42fc-9430-8b2600a2efd1","Type":"ContainerStarted","Data":"e5a7905530685f7effb3274c90abdf63d08adb1d15899784b209bb6b6b52b86a"} Mar 11 12:56:02 crc kubenswrapper[4816]: I0311 12:56:02.979540 4816 generic.go:334] "Generic (PLEG): container finished" podID="2438ebe2-3bab-42fc-9430-8b2600a2efd1" containerID="78e7b5f4d85a6eb5009a8b4bbf6ee9389d9e1a205f3bf787bb243af2af671b70" exitCode=0 Mar 11 12:56:02 crc kubenswrapper[4816]: I0311 12:56:02.980383 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553896-lxt87" event={"ID":"2438ebe2-3bab-42fc-9430-8b2600a2efd1","Type":"ContainerDied","Data":"78e7b5f4d85a6eb5009a8b4bbf6ee9389d9e1a205f3bf787bb243af2af671b70"} Mar 11 12:56:04 crc kubenswrapper[4816]: I0311 12:56:04.371404 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553896-lxt87" Mar 11 12:56:04 crc kubenswrapper[4816]: I0311 12:56:04.471146 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shlvh\" (UniqueName: \"kubernetes.io/projected/2438ebe2-3bab-42fc-9430-8b2600a2efd1-kube-api-access-shlvh\") pod \"2438ebe2-3bab-42fc-9430-8b2600a2efd1\" (UID: \"2438ebe2-3bab-42fc-9430-8b2600a2efd1\") " Mar 11 12:56:04 crc kubenswrapper[4816]: I0311 12:56:04.477716 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2438ebe2-3bab-42fc-9430-8b2600a2efd1-kube-api-access-shlvh" (OuterVolumeSpecName: "kube-api-access-shlvh") pod "2438ebe2-3bab-42fc-9430-8b2600a2efd1" (UID: "2438ebe2-3bab-42fc-9430-8b2600a2efd1"). InnerVolumeSpecName "kube-api-access-shlvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:56:04 crc kubenswrapper[4816]: I0311 12:56:04.573110 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shlvh\" (UniqueName: \"kubernetes.io/projected/2438ebe2-3bab-42fc-9430-8b2600a2efd1-kube-api-access-shlvh\") on node \"crc\" DevicePath \"\"" Mar 11 12:56:05 crc kubenswrapper[4816]: I0311 12:56:05.002104 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553896-lxt87" event={"ID":"2438ebe2-3bab-42fc-9430-8b2600a2efd1","Type":"ContainerDied","Data":"e5a7905530685f7effb3274c90abdf63d08adb1d15899784b209bb6b6b52b86a"} Mar 11 12:56:05 crc kubenswrapper[4816]: I0311 12:56:05.002187 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553896-lxt87" Mar 11 12:56:05 crc kubenswrapper[4816]: I0311 12:56:05.002205 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5a7905530685f7effb3274c90abdf63d08adb1d15899784b209bb6b6b52b86a" Mar 11 12:56:05 crc kubenswrapper[4816]: I0311 12:56:05.452961 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553890-4jjhs"] Mar 11 12:56:05 crc kubenswrapper[4816]: I0311 12:56:05.453043 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553890-4jjhs"] Mar 11 12:56:06 crc kubenswrapper[4816]: I0311 12:56:06.145345 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ba83d28-3266-48ec-a66b-256e07e427c4" path="/var/lib/kubelet/pods/3ba83d28-3266-48ec-a66b-256e07e427c4/volumes" Mar 11 12:56:06 crc kubenswrapper[4816]: I0311 12:56:06.553922 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gwmbx"] Mar 11 12:56:06 crc kubenswrapper[4816]: E0311 12:56:06.554375 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2438ebe2-3bab-42fc-9430-8b2600a2efd1" containerName="oc" Mar 11 12:56:06 crc kubenswrapper[4816]: I0311 12:56:06.554392 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2438ebe2-3bab-42fc-9430-8b2600a2efd1" containerName="oc" Mar 11 12:56:06 crc kubenswrapper[4816]: I0311 12:56:06.554587 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="2438ebe2-3bab-42fc-9430-8b2600a2efd1" containerName="oc" Mar 11 12:56:06 crc kubenswrapper[4816]: I0311 12:56:06.555814 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gwmbx" Mar 11 12:56:06 crc kubenswrapper[4816]: I0311 12:56:06.573374 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gwmbx"] Mar 11 12:56:06 crc kubenswrapper[4816]: I0311 12:56:06.705214 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48071a6c-027a-4069-8c66-49fe9309a163-catalog-content\") pod \"community-operators-gwmbx\" (UID: \"48071a6c-027a-4069-8c66-49fe9309a163\") " pod="openshift-marketplace/community-operators-gwmbx" Mar 11 12:56:06 crc kubenswrapper[4816]: I0311 12:56:06.705314 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcm9z\" (UniqueName: \"kubernetes.io/projected/48071a6c-027a-4069-8c66-49fe9309a163-kube-api-access-lcm9z\") pod \"community-operators-gwmbx\" (UID: \"48071a6c-027a-4069-8c66-49fe9309a163\") " pod="openshift-marketplace/community-operators-gwmbx" Mar 11 12:56:06 crc kubenswrapper[4816]: I0311 12:56:06.705576 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48071a6c-027a-4069-8c66-49fe9309a163-utilities\") pod \"community-operators-gwmbx\" (UID: \"48071a6c-027a-4069-8c66-49fe9309a163\") " pod="openshift-marketplace/community-operators-gwmbx" Mar 11 12:56:06 crc kubenswrapper[4816]: I0311 12:56:06.807480 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcm9z\" (UniqueName: \"kubernetes.io/projected/48071a6c-027a-4069-8c66-49fe9309a163-kube-api-access-lcm9z\") pod \"community-operators-gwmbx\" (UID: \"48071a6c-027a-4069-8c66-49fe9309a163\") " pod="openshift-marketplace/community-operators-gwmbx" Mar 11 12:56:06 crc kubenswrapper[4816]: I0311 12:56:06.807591 4816 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48071a6c-027a-4069-8c66-49fe9309a163-utilities\") pod \"community-operators-gwmbx\" (UID: \"48071a6c-027a-4069-8c66-49fe9309a163\") " pod="openshift-marketplace/community-operators-gwmbx" Mar 11 12:56:06 crc kubenswrapper[4816]: I0311 12:56:06.807668 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48071a6c-027a-4069-8c66-49fe9309a163-catalog-content\") pod \"community-operators-gwmbx\" (UID: \"48071a6c-027a-4069-8c66-49fe9309a163\") " pod="openshift-marketplace/community-operators-gwmbx" Mar 11 12:56:06 crc kubenswrapper[4816]: I0311 12:56:06.808207 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48071a6c-027a-4069-8c66-49fe9309a163-utilities\") pod \"community-operators-gwmbx\" (UID: \"48071a6c-027a-4069-8c66-49fe9309a163\") " pod="openshift-marketplace/community-operators-gwmbx" Mar 11 12:56:06 crc kubenswrapper[4816]: I0311 12:56:06.808261 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48071a6c-027a-4069-8c66-49fe9309a163-catalog-content\") pod \"community-operators-gwmbx\" (UID: \"48071a6c-027a-4069-8c66-49fe9309a163\") " pod="openshift-marketplace/community-operators-gwmbx" Mar 11 12:56:06 crc kubenswrapper[4816]: I0311 12:56:06.833312 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcm9z\" (UniqueName: \"kubernetes.io/projected/48071a6c-027a-4069-8c66-49fe9309a163-kube-api-access-lcm9z\") pod \"community-operators-gwmbx\" (UID: \"48071a6c-027a-4069-8c66-49fe9309a163\") " pod="openshift-marketplace/community-operators-gwmbx" Mar 11 12:56:06 crc kubenswrapper[4816]: I0311 12:56:06.879288 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gwmbx" Mar 11 12:56:07 crc kubenswrapper[4816]: I0311 12:56:07.216075 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gwmbx"] Mar 11 12:56:07 crc kubenswrapper[4816]: E0311 12:56:07.650017 4816 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48071a6c_027a_4069_8c66_49fe9309a163.slice/crio-conmon-54f09b00e99cb668ef04e9dae162ba6d3025018a6bafd9ab40af08257c930ec3.scope\": RecentStats: unable to find data in memory cache]" Mar 11 12:56:08 crc kubenswrapper[4816]: I0311 12:56:08.030310 4816 generic.go:334] "Generic (PLEG): container finished" podID="48071a6c-027a-4069-8c66-49fe9309a163" containerID="54f09b00e99cb668ef04e9dae162ba6d3025018a6bafd9ab40af08257c930ec3" exitCode=0 Mar 11 12:56:08 crc kubenswrapper[4816]: I0311 12:56:08.030434 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gwmbx" event={"ID":"48071a6c-027a-4069-8c66-49fe9309a163","Type":"ContainerDied","Data":"54f09b00e99cb668ef04e9dae162ba6d3025018a6bafd9ab40af08257c930ec3"} Mar 11 12:56:08 crc kubenswrapper[4816]: I0311 12:56:08.030872 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gwmbx" event={"ID":"48071a6c-027a-4069-8c66-49fe9309a163","Type":"ContainerStarted","Data":"82a7c6ba67dd3a8cd1f47261f2f32235365ba7093768c2ad5c0634571d84ccf9"} Mar 11 12:56:09 crc kubenswrapper[4816]: I0311 12:56:09.041335 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gwmbx" event={"ID":"48071a6c-027a-4069-8c66-49fe9309a163","Type":"ContainerStarted","Data":"0170356183f0124b4b7ed6bfdad54e871522af344f7374e05ff5d111c5dc4d1f"} Mar 11 12:56:09 crc kubenswrapper[4816]: I0311 12:56:09.514915 4816 patch_prober.go:28] interesting 
pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:56:09 crc kubenswrapper[4816]: I0311 12:56:09.515054 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:56:10 crc kubenswrapper[4816]: I0311 12:56:10.050598 4816 generic.go:334] "Generic (PLEG): container finished" podID="48071a6c-027a-4069-8c66-49fe9309a163" containerID="0170356183f0124b4b7ed6bfdad54e871522af344f7374e05ff5d111c5dc4d1f" exitCode=0 Mar 11 12:56:10 crc kubenswrapper[4816]: I0311 12:56:10.051899 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gwmbx" event={"ID":"48071a6c-027a-4069-8c66-49fe9309a163","Type":"ContainerDied","Data":"0170356183f0124b4b7ed6bfdad54e871522af344f7374e05ff5d111c5dc4d1f"} Mar 11 12:56:11 crc kubenswrapper[4816]: I0311 12:56:11.067541 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gwmbx" event={"ID":"48071a6c-027a-4069-8c66-49fe9309a163","Type":"ContainerStarted","Data":"aa98f0292f08a6bf3d38bb810f0ef7dab461e46b3651e23c7c5e5138e308488d"} Mar 11 12:56:11 crc kubenswrapper[4816]: I0311 12:56:11.093267 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gwmbx" podStartSLOduration=2.666236365 podStartE2EDuration="5.093219375s" podCreationTimestamp="2026-03-11 12:56:06 +0000 UTC" firstStartedPulling="2026-03-11 12:56:08.031759143 +0000 UTC m=+3454.623023110" lastFinishedPulling="2026-03-11 12:56:10.458742153 +0000 UTC 
m=+3457.050006120" observedRunningTime="2026-03-11 12:56:11.089928133 +0000 UTC m=+3457.681192100" watchObservedRunningTime="2026-03-11 12:56:11.093219375 +0000 UTC m=+3457.684483342" Mar 11 12:56:16 crc kubenswrapper[4816]: I0311 12:56:16.879958 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gwmbx" Mar 11 12:56:16 crc kubenswrapper[4816]: I0311 12:56:16.881141 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gwmbx" Mar 11 12:56:16 crc kubenswrapper[4816]: I0311 12:56:16.937738 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gwmbx" Mar 11 12:56:17 crc kubenswrapper[4816]: I0311 12:56:17.193748 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gwmbx" Mar 11 12:56:17 crc kubenswrapper[4816]: I0311 12:56:17.259351 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gwmbx"] Mar 11 12:56:19 crc kubenswrapper[4816]: I0311 12:56:19.139474 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gwmbx" podUID="48071a6c-027a-4069-8c66-49fe9309a163" containerName="registry-server" containerID="cri-o://aa98f0292f08a6bf3d38bb810f0ef7dab461e46b3651e23c7c5e5138e308488d" gracePeriod=2 Mar 11 12:56:20 crc kubenswrapper[4816]: I0311 12:56:20.100099 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gwmbx" Mar 11 12:56:20 crc kubenswrapper[4816]: I0311 12:56:20.152260 4816 generic.go:334] "Generic (PLEG): container finished" podID="48071a6c-027a-4069-8c66-49fe9309a163" containerID="aa98f0292f08a6bf3d38bb810f0ef7dab461e46b3651e23c7c5e5138e308488d" exitCode=0 Mar 11 12:56:20 crc kubenswrapper[4816]: I0311 12:56:20.152402 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gwmbx" Mar 11 12:56:20 crc kubenswrapper[4816]: I0311 12:56:20.155822 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gwmbx" event={"ID":"48071a6c-027a-4069-8c66-49fe9309a163","Type":"ContainerDied","Data":"aa98f0292f08a6bf3d38bb810f0ef7dab461e46b3651e23c7c5e5138e308488d"} Mar 11 12:56:20 crc kubenswrapper[4816]: I0311 12:56:20.155888 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gwmbx" event={"ID":"48071a6c-027a-4069-8c66-49fe9309a163","Type":"ContainerDied","Data":"82a7c6ba67dd3a8cd1f47261f2f32235365ba7093768c2ad5c0634571d84ccf9"} Mar 11 12:56:20 crc kubenswrapper[4816]: I0311 12:56:20.155913 4816 scope.go:117] "RemoveContainer" containerID="aa98f0292f08a6bf3d38bb810f0ef7dab461e46b3651e23c7c5e5138e308488d" Mar 11 12:56:20 crc kubenswrapper[4816]: I0311 12:56:20.174426 4816 scope.go:117] "RemoveContainer" containerID="0170356183f0124b4b7ed6bfdad54e871522af344f7374e05ff5d111c5dc4d1f" Mar 11 12:56:20 crc kubenswrapper[4816]: I0311 12:56:20.180117 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48071a6c-027a-4069-8c66-49fe9309a163-catalog-content\") pod \"48071a6c-027a-4069-8c66-49fe9309a163\" (UID: \"48071a6c-027a-4069-8c66-49fe9309a163\") " Mar 11 12:56:20 crc kubenswrapper[4816]: I0311 12:56:20.180459 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48071a6c-027a-4069-8c66-49fe9309a163-utilities\") pod \"48071a6c-027a-4069-8c66-49fe9309a163\" (UID: \"48071a6c-027a-4069-8c66-49fe9309a163\") " Mar 11 12:56:20 crc kubenswrapper[4816]: I0311 12:56:20.180507 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcm9z\" (UniqueName: \"kubernetes.io/projected/48071a6c-027a-4069-8c66-49fe9309a163-kube-api-access-lcm9z\") pod \"48071a6c-027a-4069-8c66-49fe9309a163\" (UID: \"48071a6c-027a-4069-8c66-49fe9309a163\") " Mar 11 12:56:20 crc kubenswrapper[4816]: I0311 12:56:20.181554 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48071a6c-027a-4069-8c66-49fe9309a163-utilities" (OuterVolumeSpecName: "utilities") pod "48071a6c-027a-4069-8c66-49fe9309a163" (UID: "48071a6c-027a-4069-8c66-49fe9309a163"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:56:20 crc kubenswrapper[4816]: I0311 12:56:20.190189 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48071a6c-027a-4069-8c66-49fe9309a163-kube-api-access-lcm9z" (OuterVolumeSpecName: "kube-api-access-lcm9z") pod "48071a6c-027a-4069-8c66-49fe9309a163" (UID: "48071a6c-027a-4069-8c66-49fe9309a163"). InnerVolumeSpecName "kube-api-access-lcm9z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:56:20 crc kubenswrapper[4816]: I0311 12:56:20.194607 4816 scope.go:117] "RemoveContainer" containerID="54f09b00e99cb668ef04e9dae162ba6d3025018a6bafd9ab40af08257c930ec3" Mar 11 12:56:20 crc kubenswrapper[4816]: I0311 12:56:20.241552 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48071a6c-027a-4069-8c66-49fe9309a163-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "48071a6c-027a-4069-8c66-49fe9309a163" (UID: "48071a6c-027a-4069-8c66-49fe9309a163"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:56:20 crc kubenswrapper[4816]: I0311 12:56:20.245635 4816 scope.go:117] "RemoveContainer" containerID="aa98f0292f08a6bf3d38bb810f0ef7dab461e46b3651e23c7c5e5138e308488d" Mar 11 12:56:20 crc kubenswrapper[4816]: E0311 12:56:20.246172 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa98f0292f08a6bf3d38bb810f0ef7dab461e46b3651e23c7c5e5138e308488d\": container with ID starting with aa98f0292f08a6bf3d38bb810f0ef7dab461e46b3651e23c7c5e5138e308488d not found: ID does not exist" containerID="aa98f0292f08a6bf3d38bb810f0ef7dab461e46b3651e23c7c5e5138e308488d" Mar 11 12:56:20 crc kubenswrapper[4816]: I0311 12:56:20.246241 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa98f0292f08a6bf3d38bb810f0ef7dab461e46b3651e23c7c5e5138e308488d"} err="failed to get container status \"aa98f0292f08a6bf3d38bb810f0ef7dab461e46b3651e23c7c5e5138e308488d\": rpc error: code = NotFound desc = could not find container \"aa98f0292f08a6bf3d38bb810f0ef7dab461e46b3651e23c7c5e5138e308488d\": container with ID starting with aa98f0292f08a6bf3d38bb810f0ef7dab461e46b3651e23c7c5e5138e308488d not found: ID does not exist" Mar 11 12:56:20 crc kubenswrapper[4816]: I0311 12:56:20.246310 4816 scope.go:117] 
"RemoveContainer" containerID="0170356183f0124b4b7ed6bfdad54e871522af344f7374e05ff5d111c5dc4d1f" Mar 11 12:56:20 crc kubenswrapper[4816]: E0311 12:56:20.246698 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0170356183f0124b4b7ed6bfdad54e871522af344f7374e05ff5d111c5dc4d1f\": container with ID starting with 0170356183f0124b4b7ed6bfdad54e871522af344f7374e05ff5d111c5dc4d1f not found: ID does not exist" containerID="0170356183f0124b4b7ed6bfdad54e871522af344f7374e05ff5d111c5dc4d1f" Mar 11 12:56:20 crc kubenswrapper[4816]: I0311 12:56:20.246778 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0170356183f0124b4b7ed6bfdad54e871522af344f7374e05ff5d111c5dc4d1f"} err="failed to get container status \"0170356183f0124b4b7ed6bfdad54e871522af344f7374e05ff5d111c5dc4d1f\": rpc error: code = NotFound desc = could not find container \"0170356183f0124b4b7ed6bfdad54e871522af344f7374e05ff5d111c5dc4d1f\": container with ID starting with 0170356183f0124b4b7ed6bfdad54e871522af344f7374e05ff5d111c5dc4d1f not found: ID does not exist" Mar 11 12:56:20 crc kubenswrapper[4816]: I0311 12:56:20.246813 4816 scope.go:117] "RemoveContainer" containerID="54f09b00e99cb668ef04e9dae162ba6d3025018a6bafd9ab40af08257c930ec3" Mar 11 12:56:20 crc kubenswrapper[4816]: E0311 12:56:20.247093 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54f09b00e99cb668ef04e9dae162ba6d3025018a6bafd9ab40af08257c930ec3\": container with ID starting with 54f09b00e99cb668ef04e9dae162ba6d3025018a6bafd9ab40af08257c930ec3 not found: ID does not exist" containerID="54f09b00e99cb668ef04e9dae162ba6d3025018a6bafd9ab40af08257c930ec3" Mar 11 12:56:20 crc kubenswrapper[4816]: I0311 12:56:20.247124 4816 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"54f09b00e99cb668ef04e9dae162ba6d3025018a6bafd9ab40af08257c930ec3"} err="failed to get container status \"54f09b00e99cb668ef04e9dae162ba6d3025018a6bafd9ab40af08257c930ec3\": rpc error: code = NotFound desc = could not find container \"54f09b00e99cb668ef04e9dae162ba6d3025018a6bafd9ab40af08257c930ec3\": container with ID starting with 54f09b00e99cb668ef04e9dae162ba6d3025018a6bafd9ab40af08257c930ec3 not found: ID does not exist" Mar 11 12:56:20 crc kubenswrapper[4816]: I0311 12:56:20.283070 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48071a6c-027a-4069-8c66-49fe9309a163-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 12:56:20 crc kubenswrapper[4816]: I0311 12:56:20.283120 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcm9z\" (UniqueName: \"kubernetes.io/projected/48071a6c-027a-4069-8c66-49fe9309a163-kube-api-access-lcm9z\") on node \"crc\" DevicePath \"\"" Mar 11 12:56:20 crc kubenswrapper[4816]: I0311 12:56:20.283134 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48071a6c-027a-4069-8c66-49fe9309a163-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 12:56:20 crc kubenswrapper[4816]: I0311 12:56:20.496204 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gwmbx"] Mar 11 12:56:20 crc kubenswrapper[4816]: I0311 12:56:20.502663 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gwmbx"] Mar 11 12:56:22 crc kubenswrapper[4816]: I0311 12:56:22.144235 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48071a6c-027a-4069-8c66-49fe9309a163" path="/var/lib/kubelet/pods/48071a6c-027a-4069-8c66-49fe9309a163/volumes" Mar 11 12:56:39 crc kubenswrapper[4816]: I0311 12:56:39.515377 4816 patch_prober.go:28] interesting 
pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:56:39 crc kubenswrapper[4816]: I0311 12:56:39.516188 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:56:39 crc kubenswrapper[4816]: I0311 12:56:39.516314 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:56:39 crc kubenswrapper[4816]: I0311 12:56:39.517511 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"82969b44556ee232154d78ccdc1672ee0b8f8d60f9110d4d7c57547eaa3f598d"} pod="openshift-machine-config-operator/machine-config-daemon-b4v82" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 12:56:39 crc kubenswrapper[4816]: I0311 12:56:39.517638 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" containerID="cri-o://82969b44556ee232154d78ccdc1672ee0b8f8d60f9110d4d7c57547eaa3f598d" gracePeriod=600 Mar 11 12:56:40 crc kubenswrapper[4816]: I0311 12:56:40.327551 4816 generic.go:334] "Generic (PLEG): container finished" podID="7fdff21c-644f-4443-a268-f98c91ea120a" containerID="82969b44556ee232154d78ccdc1672ee0b8f8d60f9110d4d7c57547eaa3f598d" exitCode=0 Mar 11 12:56:40 crc kubenswrapper[4816]: I0311 12:56:40.327638 
4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerDied","Data":"82969b44556ee232154d78ccdc1672ee0b8f8d60f9110d4d7c57547eaa3f598d"} Mar 11 12:56:40 crc kubenswrapper[4816]: I0311 12:56:40.328107 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerStarted","Data":"64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5"} Mar 11 12:56:40 crc kubenswrapper[4816]: I0311 12:56:40.328145 4816 scope.go:117] "RemoveContainer" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334" Mar 11 12:56:48 crc kubenswrapper[4816]: I0311 12:56:48.858324 4816 scope.go:117] "RemoveContainer" containerID="24d57408e6d94ff8c7de8f3b9883efb12d44570ac11551e03845bb25056d71b0" Mar 11 12:58:00 crc kubenswrapper[4816]: I0311 12:58:00.163286 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553898-tcpck"] Mar 11 12:58:00 crc kubenswrapper[4816]: E0311 12:58:00.164592 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48071a6c-027a-4069-8c66-49fe9309a163" containerName="extract-utilities" Mar 11 12:58:00 crc kubenswrapper[4816]: I0311 12:58:00.164616 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="48071a6c-027a-4069-8c66-49fe9309a163" containerName="extract-utilities" Mar 11 12:58:00 crc kubenswrapper[4816]: E0311 12:58:00.164645 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48071a6c-027a-4069-8c66-49fe9309a163" containerName="extract-content" Mar 11 12:58:00 crc kubenswrapper[4816]: I0311 12:58:00.164654 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="48071a6c-027a-4069-8c66-49fe9309a163" containerName="extract-content" Mar 11 12:58:00 crc kubenswrapper[4816]: E0311 12:58:00.164669 
4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48071a6c-027a-4069-8c66-49fe9309a163" containerName="registry-server" Mar 11 12:58:00 crc kubenswrapper[4816]: I0311 12:58:00.164679 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="48071a6c-027a-4069-8c66-49fe9309a163" containerName="registry-server" Mar 11 12:58:00 crc kubenswrapper[4816]: I0311 12:58:00.164867 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="48071a6c-027a-4069-8c66-49fe9309a163" containerName="registry-server" Mar 11 12:58:00 crc kubenswrapper[4816]: I0311 12:58:00.165617 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553898-tcpck" Mar 11 12:58:00 crc kubenswrapper[4816]: I0311 12:58:00.168862 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 12:58:00 crc kubenswrapper[4816]: I0311 12:58:00.169602 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 12:58:00 crc kubenswrapper[4816]: I0311 12:58:00.170972 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553898-tcpck"] Mar 11 12:58:00 crc kubenswrapper[4816]: I0311 12:58:00.171360 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 12:58:00 crc kubenswrapper[4816]: I0311 12:58:00.273071 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc7mk\" (UniqueName: \"kubernetes.io/projected/30f75061-6a64-4c1d-b9f9-77f6425ad4c5-kube-api-access-xc7mk\") pod \"auto-csr-approver-29553898-tcpck\" (UID: \"30f75061-6a64-4c1d-b9f9-77f6425ad4c5\") " pod="openshift-infra/auto-csr-approver-29553898-tcpck" Mar 11 12:58:00 crc kubenswrapper[4816]: I0311 12:58:00.374687 4816 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-xc7mk\" (UniqueName: \"kubernetes.io/projected/30f75061-6a64-4c1d-b9f9-77f6425ad4c5-kube-api-access-xc7mk\") pod \"auto-csr-approver-29553898-tcpck\" (UID: \"30f75061-6a64-4c1d-b9f9-77f6425ad4c5\") " pod="openshift-infra/auto-csr-approver-29553898-tcpck" Mar 11 12:58:00 crc kubenswrapper[4816]: I0311 12:58:00.396032 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc7mk\" (UniqueName: \"kubernetes.io/projected/30f75061-6a64-4c1d-b9f9-77f6425ad4c5-kube-api-access-xc7mk\") pod \"auto-csr-approver-29553898-tcpck\" (UID: \"30f75061-6a64-4c1d-b9f9-77f6425ad4c5\") " pod="openshift-infra/auto-csr-approver-29553898-tcpck" Mar 11 12:58:00 crc kubenswrapper[4816]: I0311 12:58:00.486152 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553898-tcpck" Mar 11 12:58:00 crc kubenswrapper[4816]: I0311 12:58:00.978733 4816 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 12:58:00 crc kubenswrapper[4816]: I0311 12:58:00.981346 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553898-tcpck"] Mar 11 12:58:01 crc kubenswrapper[4816]: I0311 12:58:01.023321 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553898-tcpck" event={"ID":"30f75061-6a64-4c1d-b9f9-77f6425ad4c5","Type":"ContainerStarted","Data":"0e4d0465c032f5dec6dcec2a7498ded923a9f68afec618eb9cc057d9458104d0"} Mar 11 12:58:02 crc kubenswrapper[4816]: I0311 12:58:02.456217 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zkv9h"] Mar 11 12:58:02 crc kubenswrapper[4816]: I0311 12:58:02.458259 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zkv9h" Mar 11 12:58:02 crc kubenswrapper[4816]: I0311 12:58:02.466327 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zkv9h"] Mar 11 12:58:02 crc kubenswrapper[4816]: I0311 12:58:02.617417 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2j7k2\" (UniqueName: \"kubernetes.io/projected/d2ffaffb-3ae6-4747-b335-142213b1f4b5-kube-api-access-2j7k2\") pod \"redhat-marketplace-zkv9h\" (UID: \"d2ffaffb-3ae6-4747-b335-142213b1f4b5\") " pod="openshift-marketplace/redhat-marketplace-zkv9h" Mar 11 12:58:02 crc kubenswrapper[4816]: I0311 12:58:02.617543 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2ffaffb-3ae6-4747-b335-142213b1f4b5-catalog-content\") pod \"redhat-marketplace-zkv9h\" (UID: \"d2ffaffb-3ae6-4747-b335-142213b1f4b5\") " pod="openshift-marketplace/redhat-marketplace-zkv9h" Mar 11 12:58:02 crc kubenswrapper[4816]: I0311 12:58:02.617572 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2ffaffb-3ae6-4747-b335-142213b1f4b5-utilities\") pod \"redhat-marketplace-zkv9h\" (UID: \"d2ffaffb-3ae6-4747-b335-142213b1f4b5\") " pod="openshift-marketplace/redhat-marketplace-zkv9h" Mar 11 12:58:02 crc kubenswrapper[4816]: I0311 12:58:02.719734 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2j7k2\" (UniqueName: \"kubernetes.io/projected/d2ffaffb-3ae6-4747-b335-142213b1f4b5-kube-api-access-2j7k2\") pod \"redhat-marketplace-zkv9h\" (UID: \"d2ffaffb-3ae6-4747-b335-142213b1f4b5\") " pod="openshift-marketplace/redhat-marketplace-zkv9h" Mar 11 12:58:02 crc kubenswrapper[4816]: I0311 12:58:02.719850 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2ffaffb-3ae6-4747-b335-142213b1f4b5-catalog-content\") pod \"redhat-marketplace-zkv9h\" (UID: \"d2ffaffb-3ae6-4747-b335-142213b1f4b5\") " pod="openshift-marketplace/redhat-marketplace-zkv9h" Mar 11 12:58:02 crc kubenswrapper[4816]: I0311 12:58:02.719884 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2ffaffb-3ae6-4747-b335-142213b1f4b5-utilities\") pod \"redhat-marketplace-zkv9h\" (UID: \"d2ffaffb-3ae6-4747-b335-142213b1f4b5\") " pod="openshift-marketplace/redhat-marketplace-zkv9h" Mar 11 12:58:02 crc kubenswrapper[4816]: I0311 12:58:02.720636 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2ffaffb-3ae6-4747-b335-142213b1f4b5-utilities\") pod \"redhat-marketplace-zkv9h\" (UID: \"d2ffaffb-3ae6-4747-b335-142213b1f4b5\") " pod="openshift-marketplace/redhat-marketplace-zkv9h" Mar 11 12:58:02 crc kubenswrapper[4816]: I0311 12:58:02.720678 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2ffaffb-3ae6-4747-b335-142213b1f4b5-catalog-content\") pod \"redhat-marketplace-zkv9h\" (UID: \"d2ffaffb-3ae6-4747-b335-142213b1f4b5\") " pod="openshift-marketplace/redhat-marketplace-zkv9h" Mar 11 12:58:02 crc kubenswrapper[4816]: I0311 12:58:02.749497 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2j7k2\" (UniqueName: \"kubernetes.io/projected/d2ffaffb-3ae6-4747-b335-142213b1f4b5-kube-api-access-2j7k2\") pod \"redhat-marketplace-zkv9h\" (UID: \"d2ffaffb-3ae6-4747-b335-142213b1f4b5\") " pod="openshift-marketplace/redhat-marketplace-zkv9h" Mar 11 12:58:02 crc kubenswrapper[4816]: I0311 12:58:02.775422 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zkv9h" Mar 11 12:58:03 crc kubenswrapper[4816]: I0311 12:58:03.045617 4816 generic.go:334] "Generic (PLEG): container finished" podID="30f75061-6a64-4c1d-b9f9-77f6425ad4c5" containerID="0c0de876588cbf0205a01555bac817ffc9ad65f6cabe6192282136fce8802326" exitCode=0 Mar 11 12:58:03 crc kubenswrapper[4816]: I0311 12:58:03.045987 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553898-tcpck" event={"ID":"30f75061-6a64-4c1d-b9f9-77f6425ad4c5","Type":"ContainerDied","Data":"0c0de876588cbf0205a01555bac817ffc9ad65f6cabe6192282136fce8802326"} Mar 11 12:58:03 crc kubenswrapper[4816]: I0311 12:58:03.236869 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zkv9h"] Mar 11 12:58:04 crc kubenswrapper[4816]: I0311 12:58:04.058262 4816 generic.go:334] "Generic (PLEG): container finished" podID="d2ffaffb-3ae6-4747-b335-142213b1f4b5" containerID="2cd9cb2aa48fe25fd083e62ab793ddc96846d698efd4d18758135390a84de405" exitCode=0 Mar 11 12:58:04 crc kubenswrapper[4816]: I0311 12:58:04.058361 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zkv9h" event={"ID":"d2ffaffb-3ae6-4747-b335-142213b1f4b5","Type":"ContainerDied","Data":"2cd9cb2aa48fe25fd083e62ab793ddc96846d698efd4d18758135390a84de405"} Mar 11 12:58:04 crc kubenswrapper[4816]: I0311 12:58:04.058455 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zkv9h" event={"ID":"d2ffaffb-3ae6-4747-b335-142213b1f4b5","Type":"ContainerStarted","Data":"8599988ebbf02ada5db5c82da2f9c61c4470d15bbd29823853b790abf1c0ac6e"} Mar 11 12:58:04 crc kubenswrapper[4816]: I0311 12:58:04.405569 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553898-tcpck" Mar 11 12:58:04 crc kubenswrapper[4816]: I0311 12:58:04.549305 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xc7mk\" (UniqueName: \"kubernetes.io/projected/30f75061-6a64-4c1d-b9f9-77f6425ad4c5-kube-api-access-xc7mk\") pod \"30f75061-6a64-4c1d-b9f9-77f6425ad4c5\" (UID: \"30f75061-6a64-4c1d-b9f9-77f6425ad4c5\") " Mar 11 12:58:04 crc kubenswrapper[4816]: I0311 12:58:04.557153 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30f75061-6a64-4c1d-b9f9-77f6425ad4c5-kube-api-access-xc7mk" (OuterVolumeSpecName: "kube-api-access-xc7mk") pod "30f75061-6a64-4c1d-b9f9-77f6425ad4c5" (UID: "30f75061-6a64-4c1d-b9f9-77f6425ad4c5"). InnerVolumeSpecName "kube-api-access-xc7mk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:58:04 crc kubenswrapper[4816]: I0311 12:58:04.651676 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xc7mk\" (UniqueName: \"kubernetes.io/projected/30f75061-6a64-4c1d-b9f9-77f6425ad4c5-kube-api-access-xc7mk\") on node \"crc\" DevicePath \"\"" Mar 11 12:58:05 crc kubenswrapper[4816]: I0311 12:58:05.067571 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zkv9h" event={"ID":"d2ffaffb-3ae6-4747-b335-142213b1f4b5","Type":"ContainerStarted","Data":"66b3dcac7834e6d5440e328ad42960ab25abf89c447dbc321630cf0d3cf95126"} Mar 11 12:58:05 crc kubenswrapper[4816]: I0311 12:58:05.071354 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553898-tcpck" event={"ID":"30f75061-6a64-4c1d-b9f9-77f6425ad4c5","Type":"ContainerDied","Data":"0e4d0465c032f5dec6dcec2a7498ded923a9f68afec618eb9cc057d9458104d0"} Mar 11 12:58:05 crc kubenswrapper[4816]: I0311 12:58:05.071391 4816 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="0e4d0465c032f5dec6dcec2a7498ded923a9f68afec618eb9cc057d9458104d0" Mar 11 12:58:05 crc kubenswrapper[4816]: I0311 12:58:05.071441 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553898-tcpck" Mar 11 12:58:05 crc kubenswrapper[4816]: I0311 12:58:05.531952 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553892-p8pvb"] Mar 11 12:58:05 crc kubenswrapper[4816]: I0311 12:58:05.538203 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553892-p8pvb"] Mar 11 12:58:06 crc kubenswrapper[4816]: I0311 12:58:06.094856 4816 generic.go:334] "Generic (PLEG): container finished" podID="d2ffaffb-3ae6-4747-b335-142213b1f4b5" containerID="66b3dcac7834e6d5440e328ad42960ab25abf89c447dbc321630cf0d3cf95126" exitCode=0 Mar 11 12:58:06 crc kubenswrapper[4816]: I0311 12:58:06.095477 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zkv9h" event={"ID":"d2ffaffb-3ae6-4747-b335-142213b1f4b5","Type":"ContainerDied","Data":"66b3dcac7834e6d5440e328ad42960ab25abf89c447dbc321630cf0d3cf95126"} Mar 11 12:58:06 crc kubenswrapper[4816]: I0311 12:58:06.141390 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0930f466-9688-4eee-a82d-54a22a037535" path="/var/lib/kubelet/pods/0930f466-9688-4eee-a82d-54a22a037535/volumes" Mar 11 12:58:07 crc kubenswrapper[4816]: I0311 12:58:07.107024 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zkv9h" event={"ID":"d2ffaffb-3ae6-4747-b335-142213b1f4b5","Type":"ContainerStarted","Data":"f2fc6b35ea2e58e063a0ac329a00dc0849f16be3473ed5b8fe3c6ff4365618b8"} Mar 11 12:58:07 crc kubenswrapper[4816]: I0311 12:58:07.133654 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zkv9h" podStartSLOduration=2.690883193 
podStartE2EDuration="5.133626414s" podCreationTimestamp="2026-03-11 12:58:02 +0000 UTC" firstStartedPulling="2026-03-11 12:58:04.061608568 +0000 UTC m=+3570.652872545" lastFinishedPulling="2026-03-11 12:58:06.504351799 +0000 UTC m=+3573.095615766" observedRunningTime="2026-03-11 12:58:07.126569162 +0000 UTC m=+3573.717833129" watchObservedRunningTime="2026-03-11 12:58:07.133626414 +0000 UTC m=+3573.724890381" Mar 11 12:58:12 crc kubenswrapper[4816]: I0311 12:58:12.776066 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zkv9h" Mar 11 12:58:12 crc kubenswrapper[4816]: I0311 12:58:12.776563 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zkv9h" Mar 11 12:58:12 crc kubenswrapper[4816]: I0311 12:58:12.844558 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zkv9h" Mar 11 12:58:13 crc kubenswrapper[4816]: I0311 12:58:13.223288 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zkv9h" Mar 11 12:58:13 crc kubenswrapper[4816]: I0311 12:58:13.281341 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zkv9h"] Mar 11 12:58:15 crc kubenswrapper[4816]: I0311 12:58:15.191921 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zkv9h" podUID="d2ffaffb-3ae6-4747-b335-142213b1f4b5" containerName="registry-server" containerID="cri-o://f2fc6b35ea2e58e063a0ac329a00dc0849f16be3473ed5b8fe3c6ff4365618b8" gracePeriod=2 Mar 11 12:58:15 crc kubenswrapper[4816]: I0311 12:58:15.677444 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zkv9h" Mar 11 12:58:15 crc kubenswrapper[4816]: I0311 12:58:15.844058 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2ffaffb-3ae6-4747-b335-142213b1f4b5-utilities\") pod \"d2ffaffb-3ae6-4747-b335-142213b1f4b5\" (UID: \"d2ffaffb-3ae6-4747-b335-142213b1f4b5\") " Mar 11 12:58:15 crc kubenswrapper[4816]: I0311 12:58:15.844222 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2ffaffb-3ae6-4747-b335-142213b1f4b5-catalog-content\") pod \"d2ffaffb-3ae6-4747-b335-142213b1f4b5\" (UID: \"d2ffaffb-3ae6-4747-b335-142213b1f4b5\") " Mar 11 12:58:15 crc kubenswrapper[4816]: I0311 12:58:15.844549 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2j7k2\" (UniqueName: \"kubernetes.io/projected/d2ffaffb-3ae6-4747-b335-142213b1f4b5-kube-api-access-2j7k2\") pod \"d2ffaffb-3ae6-4747-b335-142213b1f4b5\" (UID: \"d2ffaffb-3ae6-4747-b335-142213b1f4b5\") " Mar 11 12:58:15 crc kubenswrapper[4816]: I0311 12:58:15.845945 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2ffaffb-3ae6-4747-b335-142213b1f4b5-utilities" (OuterVolumeSpecName: "utilities") pod "d2ffaffb-3ae6-4747-b335-142213b1f4b5" (UID: "d2ffaffb-3ae6-4747-b335-142213b1f4b5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:58:15 crc kubenswrapper[4816]: I0311 12:58:15.852615 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2ffaffb-3ae6-4747-b335-142213b1f4b5-kube-api-access-2j7k2" (OuterVolumeSpecName: "kube-api-access-2j7k2") pod "d2ffaffb-3ae6-4747-b335-142213b1f4b5" (UID: "d2ffaffb-3ae6-4747-b335-142213b1f4b5"). InnerVolumeSpecName "kube-api-access-2j7k2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:58:15 crc kubenswrapper[4816]: I0311 12:58:15.882976 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2ffaffb-3ae6-4747-b335-142213b1f4b5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d2ffaffb-3ae6-4747-b335-142213b1f4b5" (UID: "d2ffaffb-3ae6-4747-b335-142213b1f4b5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:58:15 crc kubenswrapper[4816]: I0311 12:58:15.946472 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2ffaffb-3ae6-4747-b335-142213b1f4b5-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 12:58:15 crc kubenswrapper[4816]: I0311 12:58:15.946505 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2ffaffb-3ae6-4747-b335-142213b1f4b5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 12:58:15 crc kubenswrapper[4816]: I0311 12:58:15.946517 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2j7k2\" (UniqueName: \"kubernetes.io/projected/d2ffaffb-3ae6-4747-b335-142213b1f4b5-kube-api-access-2j7k2\") on node \"crc\" DevicePath \"\"" Mar 11 12:58:16 crc kubenswrapper[4816]: I0311 12:58:16.209443 4816 generic.go:334] "Generic (PLEG): container finished" podID="d2ffaffb-3ae6-4747-b335-142213b1f4b5" containerID="f2fc6b35ea2e58e063a0ac329a00dc0849f16be3473ed5b8fe3c6ff4365618b8" exitCode=0 Mar 11 12:58:16 crc kubenswrapper[4816]: I0311 12:58:16.209552 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zkv9h" event={"ID":"d2ffaffb-3ae6-4747-b335-142213b1f4b5","Type":"ContainerDied","Data":"f2fc6b35ea2e58e063a0ac329a00dc0849f16be3473ed5b8fe3c6ff4365618b8"} Mar 11 12:58:16 crc kubenswrapper[4816]: I0311 12:58:16.209632 4816 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-zkv9h" event={"ID":"d2ffaffb-3ae6-4747-b335-142213b1f4b5","Type":"ContainerDied","Data":"8599988ebbf02ada5db5c82da2f9c61c4470d15bbd29823853b790abf1c0ac6e"} Mar 11 12:58:16 crc kubenswrapper[4816]: I0311 12:58:16.209677 4816 scope.go:117] "RemoveContainer" containerID="f2fc6b35ea2e58e063a0ac329a00dc0849f16be3473ed5b8fe3c6ff4365618b8" Mar 11 12:58:16 crc kubenswrapper[4816]: I0311 12:58:16.209973 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zkv9h" Mar 11 12:58:16 crc kubenswrapper[4816]: I0311 12:58:16.245471 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zkv9h"] Mar 11 12:58:16 crc kubenswrapper[4816]: I0311 12:58:16.251558 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zkv9h"] Mar 11 12:58:16 crc kubenswrapper[4816]: I0311 12:58:16.264818 4816 scope.go:117] "RemoveContainer" containerID="66b3dcac7834e6d5440e328ad42960ab25abf89c447dbc321630cf0d3cf95126" Mar 11 12:58:16 crc kubenswrapper[4816]: I0311 12:58:16.285423 4816 scope.go:117] "RemoveContainer" containerID="2cd9cb2aa48fe25fd083e62ab793ddc96846d698efd4d18758135390a84de405" Mar 11 12:58:16 crc kubenswrapper[4816]: I0311 12:58:16.315855 4816 scope.go:117] "RemoveContainer" containerID="f2fc6b35ea2e58e063a0ac329a00dc0849f16be3473ed5b8fe3c6ff4365618b8" Mar 11 12:58:16 crc kubenswrapper[4816]: E0311 12:58:16.316390 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2fc6b35ea2e58e063a0ac329a00dc0849f16be3473ed5b8fe3c6ff4365618b8\": container with ID starting with f2fc6b35ea2e58e063a0ac329a00dc0849f16be3473ed5b8fe3c6ff4365618b8 not found: ID does not exist" containerID="f2fc6b35ea2e58e063a0ac329a00dc0849f16be3473ed5b8fe3c6ff4365618b8" Mar 11 12:58:16 crc kubenswrapper[4816]: I0311 12:58:16.316434 4816 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2fc6b35ea2e58e063a0ac329a00dc0849f16be3473ed5b8fe3c6ff4365618b8"} err="failed to get container status \"f2fc6b35ea2e58e063a0ac329a00dc0849f16be3473ed5b8fe3c6ff4365618b8\": rpc error: code = NotFound desc = could not find container \"f2fc6b35ea2e58e063a0ac329a00dc0849f16be3473ed5b8fe3c6ff4365618b8\": container with ID starting with f2fc6b35ea2e58e063a0ac329a00dc0849f16be3473ed5b8fe3c6ff4365618b8 not found: ID does not exist" Mar 11 12:58:16 crc kubenswrapper[4816]: I0311 12:58:16.316464 4816 scope.go:117] "RemoveContainer" containerID="66b3dcac7834e6d5440e328ad42960ab25abf89c447dbc321630cf0d3cf95126" Mar 11 12:58:16 crc kubenswrapper[4816]: E0311 12:58:16.316746 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66b3dcac7834e6d5440e328ad42960ab25abf89c447dbc321630cf0d3cf95126\": container with ID starting with 66b3dcac7834e6d5440e328ad42960ab25abf89c447dbc321630cf0d3cf95126 not found: ID does not exist" containerID="66b3dcac7834e6d5440e328ad42960ab25abf89c447dbc321630cf0d3cf95126" Mar 11 12:58:16 crc kubenswrapper[4816]: I0311 12:58:16.316773 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66b3dcac7834e6d5440e328ad42960ab25abf89c447dbc321630cf0d3cf95126"} err="failed to get container status \"66b3dcac7834e6d5440e328ad42960ab25abf89c447dbc321630cf0d3cf95126\": rpc error: code = NotFound desc = could not find container \"66b3dcac7834e6d5440e328ad42960ab25abf89c447dbc321630cf0d3cf95126\": container with ID starting with 66b3dcac7834e6d5440e328ad42960ab25abf89c447dbc321630cf0d3cf95126 not found: ID does not exist" Mar 11 12:58:16 crc kubenswrapper[4816]: I0311 12:58:16.316788 4816 scope.go:117] "RemoveContainer" containerID="2cd9cb2aa48fe25fd083e62ab793ddc96846d698efd4d18758135390a84de405" Mar 11 12:58:16 crc kubenswrapper[4816]: E0311 
12:58:16.317127 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cd9cb2aa48fe25fd083e62ab793ddc96846d698efd4d18758135390a84de405\": container with ID starting with 2cd9cb2aa48fe25fd083e62ab793ddc96846d698efd4d18758135390a84de405 not found: ID does not exist" containerID="2cd9cb2aa48fe25fd083e62ab793ddc96846d698efd4d18758135390a84de405" Mar 11 12:58:16 crc kubenswrapper[4816]: I0311 12:58:16.317163 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cd9cb2aa48fe25fd083e62ab793ddc96846d698efd4d18758135390a84de405"} err="failed to get container status \"2cd9cb2aa48fe25fd083e62ab793ddc96846d698efd4d18758135390a84de405\": rpc error: code = NotFound desc = could not find container \"2cd9cb2aa48fe25fd083e62ab793ddc96846d698efd4d18758135390a84de405\": container with ID starting with 2cd9cb2aa48fe25fd083e62ab793ddc96846d698efd4d18758135390a84de405 not found: ID does not exist" Mar 11 12:58:18 crc kubenswrapper[4816]: I0311 12:58:18.145705 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2ffaffb-3ae6-4747-b335-142213b1f4b5" path="/var/lib/kubelet/pods/d2ffaffb-3ae6-4747-b335-142213b1f4b5/volumes" Mar 11 12:58:39 crc kubenswrapper[4816]: I0311 12:58:39.515975 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:58:39 crc kubenswrapper[4816]: I0311 12:58:39.516689 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 11 12:58:49 crc kubenswrapper[4816]: I0311 12:58:49.003800 4816 scope.go:117] "RemoveContainer" containerID="b23f59795fc03fe5ae5f308d14da26d3250f022f9dd89c94c78eb50bf14fca19" Mar 11 12:59:09 crc kubenswrapper[4816]: I0311 12:59:09.515868 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:59:09 crc kubenswrapper[4816]: I0311 12:59:09.516873 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:59:39 crc kubenswrapper[4816]: I0311 12:59:39.514784 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:59:39 crc kubenswrapper[4816]: I0311 12:59:39.515327 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:59:39 crc kubenswrapper[4816]: I0311 12:59:39.515377 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:59:39 crc kubenswrapper[4816]: I0311 12:59:39.516031 4816 kuberuntime_manager.go:1027] "Message for 
Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5"} pod="openshift-machine-config-operator/machine-config-daemon-b4v82" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 12:59:39 crc kubenswrapper[4816]: I0311 12:59:39.516090 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" containerID="cri-o://64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5" gracePeriod=600 Mar 11 12:59:39 crc kubenswrapper[4816]: E0311 12:59:39.641011 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:59:40 crc kubenswrapper[4816]: I0311 12:59:40.009130 4816 generic.go:334] "Generic (PLEG): container finished" podID="7fdff21c-644f-4443-a268-f98c91ea120a" containerID="64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5" exitCode=0 Mar 11 12:59:40 crc kubenswrapper[4816]: I0311 12:59:40.009189 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerDied","Data":"64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5"} Mar 11 12:59:40 crc kubenswrapper[4816]: I0311 12:59:40.009231 4816 scope.go:117] "RemoveContainer" containerID="82969b44556ee232154d78ccdc1672ee0b8f8d60f9110d4d7c57547eaa3f598d" Mar 11 12:59:40 crc 
kubenswrapper[4816]: I0311 12:59:40.009814 4816 scope.go:117] "RemoveContainer" containerID="64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5" Mar 11 12:59:40 crc kubenswrapper[4816]: E0311 12:59:40.010043 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:59:55 crc kubenswrapper[4816]: I0311 12:59:55.131348 4816 scope.go:117] "RemoveContainer" containerID="64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5" Mar 11 12:59:55 crc kubenswrapper[4816]: E0311 12:59:55.134990 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.153664 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553900-h76nw"] Mar 11 13:00:00 crc kubenswrapper[4816]: E0311 13:00:00.154685 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30f75061-6a64-4c1d-b9f9-77f6425ad4c5" containerName="oc" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.154703 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="30f75061-6a64-4c1d-b9f9-77f6425ad4c5" containerName="oc" Mar 11 13:00:00 crc kubenswrapper[4816]: E0311 13:00:00.154728 4816 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d2ffaffb-3ae6-4747-b335-142213b1f4b5" containerName="registry-server" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.154737 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2ffaffb-3ae6-4747-b335-142213b1f4b5" containerName="registry-server" Mar 11 13:00:00 crc kubenswrapper[4816]: E0311 13:00:00.154753 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2ffaffb-3ae6-4747-b335-142213b1f4b5" containerName="extract-content" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.154759 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2ffaffb-3ae6-4747-b335-142213b1f4b5" containerName="extract-content" Mar 11 13:00:00 crc kubenswrapper[4816]: E0311 13:00:00.154779 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2ffaffb-3ae6-4747-b335-142213b1f4b5" containerName="extract-utilities" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.154787 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2ffaffb-3ae6-4747-b335-142213b1f4b5" containerName="extract-utilities" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.154976 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="30f75061-6a64-4c1d-b9f9-77f6425ad4c5" containerName="oc" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.155014 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2ffaffb-3ae6-4747-b335-142213b1f4b5" containerName="registry-server" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.155657 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553900-h76nw" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.158744 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.159435 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.159655 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.163171 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553900-h76nw"] Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.255379 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553900-v98kq"] Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.256978 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553900-v98kq" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.259556 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.259765 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.264912 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553900-v98kq"] Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.297793 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9klbz\" (UniqueName: \"kubernetes.io/projected/9d463a78-830d-4b86-830a-e70345993927-kube-api-access-9klbz\") pod \"auto-csr-approver-29553900-h76nw\" (UID: \"9d463a78-830d-4b86-830a-e70345993927\") " pod="openshift-infra/auto-csr-approver-29553900-h76nw" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.400020 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f461fc9a-2ced-499e-a8a3-ab129c298ea7-config-volume\") pod \"collect-profiles-29553900-v98kq\" (UID: \"f461fc9a-2ced-499e-a8a3-ab129c298ea7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553900-v98kq" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.400449 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk42d\" (UniqueName: \"kubernetes.io/projected/f461fc9a-2ced-499e-a8a3-ab129c298ea7-kube-api-access-kk42d\") pod \"collect-profiles-29553900-v98kq\" (UID: \"f461fc9a-2ced-499e-a8a3-ab129c298ea7\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29553900-v98kq" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.400658 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9klbz\" (UniqueName: \"kubernetes.io/projected/9d463a78-830d-4b86-830a-e70345993927-kube-api-access-9klbz\") pod \"auto-csr-approver-29553900-h76nw\" (UID: \"9d463a78-830d-4b86-830a-e70345993927\") " pod="openshift-infra/auto-csr-approver-29553900-h76nw" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.400724 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f461fc9a-2ced-499e-a8a3-ab129c298ea7-secret-volume\") pod \"collect-profiles-29553900-v98kq\" (UID: \"f461fc9a-2ced-499e-a8a3-ab129c298ea7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553900-v98kq" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.422544 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9klbz\" (UniqueName: \"kubernetes.io/projected/9d463a78-830d-4b86-830a-e70345993927-kube-api-access-9klbz\") pod \"auto-csr-approver-29553900-h76nw\" (UID: \"9d463a78-830d-4b86-830a-e70345993927\") " pod="openshift-infra/auto-csr-approver-29553900-h76nw" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.477155 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553900-h76nw" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.502462 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kk42d\" (UniqueName: \"kubernetes.io/projected/f461fc9a-2ced-499e-a8a3-ab129c298ea7-kube-api-access-kk42d\") pod \"collect-profiles-29553900-v98kq\" (UID: \"f461fc9a-2ced-499e-a8a3-ab129c298ea7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553900-v98kq" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.502529 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f461fc9a-2ced-499e-a8a3-ab129c298ea7-secret-volume\") pod \"collect-profiles-29553900-v98kq\" (UID: \"f461fc9a-2ced-499e-a8a3-ab129c298ea7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553900-v98kq" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.502573 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f461fc9a-2ced-499e-a8a3-ab129c298ea7-config-volume\") pod \"collect-profiles-29553900-v98kq\" (UID: \"f461fc9a-2ced-499e-a8a3-ab129c298ea7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553900-v98kq" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.503940 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f461fc9a-2ced-499e-a8a3-ab129c298ea7-config-volume\") pod \"collect-profiles-29553900-v98kq\" (UID: \"f461fc9a-2ced-499e-a8a3-ab129c298ea7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553900-v98kq" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.508919 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/f461fc9a-2ced-499e-a8a3-ab129c298ea7-secret-volume\") pod \"collect-profiles-29553900-v98kq\" (UID: \"f461fc9a-2ced-499e-a8a3-ab129c298ea7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553900-v98kq" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.520218 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kk42d\" (UniqueName: \"kubernetes.io/projected/f461fc9a-2ced-499e-a8a3-ab129c298ea7-kube-api-access-kk42d\") pod \"collect-profiles-29553900-v98kq\" (UID: \"f461fc9a-2ced-499e-a8a3-ab129c298ea7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553900-v98kq" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.574910 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553900-v98kq" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.933601 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553900-h76nw"] Mar 11 13:00:01 crc kubenswrapper[4816]: I0311 13:00:01.039561 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553900-v98kq"] Mar 11 13:00:01 crc kubenswrapper[4816]: W0311 13:00:01.040740 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf461fc9a_2ced_499e_a8a3_ab129c298ea7.slice/crio-486aaa9f185ddfc81cdba8d3b57cd9ce1b56ffa613bf5a5e3b8dd6e1a0bf75e9 WatchSource:0}: Error finding container 486aaa9f185ddfc81cdba8d3b57cd9ce1b56ffa613bf5a5e3b8dd6e1a0bf75e9: Status 404 returned error can't find the container with id 486aaa9f185ddfc81cdba8d3b57cd9ce1b56ffa613bf5a5e3b8dd6e1a0bf75e9 Mar 11 13:00:01 crc kubenswrapper[4816]: I0311 13:00:01.224298 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553900-h76nw" 
event={"ID":"9d463a78-830d-4b86-830a-e70345993927","Type":"ContainerStarted","Data":"5d2ed3071d3084d6bd7e9cb365ae82c2e8219d1bd89320437712935bcd615238"} Mar 11 13:00:01 crc kubenswrapper[4816]: I0311 13:00:01.226948 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553900-v98kq" event={"ID":"f461fc9a-2ced-499e-a8a3-ab129c298ea7","Type":"ContainerStarted","Data":"5710219c402e66dc0b9662cdba2a41be288b420909be4715c88b70adba89aff6"} Mar 11 13:00:01 crc kubenswrapper[4816]: I0311 13:00:01.227007 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553900-v98kq" event={"ID":"f461fc9a-2ced-499e-a8a3-ab129c298ea7","Type":"ContainerStarted","Data":"486aaa9f185ddfc81cdba8d3b57cd9ce1b56ffa613bf5a5e3b8dd6e1a0bf75e9"} Mar 11 13:00:01 crc kubenswrapper[4816]: I0311 13:00:01.247490 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29553900-v98kq" podStartSLOduration=1.247464675 podStartE2EDuration="1.247464675s" podCreationTimestamp="2026-03-11 13:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 13:00:01.242981507 +0000 UTC m=+3687.834245474" watchObservedRunningTime="2026-03-11 13:00:01.247464675 +0000 UTC m=+3687.838728642" Mar 11 13:00:02 crc kubenswrapper[4816]: I0311 13:00:02.237466 4816 generic.go:334] "Generic (PLEG): container finished" podID="f461fc9a-2ced-499e-a8a3-ab129c298ea7" containerID="5710219c402e66dc0b9662cdba2a41be288b420909be4715c88b70adba89aff6" exitCode=0 Mar 11 13:00:02 crc kubenswrapper[4816]: I0311 13:00:02.237529 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553900-v98kq" 
event={"ID":"f461fc9a-2ced-499e-a8a3-ab129c298ea7","Type":"ContainerDied","Data":"5710219c402e66dc0b9662cdba2a41be288b420909be4715c88b70adba89aff6"} Mar 11 13:00:03 crc kubenswrapper[4816]: I0311 13:00:03.553933 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553900-v98kq" Mar 11 13:00:03 crc kubenswrapper[4816]: I0311 13:00:03.657195 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kk42d\" (UniqueName: \"kubernetes.io/projected/f461fc9a-2ced-499e-a8a3-ab129c298ea7-kube-api-access-kk42d\") pod \"f461fc9a-2ced-499e-a8a3-ab129c298ea7\" (UID: \"f461fc9a-2ced-499e-a8a3-ab129c298ea7\") " Mar 11 13:00:03 crc kubenswrapper[4816]: I0311 13:00:03.657365 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f461fc9a-2ced-499e-a8a3-ab129c298ea7-config-volume\") pod \"f461fc9a-2ced-499e-a8a3-ab129c298ea7\" (UID: \"f461fc9a-2ced-499e-a8a3-ab129c298ea7\") " Mar 11 13:00:03 crc kubenswrapper[4816]: I0311 13:00:03.657445 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f461fc9a-2ced-499e-a8a3-ab129c298ea7-secret-volume\") pod \"f461fc9a-2ced-499e-a8a3-ab129c298ea7\" (UID: \"f461fc9a-2ced-499e-a8a3-ab129c298ea7\") " Mar 11 13:00:03 crc kubenswrapper[4816]: I0311 13:00:03.658458 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f461fc9a-2ced-499e-a8a3-ab129c298ea7-config-volume" (OuterVolumeSpecName: "config-volume") pod "f461fc9a-2ced-499e-a8a3-ab129c298ea7" (UID: "f461fc9a-2ced-499e-a8a3-ab129c298ea7"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 13:00:03 crc kubenswrapper[4816]: I0311 13:00:03.664448 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f461fc9a-2ced-499e-a8a3-ab129c298ea7-kube-api-access-kk42d" (OuterVolumeSpecName: "kube-api-access-kk42d") pod "f461fc9a-2ced-499e-a8a3-ab129c298ea7" (UID: "f461fc9a-2ced-499e-a8a3-ab129c298ea7"). InnerVolumeSpecName "kube-api-access-kk42d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 13:00:03 crc kubenswrapper[4816]: I0311 13:00:03.664437 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f461fc9a-2ced-499e-a8a3-ab129c298ea7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f461fc9a-2ced-499e-a8a3-ab129c298ea7" (UID: "f461fc9a-2ced-499e-a8a3-ab129c298ea7"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 13:00:03 crc kubenswrapper[4816]: I0311 13:00:03.759392 4816 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f461fc9a-2ced-499e-a8a3-ab129c298ea7-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 11 13:00:03 crc kubenswrapper[4816]: I0311 13:00:03.759437 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kk42d\" (UniqueName: \"kubernetes.io/projected/f461fc9a-2ced-499e-a8a3-ab129c298ea7-kube-api-access-kk42d\") on node \"crc\" DevicePath \"\"" Mar 11 13:00:03 crc kubenswrapper[4816]: I0311 13:00:03.759450 4816 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f461fc9a-2ced-499e-a8a3-ab129c298ea7-config-volume\") on node \"crc\" DevicePath \"\"" Mar 11 13:00:04 crc kubenswrapper[4816]: I0311 13:00:04.255642 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553900-v98kq" 
event={"ID":"f461fc9a-2ced-499e-a8a3-ab129c298ea7","Type":"ContainerDied","Data":"486aaa9f185ddfc81cdba8d3b57cd9ce1b56ffa613bf5a5e3b8dd6e1a0bf75e9"} Mar 11 13:00:04 crc kubenswrapper[4816]: I0311 13:00:04.255707 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="486aaa9f185ddfc81cdba8d3b57cd9ce1b56ffa613bf5a5e3b8dd6e1a0bf75e9" Mar 11 13:00:04 crc kubenswrapper[4816]: I0311 13:00:04.255714 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553900-v98kq" Mar 11 13:00:04 crc kubenswrapper[4816]: I0311 13:00:04.321324 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553855-l4sqr"] Mar 11 13:00:04 crc kubenswrapper[4816]: I0311 13:00:04.326443 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553855-l4sqr"] Mar 11 13:00:05 crc kubenswrapper[4816]: I0311 13:00:05.265833 4816 generic.go:334] "Generic (PLEG): container finished" podID="9d463a78-830d-4b86-830a-e70345993927" containerID="02f79ceb28719ec9aa00f051068012e5f7850ccf8b02f5d8f4ecbb73c01a94f5" exitCode=0 Mar 11 13:00:05 crc kubenswrapper[4816]: I0311 13:00:05.265886 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553900-h76nw" event={"ID":"9d463a78-830d-4b86-830a-e70345993927","Type":"ContainerDied","Data":"02f79ceb28719ec9aa00f051068012e5f7850ccf8b02f5d8f4ecbb73c01a94f5"} Mar 11 13:00:06 crc kubenswrapper[4816]: I0311 13:00:06.136389 4816 scope.go:117] "RemoveContainer" containerID="64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5" Mar 11 13:00:06 crc kubenswrapper[4816]: E0311 13:00:06.136640 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:00:06 crc kubenswrapper[4816]: I0311 13:00:06.169711 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a876e965-7c6d-4773-9c9b-f445411c559b" path="/var/lib/kubelet/pods/a876e965-7c6d-4773-9c9b-f445411c559b/volumes" Mar 11 13:00:06 crc kubenswrapper[4816]: I0311 13:00:06.527078 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553900-h76nw" Mar 11 13:00:06 crc kubenswrapper[4816]: I0311 13:00:06.706932 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9klbz\" (UniqueName: \"kubernetes.io/projected/9d463a78-830d-4b86-830a-e70345993927-kube-api-access-9klbz\") pod \"9d463a78-830d-4b86-830a-e70345993927\" (UID: \"9d463a78-830d-4b86-830a-e70345993927\") " Mar 11 13:00:06 crc kubenswrapper[4816]: I0311 13:00:06.712030 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d463a78-830d-4b86-830a-e70345993927-kube-api-access-9klbz" (OuterVolumeSpecName: "kube-api-access-9klbz") pod "9d463a78-830d-4b86-830a-e70345993927" (UID: "9d463a78-830d-4b86-830a-e70345993927"). InnerVolumeSpecName "kube-api-access-9klbz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 13:00:06 crc kubenswrapper[4816]: I0311 13:00:06.809073 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9klbz\" (UniqueName: \"kubernetes.io/projected/9d463a78-830d-4b86-830a-e70345993927-kube-api-access-9klbz\") on node \"crc\" DevicePath \"\"" Mar 11 13:00:07 crc kubenswrapper[4816]: I0311 13:00:07.283603 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553900-h76nw" event={"ID":"9d463a78-830d-4b86-830a-e70345993927","Type":"ContainerDied","Data":"5d2ed3071d3084d6bd7e9cb365ae82c2e8219d1bd89320437712935bcd615238"} Mar 11 13:00:07 crc kubenswrapper[4816]: I0311 13:00:07.283684 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d2ed3071d3084d6bd7e9cb365ae82c2e8219d1bd89320437712935bcd615238" Mar 11 13:00:07 crc kubenswrapper[4816]: I0311 13:00:07.284201 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553900-h76nw" Mar 11 13:00:07 crc kubenswrapper[4816]: I0311 13:00:07.582830 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553894-48d29"] Mar 11 13:00:07 crc kubenswrapper[4816]: I0311 13:00:07.587897 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553894-48d29"] Mar 11 13:00:08 crc kubenswrapper[4816]: I0311 13:00:08.142579 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca38c432-abfd-4e1c-8ea7-a0781390bb1d" path="/var/lib/kubelet/pods/ca38c432-abfd-4e1c-8ea7-a0781390bb1d/volumes" Mar 11 13:00:17 crc kubenswrapper[4816]: I0311 13:00:17.130624 4816 scope.go:117] "RemoveContainer" containerID="64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5" Mar 11 13:00:17 crc kubenswrapper[4816]: E0311 13:00:17.131390 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:00:31 crc kubenswrapper[4816]: I0311 13:00:31.132016 4816 scope.go:117] "RemoveContainer" containerID="64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5" Mar 11 13:00:31 crc kubenswrapper[4816]: E0311 13:00:31.135277 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:00:44 crc kubenswrapper[4816]: I0311 13:00:44.135119 4816 scope.go:117] "RemoveContainer" containerID="64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5" Mar 11 13:00:44 crc kubenswrapper[4816]: E0311 13:00:44.136394 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:00:49 crc kubenswrapper[4816]: I0311 13:00:49.121558 4816 scope.go:117] "RemoveContainer" containerID="f55a9848386a64adca827b95cdc172bd623f9f4d2757b50c73cba6bd74ab25e2" Mar 11 13:00:49 crc kubenswrapper[4816]: I0311 13:00:49.151613 4816 scope.go:117] "RemoveContainer" 
containerID="f4e7fa686a33e8ded3e5e43526bdf4db23b8a91e490e0b84842982390eec6764" Mar 11 13:00:59 crc kubenswrapper[4816]: I0311 13:00:59.130919 4816 scope.go:117] "RemoveContainer" containerID="64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5" Mar 11 13:00:59 crc kubenswrapper[4816]: E0311 13:00:59.132151 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:01:14 crc kubenswrapper[4816]: I0311 13:01:14.137799 4816 scope.go:117] "RemoveContainer" containerID="64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5" Mar 11 13:01:14 crc kubenswrapper[4816]: E0311 13:01:14.138660 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:01:29 crc kubenswrapper[4816]: I0311 13:01:29.130228 4816 scope.go:117] "RemoveContainer" containerID="64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5" Mar 11 13:01:29 crc kubenswrapper[4816]: E0311 13:01:29.131060 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:01:44 crc kubenswrapper[4816]: I0311 13:01:44.139808 4816 scope.go:117] "RemoveContainer" containerID="64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5" Mar 11 13:01:44 crc kubenswrapper[4816]: E0311 13:01:44.141031 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:01:56 crc kubenswrapper[4816]: I0311 13:01:56.131382 4816 scope.go:117] "RemoveContainer" containerID="64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5" Mar 11 13:01:56 crc kubenswrapper[4816]: E0311 13:01:56.133933 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:02:00 crc kubenswrapper[4816]: I0311 13:02:00.178746 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553902-lffcf"] Mar 11 13:02:00 crc kubenswrapper[4816]: E0311 13:02:00.179858 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f461fc9a-2ced-499e-a8a3-ab129c298ea7" containerName="collect-profiles" Mar 11 13:02:00 crc kubenswrapper[4816]: I0311 13:02:00.179883 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f461fc9a-2ced-499e-a8a3-ab129c298ea7" 
containerName="collect-profiles" Mar 11 13:02:00 crc kubenswrapper[4816]: E0311 13:02:00.179917 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d463a78-830d-4b86-830a-e70345993927" containerName="oc" Mar 11 13:02:00 crc kubenswrapper[4816]: I0311 13:02:00.179925 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d463a78-830d-4b86-830a-e70345993927" containerName="oc" Mar 11 13:02:00 crc kubenswrapper[4816]: I0311 13:02:00.180100 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f461fc9a-2ced-499e-a8a3-ab129c298ea7" containerName="collect-profiles" Mar 11 13:02:00 crc kubenswrapper[4816]: I0311 13:02:00.180117 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d463a78-830d-4b86-830a-e70345993927" containerName="oc" Mar 11 13:02:00 crc kubenswrapper[4816]: I0311 13:02:00.180808 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553902-lffcf" Mar 11 13:02:00 crc kubenswrapper[4816]: I0311 13:02:00.183948 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 13:02:00 crc kubenswrapper[4816]: I0311 13:02:00.183954 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 13:02:00 crc kubenswrapper[4816]: I0311 13:02:00.185701 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 13:02:00 crc kubenswrapper[4816]: I0311 13:02:00.211081 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553902-lffcf"] Mar 11 13:02:00 crc kubenswrapper[4816]: I0311 13:02:00.252268 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zndsc\" (UniqueName: \"kubernetes.io/projected/6a325766-41a7-415f-88ad-698627f015c1-kube-api-access-zndsc\") pod 
\"auto-csr-approver-29553902-lffcf\" (UID: \"6a325766-41a7-415f-88ad-698627f015c1\") " pod="openshift-infra/auto-csr-approver-29553902-lffcf" Mar 11 13:02:00 crc kubenswrapper[4816]: I0311 13:02:00.354548 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zndsc\" (UniqueName: \"kubernetes.io/projected/6a325766-41a7-415f-88ad-698627f015c1-kube-api-access-zndsc\") pod \"auto-csr-approver-29553902-lffcf\" (UID: \"6a325766-41a7-415f-88ad-698627f015c1\") " pod="openshift-infra/auto-csr-approver-29553902-lffcf" Mar 11 13:02:00 crc kubenswrapper[4816]: I0311 13:02:00.374494 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zndsc\" (UniqueName: \"kubernetes.io/projected/6a325766-41a7-415f-88ad-698627f015c1-kube-api-access-zndsc\") pod \"auto-csr-approver-29553902-lffcf\" (UID: \"6a325766-41a7-415f-88ad-698627f015c1\") " pod="openshift-infra/auto-csr-approver-29553902-lffcf" Mar 11 13:02:00 crc kubenswrapper[4816]: I0311 13:02:00.507316 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553902-lffcf" Mar 11 13:02:01 crc kubenswrapper[4816]: I0311 13:02:01.003347 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553902-lffcf"] Mar 11 13:02:01 crc kubenswrapper[4816]: I0311 13:02:01.525704 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553902-lffcf" event={"ID":"6a325766-41a7-415f-88ad-698627f015c1","Type":"ContainerStarted","Data":"9ea757d1b65436ff49ee171bceb32375f57ad532f35a8fdb81e882f520694379"} Mar 11 13:02:02 crc kubenswrapper[4816]: I0311 13:02:02.539084 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553902-lffcf" event={"ID":"6a325766-41a7-415f-88ad-698627f015c1","Type":"ContainerStarted","Data":"6b272c0cd2cf4fb57145d2f34bc9f76d7316747da7af06ee61d93b20ed09cce5"} Mar 11 13:02:03 crc kubenswrapper[4816]: I0311 13:02:03.555456 4816 generic.go:334] "Generic (PLEG): container finished" podID="6a325766-41a7-415f-88ad-698627f015c1" containerID="6b272c0cd2cf4fb57145d2f34bc9f76d7316747da7af06ee61d93b20ed09cce5" exitCode=0 Mar 11 13:02:03 crc kubenswrapper[4816]: I0311 13:02:03.555555 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553902-lffcf" event={"ID":"6a325766-41a7-415f-88ad-698627f015c1","Type":"ContainerDied","Data":"6b272c0cd2cf4fb57145d2f34bc9f76d7316747da7af06ee61d93b20ed09cce5"} Mar 11 13:02:03 crc kubenswrapper[4816]: I0311 13:02:03.997333 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553902-lffcf" Mar 11 13:02:04 crc kubenswrapper[4816]: I0311 13:02:04.125486 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zndsc\" (UniqueName: \"kubernetes.io/projected/6a325766-41a7-415f-88ad-698627f015c1-kube-api-access-zndsc\") pod \"6a325766-41a7-415f-88ad-698627f015c1\" (UID: \"6a325766-41a7-415f-88ad-698627f015c1\") " Mar 11 13:02:04 crc kubenswrapper[4816]: I0311 13:02:04.137393 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a325766-41a7-415f-88ad-698627f015c1-kube-api-access-zndsc" (OuterVolumeSpecName: "kube-api-access-zndsc") pod "6a325766-41a7-415f-88ad-698627f015c1" (UID: "6a325766-41a7-415f-88ad-698627f015c1"). InnerVolumeSpecName "kube-api-access-zndsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 13:02:04 crc kubenswrapper[4816]: I0311 13:02:04.227393 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zndsc\" (UniqueName: \"kubernetes.io/projected/6a325766-41a7-415f-88ad-698627f015c1-kube-api-access-zndsc\") on node \"crc\" DevicePath \"\"" Mar 11 13:02:04 crc kubenswrapper[4816]: I0311 13:02:04.568063 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553902-lffcf" event={"ID":"6a325766-41a7-415f-88ad-698627f015c1","Type":"ContainerDied","Data":"9ea757d1b65436ff49ee171bceb32375f57ad532f35a8fdb81e882f520694379"} Mar 11 13:02:04 crc kubenswrapper[4816]: I0311 13:02:04.568131 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ea757d1b65436ff49ee171bceb32375f57ad532f35a8fdb81e882f520694379" Mar 11 13:02:04 crc kubenswrapper[4816]: I0311 13:02:04.568132 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553902-lffcf" Mar 11 13:02:05 crc kubenswrapper[4816]: I0311 13:02:05.088802 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553896-lxt87"] Mar 11 13:02:05 crc kubenswrapper[4816]: I0311 13:02:05.097275 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553896-lxt87"] Mar 11 13:02:06 crc kubenswrapper[4816]: I0311 13:02:06.141048 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2438ebe2-3bab-42fc-9430-8b2600a2efd1" path="/var/lib/kubelet/pods/2438ebe2-3bab-42fc-9430-8b2600a2efd1/volumes" Mar 11 13:02:07 crc kubenswrapper[4816]: I0311 13:02:07.130287 4816 scope.go:117] "RemoveContainer" containerID="64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5" Mar 11 13:02:07 crc kubenswrapper[4816]: E0311 13:02:07.130702 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:02:21 crc kubenswrapper[4816]: I0311 13:02:21.131136 4816 scope.go:117] "RemoveContainer" containerID="64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5" Mar 11 13:02:21 crc kubenswrapper[4816]: E0311 13:02:21.132479 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" 
podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:02:34 crc kubenswrapper[4816]: I0311 13:02:34.135972 4816 scope.go:117] "RemoveContainer" containerID="64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5" Mar 11 13:02:34 crc kubenswrapper[4816]: E0311 13:02:34.138971 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:02:46 crc kubenswrapper[4816]: I0311 13:02:46.131156 4816 scope.go:117] "RemoveContainer" containerID="64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5" Mar 11 13:02:46 crc kubenswrapper[4816]: E0311 13:02:46.132593 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:02:49 crc kubenswrapper[4816]: I0311 13:02:49.302225 4816 scope.go:117] "RemoveContainer" containerID="78e7b5f4d85a6eb5009a8b4bbf6ee9389d9e1a205f3bf787bb243af2af671b70" Mar 11 13:02:57 crc kubenswrapper[4816]: I0311 13:02:57.137057 4816 scope.go:117] "RemoveContainer" containerID="64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5" Mar 11 13:02:57 crc kubenswrapper[4816]: E0311 13:02:57.137880 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:03:09 crc kubenswrapper[4816]: I0311 13:03:09.130513 4816 scope.go:117] "RemoveContainer" containerID="64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5" Mar 11 13:03:09 crc kubenswrapper[4816]: E0311 13:03:09.131910 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:03:23 crc kubenswrapper[4816]: I0311 13:03:23.130781 4816 scope.go:117] "RemoveContainer" containerID="64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5" Mar 11 13:03:23 crc kubenswrapper[4816]: E0311 13:03:23.132293 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:03:35 crc kubenswrapper[4816]: I0311 13:03:35.105572 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hmmvj"] Mar 11 13:03:35 crc kubenswrapper[4816]: E0311 13:03:35.106583 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a325766-41a7-415f-88ad-698627f015c1" containerName="oc" Mar 11 13:03:35 crc kubenswrapper[4816]: 
I0311 13:03:35.106596 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a325766-41a7-415f-88ad-698627f015c1" containerName="oc" Mar 11 13:03:35 crc kubenswrapper[4816]: I0311 13:03:35.106801 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a325766-41a7-415f-88ad-698627f015c1" containerName="oc" Mar 11 13:03:35 crc kubenswrapper[4816]: I0311 13:03:35.108811 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hmmvj" Mar 11 13:03:35 crc kubenswrapper[4816]: I0311 13:03:35.129909 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hmmvj"] Mar 11 13:03:35 crc kubenswrapper[4816]: I0311 13:03:35.279567 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81dece66-cc39-4d85-b338-fe3626c87bff-catalog-content\") pod \"certified-operators-hmmvj\" (UID: \"81dece66-cc39-4d85-b338-fe3626c87bff\") " pod="openshift-marketplace/certified-operators-hmmvj" Mar 11 13:03:35 crc kubenswrapper[4816]: I0311 13:03:35.279639 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81dece66-cc39-4d85-b338-fe3626c87bff-utilities\") pod \"certified-operators-hmmvj\" (UID: \"81dece66-cc39-4d85-b338-fe3626c87bff\") " pod="openshift-marketplace/certified-operators-hmmvj" Mar 11 13:03:35 crc kubenswrapper[4816]: I0311 13:03:35.279896 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtgx8\" (UniqueName: \"kubernetes.io/projected/81dece66-cc39-4d85-b338-fe3626c87bff-kube-api-access-vtgx8\") pod \"certified-operators-hmmvj\" (UID: \"81dece66-cc39-4d85-b338-fe3626c87bff\") " pod="openshift-marketplace/certified-operators-hmmvj" Mar 11 13:03:35 crc kubenswrapper[4816]: I0311 
13:03:35.381750 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtgx8\" (UniqueName: \"kubernetes.io/projected/81dece66-cc39-4d85-b338-fe3626c87bff-kube-api-access-vtgx8\") pod \"certified-operators-hmmvj\" (UID: \"81dece66-cc39-4d85-b338-fe3626c87bff\") " pod="openshift-marketplace/certified-operators-hmmvj" Mar 11 13:03:35 crc kubenswrapper[4816]: I0311 13:03:35.381859 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81dece66-cc39-4d85-b338-fe3626c87bff-catalog-content\") pod \"certified-operators-hmmvj\" (UID: \"81dece66-cc39-4d85-b338-fe3626c87bff\") " pod="openshift-marketplace/certified-operators-hmmvj" Mar 11 13:03:35 crc kubenswrapper[4816]: I0311 13:03:35.381902 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81dece66-cc39-4d85-b338-fe3626c87bff-utilities\") pod \"certified-operators-hmmvj\" (UID: \"81dece66-cc39-4d85-b338-fe3626c87bff\") " pod="openshift-marketplace/certified-operators-hmmvj" Mar 11 13:03:35 crc kubenswrapper[4816]: I0311 13:03:35.382509 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81dece66-cc39-4d85-b338-fe3626c87bff-utilities\") pod \"certified-operators-hmmvj\" (UID: \"81dece66-cc39-4d85-b338-fe3626c87bff\") " pod="openshift-marketplace/certified-operators-hmmvj" Mar 11 13:03:35 crc kubenswrapper[4816]: I0311 13:03:35.382569 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81dece66-cc39-4d85-b338-fe3626c87bff-catalog-content\") pod \"certified-operators-hmmvj\" (UID: \"81dece66-cc39-4d85-b338-fe3626c87bff\") " pod="openshift-marketplace/certified-operators-hmmvj" Mar 11 13:03:35 crc kubenswrapper[4816]: I0311 13:03:35.403614 4816 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtgx8\" (UniqueName: \"kubernetes.io/projected/81dece66-cc39-4d85-b338-fe3626c87bff-kube-api-access-vtgx8\") pod \"certified-operators-hmmvj\" (UID: \"81dece66-cc39-4d85-b338-fe3626c87bff\") " pod="openshift-marketplace/certified-operators-hmmvj" Mar 11 13:03:35 crc kubenswrapper[4816]: I0311 13:03:35.452423 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hmmvj" Mar 11 13:03:35 crc kubenswrapper[4816]: I0311 13:03:35.944294 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hmmvj"] Mar 11 13:03:36 crc kubenswrapper[4816]: I0311 13:03:36.490744 4816 generic.go:334] "Generic (PLEG): container finished" podID="81dece66-cc39-4d85-b338-fe3626c87bff" containerID="8a34047447ea78c8a4031c4f7ae65781dc927463fd6ea54c340f3538cea03d43" exitCode=0 Mar 11 13:03:36 crc kubenswrapper[4816]: I0311 13:03:36.490865 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hmmvj" event={"ID":"81dece66-cc39-4d85-b338-fe3626c87bff","Type":"ContainerDied","Data":"8a34047447ea78c8a4031c4f7ae65781dc927463fd6ea54c340f3538cea03d43"} Mar 11 13:03:36 crc kubenswrapper[4816]: I0311 13:03:36.493336 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hmmvj" event={"ID":"81dece66-cc39-4d85-b338-fe3626c87bff","Type":"ContainerStarted","Data":"9deb2f31d1966f9a8f198c744b69afa0993e2b39aacd3fe0626d5fd630fed132"} Mar 11 13:03:36 crc kubenswrapper[4816]: I0311 13:03:36.493996 4816 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 13:03:38 crc kubenswrapper[4816]: I0311 13:03:38.131582 4816 scope.go:117] "RemoveContainer" containerID="64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5" Mar 11 13:03:38 crc kubenswrapper[4816]: E0311 
13:03:38.132225 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:03:38 crc kubenswrapper[4816]: I0311 13:03:38.525127 4816 generic.go:334] "Generic (PLEG): container finished" podID="81dece66-cc39-4d85-b338-fe3626c87bff" containerID="da8bb1d584a6b0c6a8a96827e7082c99fd714a201ca1d6a4a5d5b03b0ad00acb" exitCode=0 Mar 11 13:03:38 crc kubenswrapper[4816]: I0311 13:03:38.525267 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hmmvj" event={"ID":"81dece66-cc39-4d85-b338-fe3626c87bff","Type":"ContainerDied","Data":"da8bb1d584a6b0c6a8a96827e7082c99fd714a201ca1d6a4a5d5b03b0ad00acb"} Mar 11 13:03:39 crc kubenswrapper[4816]: I0311 13:03:39.538663 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hmmvj" event={"ID":"81dece66-cc39-4d85-b338-fe3626c87bff","Type":"ContainerStarted","Data":"dacfc282d3d9135418b829d24b97f079dd5415eca33df5c283dd29675a497972"} Mar 11 13:03:39 crc kubenswrapper[4816]: I0311 13:03:39.569790 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hmmvj" podStartSLOduration=1.934257251 podStartE2EDuration="4.569766334s" podCreationTimestamp="2026-03-11 13:03:35 +0000 UTC" firstStartedPulling="2026-03-11 13:03:36.493425315 +0000 UTC m=+3903.084689322" lastFinishedPulling="2026-03-11 13:03:39.128934398 +0000 UTC m=+3905.720198405" observedRunningTime="2026-03-11 13:03:39.56821444 +0000 UTC m=+3906.159478447" watchObservedRunningTime="2026-03-11 13:03:39.569766334 +0000 UTC m=+3906.161030321" Mar 11 13:03:45 
crc kubenswrapper[4816]: I0311 13:03:45.453137 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hmmvj" Mar 11 13:03:45 crc kubenswrapper[4816]: I0311 13:03:45.454449 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hmmvj" Mar 11 13:03:45 crc kubenswrapper[4816]: I0311 13:03:45.534347 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hmmvj" Mar 11 13:03:45 crc kubenswrapper[4816]: I0311 13:03:45.673607 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hmmvj" Mar 11 13:03:45 crc kubenswrapper[4816]: I0311 13:03:45.787285 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hmmvj"] Mar 11 13:03:47 crc kubenswrapper[4816]: I0311 13:03:47.639455 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hmmvj" podUID="81dece66-cc39-4d85-b338-fe3626c87bff" containerName="registry-server" containerID="cri-o://dacfc282d3d9135418b829d24b97f079dd5415eca33df5c283dd29675a497972" gracePeriod=2 Mar 11 13:03:48 crc kubenswrapper[4816]: I0311 13:03:48.218686 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hmmvj" Mar 11 13:03:48 crc kubenswrapper[4816]: I0311 13:03:48.236928 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81dece66-cc39-4d85-b338-fe3626c87bff-utilities\") pod \"81dece66-cc39-4d85-b338-fe3626c87bff\" (UID: \"81dece66-cc39-4d85-b338-fe3626c87bff\") " Mar 11 13:03:48 crc kubenswrapper[4816]: I0311 13:03:48.237057 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtgx8\" (UniqueName: \"kubernetes.io/projected/81dece66-cc39-4d85-b338-fe3626c87bff-kube-api-access-vtgx8\") pod \"81dece66-cc39-4d85-b338-fe3626c87bff\" (UID: \"81dece66-cc39-4d85-b338-fe3626c87bff\") " Mar 11 13:03:48 crc kubenswrapper[4816]: I0311 13:03:48.237223 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81dece66-cc39-4d85-b338-fe3626c87bff-catalog-content\") pod \"81dece66-cc39-4d85-b338-fe3626c87bff\" (UID: \"81dece66-cc39-4d85-b338-fe3626c87bff\") " Mar 11 13:03:48 crc kubenswrapper[4816]: I0311 13:03:48.238332 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81dece66-cc39-4d85-b338-fe3626c87bff-utilities" (OuterVolumeSpecName: "utilities") pod "81dece66-cc39-4d85-b338-fe3626c87bff" (UID: "81dece66-cc39-4d85-b338-fe3626c87bff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 13:03:48 crc kubenswrapper[4816]: I0311 13:03:48.251742 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81dece66-cc39-4d85-b338-fe3626c87bff-kube-api-access-vtgx8" (OuterVolumeSpecName: "kube-api-access-vtgx8") pod "81dece66-cc39-4d85-b338-fe3626c87bff" (UID: "81dece66-cc39-4d85-b338-fe3626c87bff"). InnerVolumeSpecName "kube-api-access-vtgx8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 13:03:48 crc kubenswrapper[4816]: I0311 13:03:48.339350 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81dece66-cc39-4d85-b338-fe3626c87bff-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 13:03:48 crc kubenswrapper[4816]: I0311 13:03:48.339388 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtgx8\" (UniqueName: \"kubernetes.io/projected/81dece66-cc39-4d85-b338-fe3626c87bff-kube-api-access-vtgx8\") on node \"crc\" DevicePath \"\"" Mar 11 13:03:48 crc kubenswrapper[4816]: I0311 13:03:48.339541 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81dece66-cc39-4d85-b338-fe3626c87bff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "81dece66-cc39-4d85-b338-fe3626c87bff" (UID: "81dece66-cc39-4d85-b338-fe3626c87bff"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 13:03:48 crc kubenswrapper[4816]: I0311 13:03:48.441493 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81dece66-cc39-4d85-b338-fe3626c87bff-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 13:03:48 crc kubenswrapper[4816]: I0311 13:03:48.654539 4816 generic.go:334] "Generic (PLEG): container finished" podID="81dece66-cc39-4d85-b338-fe3626c87bff" containerID="dacfc282d3d9135418b829d24b97f079dd5415eca33df5c283dd29675a497972" exitCode=0 Mar 11 13:03:48 crc kubenswrapper[4816]: I0311 13:03:48.654634 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hmmvj" event={"ID":"81dece66-cc39-4d85-b338-fe3626c87bff","Type":"ContainerDied","Data":"dacfc282d3d9135418b829d24b97f079dd5415eca33df5c283dd29675a497972"} Mar 11 13:03:48 crc kubenswrapper[4816]: I0311 13:03:48.654660 4816 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hmmvj" Mar 11 13:03:48 crc kubenswrapper[4816]: I0311 13:03:48.654694 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hmmvj" event={"ID":"81dece66-cc39-4d85-b338-fe3626c87bff","Type":"ContainerDied","Data":"9deb2f31d1966f9a8f198c744b69afa0993e2b39aacd3fe0626d5fd630fed132"} Mar 11 13:03:48 crc kubenswrapper[4816]: I0311 13:03:48.654739 4816 scope.go:117] "RemoveContainer" containerID="dacfc282d3d9135418b829d24b97f079dd5415eca33df5c283dd29675a497972" Mar 11 13:03:48 crc kubenswrapper[4816]: I0311 13:03:48.702614 4816 scope.go:117] "RemoveContainer" containerID="da8bb1d584a6b0c6a8a96827e7082c99fd714a201ca1d6a4a5d5b03b0ad00acb" Mar 11 13:03:48 crc kubenswrapper[4816]: I0311 13:03:48.729028 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hmmvj"] Mar 11 13:03:48 crc kubenswrapper[4816]: I0311 13:03:48.734961 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hmmvj"] Mar 11 13:03:48 crc kubenswrapper[4816]: I0311 13:03:48.739921 4816 scope.go:117] "RemoveContainer" containerID="8a34047447ea78c8a4031c4f7ae65781dc927463fd6ea54c340f3538cea03d43" Mar 11 13:03:48 crc kubenswrapper[4816]: I0311 13:03:48.767311 4816 scope.go:117] "RemoveContainer" containerID="dacfc282d3d9135418b829d24b97f079dd5415eca33df5c283dd29675a497972" Mar 11 13:03:48 crc kubenswrapper[4816]: E0311 13:03:48.768100 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dacfc282d3d9135418b829d24b97f079dd5415eca33df5c283dd29675a497972\": container with ID starting with dacfc282d3d9135418b829d24b97f079dd5415eca33df5c283dd29675a497972 not found: ID does not exist" containerID="dacfc282d3d9135418b829d24b97f079dd5415eca33df5c283dd29675a497972" Mar 11 13:03:48 crc kubenswrapper[4816]: I0311 13:03:48.768206 
4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dacfc282d3d9135418b829d24b97f079dd5415eca33df5c283dd29675a497972"} err="failed to get container status \"dacfc282d3d9135418b829d24b97f079dd5415eca33df5c283dd29675a497972\": rpc error: code = NotFound desc = could not find container \"dacfc282d3d9135418b829d24b97f079dd5415eca33df5c283dd29675a497972\": container with ID starting with dacfc282d3d9135418b829d24b97f079dd5415eca33df5c283dd29675a497972 not found: ID does not exist" Mar 11 13:03:48 crc kubenswrapper[4816]: I0311 13:03:48.768292 4816 scope.go:117] "RemoveContainer" containerID="da8bb1d584a6b0c6a8a96827e7082c99fd714a201ca1d6a4a5d5b03b0ad00acb" Mar 11 13:03:48 crc kubenswrapper[4816]: E0311 13:03:48.768998 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da8bb1d584a6b0c6a8a96827e7082c99fd714a201ca1d6a4a5d5b03b0ad00acb\": container with ID starting with da8bb1d584a6b0c6a8a96827e7082c99fd714a201ca1d6a4a5d5b03b0ad00acb not found: ID does not exist" containerID="da8bb1d584a6b0c6a8a96827e7082c99fd714a201ca1d6a4a5d5b03b0ad00acb" Mar 11 13:03:48 crc kubenswrapper[4816]: I0311 13:03:48.769063 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da8bb1d584a6b0c6a8a96827e7082c99fd714a201ca1d6a4a5d5b03b0ad00acb"} err="failed to get container status \"da8bb1d584a6b0c6a8a96827e7082c99fd714a201ca1d6a4a5d5b03b0ad00acb\": rpc error: code = NotFound desc = could not find container \"da8bb1d584a6b0c6a8a96827e7082c99fd714a201ca1d6a4a5d5b03b0ad00acb\": container with ID starting with da8bb1d584a6b0c6a8a96827e7082c99fd714a201ca1d6a4a5d5b03b0ad00acb not found: ID does not exist" Mar 11 13:03:48 crc kubenswrapper[4816]: I0311 13:03:48.769114 4816 scope.go:117] "RemoveContainer" containerID="8a34047447ea78c8a4031c4f7ae65781dc927463fd6ea54c340f3538cea03d43" Mar 11 13:03:48 crc kubenswrapper[4816]: E0311 
13:03:48.769648 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a34047447ea78c8a4031c4f7ae65781dc927463fd6ea54c340f3538cea03d43\": container with ID starting with 8a34047447ea78c8a4031c4f7ae65781dc927463fd6ea54c340f3538cea03d43 not found: ID does not exist" containerID="8a34047447ea78c8a4031c4f7ae65781dc927463fd6ea54c340f3538cea03d43" Mar 11 13:03:48 crc kubenswrapper[4816]: I0311 13:03:48.769682 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a34047447ea78c8a4031c4f7ae65781dc927463fd6ea54c340f3538cea03d43"} err="failed to get container status \"8a34047447ea78c8a4031c4f7ae65781dc927463fd6ea54c340f3538cea03d43\": rpc error: code = NotFound desc = could not find container \"8a34047447ea78c8a4031c4f7ae65781dc927463fd6ea54c340f3538cea03d43\": container with ID starting with 8a34047447ea78c8a4031c4f7ae65781dc927463fd6ea54c340f3538cea03d43 not found: ID does not exist" Mar 11 13:03:50 crc kubenswrapper[4816]: I0311 13:03:50.149105 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81dece66-cc39-4d85-b338-fe3626c87bff" path="/var/lib/kubelet/pods/81dece66-cc39-4d85-b338-fe3626c87bff/volumes" Mar 11 13:03:52 crc kubenswrapper[4816]: I0311 13:03:52.131705 4816 scope.go:117] "RemoveContainer" containerID="64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5" Mar 11 13:03:52 crc kubenswrapper[4816]: E0311 13:03:52.132591 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:04:00 crc kubenswrapper[4816]: I0311 13:04:00.184766 
4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553904-2bx7k"] Mar 11 13:04:00 crc kubenswrapper[4816]: E0311 13:04:00.186411 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81dece66-cc39-4d85-b338-fe3626c87bff" containerName="extract-utilities" Mar 11 13:04:00 crc kubenswrapper[4816]: I0311 13:04:00.186440 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="81dece66-cc39-4d85-b338-fe3626c87bff" containerName="extract-utilities" Mar 11 13:04:00 crc kubenswrapper[4816]: E0311 13:04:00.186482 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81dece66-cc39-4d85-b338-fe3626c87bff" containerName="registry-server" Mar 11 13:04:00 crc kubenswrapper[4816]: I0311 13:04:00.186498 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="81dece66-cc39-4d85-b338-fe3626c87bff" containerName="registry-server" Mar 11 13:04:00 crc kubenswrapper[4816]: E0311 13:04:00.186541 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81dece66-cc39-4d85-b338-fe3626c87bff" containerName="extract-content" Mar 11 13:04:00 crc kubenswrapper[4816]: I0311 13:04:00.186556 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="81dece66-cc39-4d85-b338-fe3626c87bff" containerName="extract-content" Mar 11 13:04:00 crc kubenswrapper[4816]: I0311 13:04:00.186826 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="81dece66-cc39-4d85-b338-fe3626c87bff" containerName="registry-server" Mar 11 13:04:00 crc kubenswrapper[4816]: I0311 13:04:00.187758 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553904-2bx7k" Mar 11 13:04:00 crc kubenswrapper[4816]: I0311 13:04:00.191399 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 13:04:00 crc kubenswrapper[4816]: I0311 13:04:00.191493 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 13:04:00 crc kubenswrapper[4816]: I0311 13:04:00.192954 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 13:04:00 crc kubenswrapper[4816]: I0311 13:04:00.199678 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553904-2bx7k"] Mar 11 13:04:00 crc kubenswrapper[4816]: I0311 13:04:00.259296 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkl7d\" (UniqueName: \"kubernetes.io/projected/f176ec9f-47de-4710-a5f3-078403bb4bfb-kube-api-access-tkl7d\") pod \"auto-csr-approver-29553904-2bx7k\" (UID: \"f176ec9f-47de-4710-a5f3-078403bb4bfb\") " pod="openshift-infra/auto-csr-approver-29553904-2bx7k" Mar 11 13:04:00 crc kubenswrapper[4816]: I0311 13:04:00.364417 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkl7d\" (UniqueName: \"kubernetes.io/projected/f176ec9f-47de-4710-a5f3-078403bb4bfb-kube-api-access-tkl7d\") pod \"auto-csr-approver-29553904-2bx7k\" (UID: \"f176ec9f-47de-4710-a5f3-078403bb4bfb\") " pod="openshift-infra/auto-csr-approver-29553904-2bx7k" Mar 11 13:04:00 crc kubenswrapper[4816]: I0311 13:04:00.400353 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkl7d\" (UniqueName: \"kubernetes.io/projected/f176ec9f-47de-4710-a5f3-078403bb4bfb-kube-api-access-tkl7d\") pod \"auto-csr-approver-29553904-2bx7k\" (UID: \"f176ec9f-47de-4710-a5f3-078403bb4bfb\") " 
pod="openshift-infra/auto-csr-approver-29553904-2bx7k" Mar 11 13:04:00 crc kubenswrapper[4816]: I0311 13:04:00.522916 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553904-2bx7k" Mar 11 13:04:01 crc kubenswrapper[4816]: I0311 13:04:01.136537 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553904-2bx7k"] Mar 11 13:04:01 crc kubenswrapper[4816]: I0311 13:04:01.799741 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553904-2bx7k" event={"ID":"f176ec9f-47de-4710-a5f3-078403bb4bfb","Type":"ContainerStarted","Data":"bf633af1b6ecc571dcc5fcee156c69d430901deaca33b2db83e0d83c262bd147"} Mar 11 13:04:02 crc kubenswrapper[4816]: I0311 13:04:02.811113 4816 generic.go:334] "Generic (PLEG): container finished" podID="f176ec9f-47de-4710-a5f3-078403bb4bfb" containerID="c0c120d96d0731c58ebb4a66094eed03724800f299fa6a22258f239a945115e0" exitCode=0 Mar 11 13:04:02 crc kubenswrapper[4816]: I0311 13:04:02.811329 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553904-2bx7k" event={"ID":"f176ec9f-47de-4710-a5f3-078403bb4bfb","Type":"ContainerDied","Data":"c0c120d96d0731c58ebb4a66094eed03724800f299fa6a22258f239a945115e0"} Mar 11 13:04:04 crc kubenswrapper[4816]: I0311 13:04:04.287958 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553904-2bx7k" Mar 11 13:04:04 crc kubenswrapper[4816]: I0311 13:04:04.440089 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkl7d\" (UniqueName: \"kubernetes.io/projected/f176ec9f-47de-4710-a5f3-078403bb4bfb-kube-api-access-tkl7d\") pod \"f176ec9f-47de-4710-a5f3-078403bb4bfb\" (UID: \"f176ec9f-47de-4710-a5f3-078403bb4bfb\") " Mar 11 13:04:04 crc kubenswrapper[4816]: I0311 13:04:04.450369 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f176ec9f-47de-4710-a5f3-078403bb4bfb-kube-api-access-tkl7d" (OuterVolumeSpecName: "kube-api-access-tkl7d") pod "f176ec9f-47de-4710-a5f3-078403bb4bfb" (UID: "f176ec9f-47de-4710-a5f3-078403bb4bfb"). InnerVolumeSpecName "kube-api-access-tkl7d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 13:04:04 crc kubenswrapper[4816]: I0311 13:04:04.542458 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkl7d\" (UniqueName: \"kubernetes.io/projected/f176ec9f-47de-4710-a5f3-078403bb4bfb-kube-api-access-tkl7d\") on node \"crc\" DevicePath \"\"" Mar 11 13:04:04 crc kubenswrapper[4816]: I0311 13:04:04.839429 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553904-2bx7k" event={"ID":"f176ec9f-47de-4710-a5f3-078403bb4bfb","Type":"ContainerDied","Data":"bf633af1b6ecc571dcc5fcee156c69d430901deaca33b2db83e0d83c262bd147"} Mar 11 13:04:04 crc kubenswrapper[4816]: I0311 13:04:04.839506 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf633af1b6ecc571dcc5fcee156c69d430901deaca33b2db83e0d83c262bd147" Mar 11 13:04:04 crc kubenswrapper[4816]: I0311 13:04:04.839607 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553904-2bx7k" Mar 11 13:04:05 crc kubenswrapper[4816]: I0311 13:04:05.385746 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553898-tcpck"] Mar 11 13:04:05 crc kubenswrapper[4816]: I0311 13:04:05.397330 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553898-tcpck"] Mar 11 13:04:06 crc kubenswrapper[4816]: I0311 13:04:06.131188 4816 scope.go:117] "RemoveContainer" containerID="64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5" Mar 11 13:04:06 crc kubenswrapper[4816]: E0311 13:04:06.131788 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:04:06 crc kubenswrapper[4816]: I0311 13:04:06.154954 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30f75061-6a64-4c1d-b9f9-77f6425ad4c5" path="/var/lib/kubelet/pods/30f75061-6a64-4c1d-b9f9-77f6425ad4c5/volumes" Mar 11 13:04:18 crc kubenswrapper[4816]: I0311 13:04:18.130631 4816 scope.go:117] "RemoveContainer" containerID="64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5" Mar 11 13:04:18 crc kubenswrapper[4816]: E0311 13:04:18.131642 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" 
podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:04:33 crc kubenswrapper[4816]: I0311 13:04:33.132345 4816 scope.go:117] "RemoveContainer" containerID="64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5" Mar 11 13:04:33 crc kubenswrapper[4816]: E0311 13:04:33.133631 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:04:47 crc kubenswrapper[4816]: I0311 13:04:47.130406 4816 scope.go:117] "RemoveContainer" containerID="64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5" Mar 11 13:04:48 crc kubenswrapper[4816]: I0311 13:04:48.294015 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerStarted","Data":"93c58f402ba486c6c006c97994ab202bfd22495eff9729c60fbdfcbe918d3c5f"} Mar 11 13:04:49 crc kubenswrapper[4816]: I0311 13:04:49.419846 4816 scope.go:117] "RemoveContainer" containerID="0c0de876588cbf0205a01555bac817ffc9ad65f6cabe6192282136fce8802326" Mar 11 13:04:51 crc kubenswrapper[4816]: I0311 13:04:51.508729 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jtv9r"] Mar 11 13:04:51 crc kubenswrapper[4816]: E0311 13:04:51.512181 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f176ec9f-47de-4710-a5f3-078403bb4bfb" containerName="oc" Mar 11 13:04:51 crc kubenswrapper[4816]: I0311 13:04:51.512387 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f176ec9f-47de-4710-a5f3-078403bb4bfb" containerName="oc" Mar 11 13:04:51 crc kubenswrapper[4816]: I0311 
13:04:51.512777 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f176ec9f-47de-4710-a5f3-078403bb4bfb" containerName="oc" Mar 11 13:04:51 crc kubenswrapper[4816]: I0311 13:04:51.514808 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jtv9r" Mar 11 13:04:51 crc kubenswrapper[4816]: I0311 13:04:51.528986 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jtv9r"] Mar 11 13:04:51 crc kubenswrapper[4816]: I0311 13:04:51.553232 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52bf5aed-b67d-4fe6-8564-5756e640aa5d-catalog-content\") pod \"redhat-operators-jtv9r\" (UID: \"52bf5aed-b67d-4fe6-8564-5756e640aa5d\") " pod="openshift-marketplace/redhat-operators-jtv9r" Mar 11 13:04:51 crc kubenswrapper[4816]: I0311 13:04:51.553358 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52bf5aed-b67d-4fe6-8564-5756e640aa5d-utilities\") pod \"redhat-operators-jtv9r\" (UID: \"52bf5aed-b67d-4fe6-8564-5756e640aa5d\") " pod="openshift-marketplace/redhat-operators-jtv9r" Mar 11 13:04:51 crc kubenswrapper[4816]: I0311 13:04:51.553393 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjzsc\" (UniqueName: \"kubernetes.io/projected/52bf5aed-b67d-4fe6-8564-5756e640aa5d-kube-api-access-xjzsc\") pod \"redhat-operators-jtv9r\" (UID: \"52bf5aed-b67d-4fe6-8564-5756e640aa5d\") " pod="openshift-marketplace/redhat-operators-jtv9r" Mar 11 13:04:51 crc kubenswrapper[4816]: I0311 13:04:51.657351 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52bf5aed-b67d-4fe6-8564-5756e640aa5d-utilities\") pod 
\"redhat-operators-jtv9r\" (UID: \"52bf5aed-b67d-4fe6-8564-5756e640aa5d\") " pod="openshift-marketplace/redhat-operators-jtv9r" Mar 11 13:04:51 crc kubenswrapper[4816]: I0311 13:04:51.657447 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjzsc\" (UniqueName: \"kubernetes.io/projected/52bf5aed-b67d-4fe6-8564-5756e640aa5d-kube-api-access-xjzsc\") pod \"redhat-operators-jtv9r\" (UID: \"52bf5aed-b67d-4fe6-8564-5756e640aa5d\") " pod="openshift-marketplace/redhat-operators-jtv9r" Mar 11 13:04:51 crc kubenswrapper[4816]: I0311 13:04:51.657685 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52bf5aed-b67d-4fe6-8564-5756e640aa5d-catalog-content\") pod \"redhat-operators-jtv9r\" (UID: \"52bf5aed-b67d-4fe6-8564-5756e640aa5d\") " pod="openshift-marketplace/redhat-operators-jtv9r" Mar 11 13:04:51 crc kubenswrapper[4816]: I0311 13:04:51.658018 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52bf5aed-b67d-4fe6-8564-5756e640aa5d-utilities\") pod \"redhat-operators-jtv9r\" (UID: \"52bf5aed-b67d-4fe6-8564-5756e640aa5d\") " pod="openshift-marketplace/redhat-operators-jtv9r" Mar 11 13:04:51 crc kubenswrapper[4816]: I0311 13:04:51.658639 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52bf5aed-b67d-4fe6-8564-5756e640aa5d-catalog-content\") pod \"redhat-operators-jtv9r\" (UID: \"52bf5aed-b67d-4fe6-8564-5756e640aa5d\") " pod="openshift-marketplace/redhat-operators-jtv9r" Mar 11 13:04:51 crc kubenswrapper[4816]: I0311 13:04:51.680508 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjzsc\" (UniqueName: \"kubernetes.io/projected/52bf5aed-b67d-4fe6-8564-5756e640aa5d-kube-api-access-xjzsc\") pod \"redhat-operators-jtv9r\" (UID: 
\"52bf5aed-b67d-4fe6-8564-5756e640aa5d\") " pod="openshift-marketplace/redhat-operators-jtv9r" Mar 11 13:04:51 crc kubenswrapper[4816]: I0311 13:04:51.853378 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jtv9r" Mar 11 13:04:52 crc kubenswrapper[4816]: I0311 13:04:52.116601 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jtv9r"] Mar 11 13:04:53 crc kubenswrapper[4816]: I0311 13:04:53.344691 4816 generic.go:334] "Generic (PLEG): container finished" podID="52bf5aed-b67d-4fe6-8564-5756e640aa5d" containerID="c3b94d11cef2e239aec27c0d91fa2b19432a2a1cb9bf4dab3a136ed8f9e9ea36" exitCode=0 Mar 11 13:04:53 crc kubenswrapper[4816]: I0311 13:04:53.344770 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtv9r" event={"ID":"52bf5aed-b67d-4fe6-8564-5756e640aa5d","Type":"ContainerDied","Data":"c3b94d11cef2e239aec27c0d91fa2b19432a2a1cb9bf4dab3a136ed8f9e9ea36"} Mar 11 13:04:53 crc kubenswrapper[4816]: I0311 13:04:53.345175 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtv9r" event={"ID":"52bf5aed-b67d-4fe6-8564-5756e640aa5d","Type":"ContainerStarted","Data":"7dadfe53701c500277eb8a550c826e130375f4f58cab5a05c724a1695adb23b2"} Mar 11 13:04:55 crc kubenswrapper[4816]: I0311 13:04:55.370968 4816 generic.go:334] "Generic (PLEG): container finished" podID="52bf5aed-b67d-4fe6-8564-5756e640aa5d" containerID="d73d66abd522eb4e8ef558f84ce5c9a47ba7b97c75d5b7458411c4ff70e78569" exitCode=0 Mar 11 13:04:55 crc kubenswrapper[4816]: I0311 13:04:55.371106 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtv9r" event={"ID":"52bf5aed-b67d-4fe6-8564-5756e640aa5d","Type":"ContainerDied","Data":"d73d66abd522eb4e8ef558f84ce5c9a47ba7b97c75d5b7458411c4ff70e78569"} Mar 11 13:04:57 crc kubenswrapper[4816]: I0311 13:04:57.402058 4816 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtv9r" event={"ID":"52bf5aed-b67d-4fe6-8564-5756e640aa5d","Type":"ContainerStarted","Data":"6cdfd849eb2bd6ad5fc15276a0cd1417dcdbfb90ed4dda4160f33d2166307fd7"} Mar 11 13:04:57 crc kubenswrapper[4816]: I0311 13:04:57.436384 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jtv9r" podStartSLOduration=3.170639878 podStartE2EDuration="6.436358033s" podCreationTimestamp="2026-03-11 13:04:51 +0000 UTC" firstStartedPulling="2026-03-11 13:04:53.348159539 +0000 UTC m=+3979.939423506" lastFinishedPulling="2026-03-11 13:04:56.613877664 +0000 UTC m=+3983.205141661" observedRunningTime="2026-03-11 13:04:57.431103372 +0000 UTC m=+3984.022367379" watchObservedRunningTime="2026-03-11 13:04:57.436358033 +0000 UTC m=+3984.027622040" Mar 11 13:05:01 crc kubenswrapper[4816]: I0311 13:05:01.854097 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jtv9r" Mar 11 13:05:01 crc kubenswrapper[4816]: I0311 13:05:01.854919 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jtv9r" Mar 11 13:05:02 crc kubenswrapper[4816]: I0311 13:05:02.919965 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jtv9r" podUID="52bf5aed-b67d-4fe6-8564-5756e640aa5d" containerName="registry-server" probeResult="failure" output=< Mar 11 13:05:02 crc kubenswrapper[4816]: timeout: failed to connect service ":50051" within 1s Mar 11 13:05:02 crc kubenswrapper[4816]: > Mar 11 13:05:11 crc kubenswrapper[4816]: I0311 13:05:11.942985 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jtv9r" Mar 11 13:05:12 crc kubenswrapper[4816]: I0311 13:05:12.019086 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-jtv9r" Mar 11 13:05:12 crc kubenswrapper[4816]: I0311 13:05:12.202407 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jtv9r"] Mar 11 13:05:13 crc kubenswrapper[4816]: I0311 13:05:13.565150 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jtv9r" podUID="52bf5aed-b67d-4fe6-8564-5756e640aa5d" containerName="registry-server" containerID="cri-o://6cdfd849eb2bd6ad5fc15276a0cd1417dcdbfb90ed4dda4160f33d2166307fd7" gracePeriod=2 Mar 11 13:05:14 crc kubenswrapper[4816]: I0311 13:05:14.102930 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jtv9r" Mar 11 13:05:14 crc kubenswrapper[4816]: I0311 13:05:14.188377 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52bf5aed-b67d-4fe6-8564-5756e640aa5d-catalog-content\") pod \"52bf5aed-b67d-4fe6-8564-5756e640aa5d\" (UID: \"52bf5aed-b67d-4fe6-8564-5756e640aa5d\") " Mar 11 13:05:14 crc kubenswrapper[4816]: I0311 13:05:14.188571 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjzsc\" (UniqueName: \"kubernetes.io/projected/52bf5aed-b67d-4fe6-8564-5756e640aa5d-kube-api-access-xjzsc\") pod \"52bf5aed-b67d-4fe6-8564-5756e640aa5d\" (UID: \"52bf5aed-b67d-4fe6-8564-5756e640aa5d\") " Mar 11 13:05:14 crc kubenswrapper[4816]: I0311 13:05:14.188744 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52bf5aed-b67d-4fe6-8564-5756e640aa5d-utilities\") pod \"52bf5aed-b67d-4fe6-8564-5756e640aa5d\" (UID: \"52bf5aed-b67d-4fe6-8564-5756e640aa5d\") " Mar 11 13:05:14 crc kubenswrapper[4816]: I0311 13:05:14.190264 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/52bf5aed-b67d-4fe6-8564-5756e640aa5d-utilities" (OuterVolumeSpecName: "utilities") pod "52bf5aed-b67d-4fe6-8564-5756e640aa5d" (UID: "52bf5aed-b67d-4fe6-8564-5756e640aa5d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 13:05:14 crc kubenswrapper[4816]: I0311 13:05:14.200464 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52bf5aed-b67d-4fe6-8564-5756e640aa5d-kube-api-access-xjzsc" (OuterVolumeSpecName: "kube-api-access-xjzsc") pod "52bf5aed-b67d-4fe6-8564-5756e640aa5d" (UID: "52bf5aed-b67d-4fe6-8564-5756e640aa5d"). InnerVolumeSpecName "kube-api-access-xjzsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 13:05:14 crc kubenswrapper[4816]: I0311 13:05:14.292095 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52bf5aed-b67d-4fe6-8564-5756e640aa5d-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 13:05:14 crc kubenswrapper[4816]: I0311 13:05:14.292189 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjzsc\" (UniqueName: \"kubernetes.io/projected/52bf5aed-b67d-4fe6-8564-5756e640aa5d-kube-api-access-xjzsc\") on node \"crc\" DevicePath \"\"" Mar 11 13:05:14 crc kubenswrapper[4816]: I0311 13:05:14.350445 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52bf5aed-b67d-4fe6-8564-5756e640aa5d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "52bf5aed-b67d-4fe6-8564-5756e640aa5d" (UID: "52bf5aed-b67d-4fe6-8564-5756e640aa5d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 13:05:14 crc kubenswrapper[4816]: I0311 13:05:14.394220 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52bf5aed-b67d-4fe6-8564-5756e640aa5d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 13:05:14 crc kubenswrapper[4816]: I0311 13:05:14.577268 4816 generic.go:334] "Generic (PLEG): container finished" podID="52bf5aed-b67d-4fe6-8564-5756e640aa5d" containerID="6cdfd849eb2bd6ad5fc15276a0cd1417dcdbfb90ed4dda4160f33d2166307fd7" exitCode=0 Mar 11 13:05:14 crc kubenswrapper[4816]: I0311 13:05:14.577284 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtv9r" event={"ID":"52bf5aed-b67d-4fe6-8564-5756e640aa5d","Type":"ContainerDied","Data":"6cdfd849eb2bd6ad5fc15276a0cd1417dcdbfb90ed4dda4160f33d2166307fd7"} Mar 11 13:05:14 crc kubenswrapper[4816]: I0311 13:05:14.578330 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtv9r" event={"ID":"52bf5aed-b67d-4fe6-8564-5756e640aa5d","Type":"ContainerDied","Data":"7dadfe53701c500277eb8a550c826e130375f4f58cab5a05c724a1695adb23b2"} Mar 11 13:05:14 crc kubenswrapper[4816]: I0311 13:05:14.577411 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jtv9r" Mar 11 13:05:14 crc kubenswrapper[4816]: I0311 13:05:14.578398 4816 scope.go:117] "RemoveContainer" containerID="6cdfd849eb2bd6ad5fc15276a0cd1417dcdbfb90ed4dda4160f33d2166307fd7" Mar 11 13:05:14 crc kubenswrapper[4816]: I0311 13:05:14.608081 4816 scope.go:117] "RemoveContainer" containerID="d73d66abd522eb4e8ef558f84ce5c9a47ba7b97c75d5b7458411c4ff70e78569" Mar 11 13:05:14 crc kubenswrapper[4816]: I0311 13:05:14.620394 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jtv9r"] Mar 11 13:05:14 crc kubenswrapper[4816]: I0311 13:05:14.636023 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jtv9r"] Mar 11 13:05:14 crc kubenswrapper[4816]: I0311 13:05:14.645570 4816 scope.go:117] "RemoveContainer" containerID="c3b94d11cef2e239aec27c0d91fa2b19432a2a1cb9bf4dab3a136ed8f9e9ea36" Mar 11 13:05:14 crc kubenswrapper[4816]: I0311 13:05:14.684012 4816 scope.go:117] "RemoveContainer" containerID="6cdfd849eb2bd6ad5fc15276a0cd1417dcdbfb90ed4dda4160f33d2166307fd7" Mar 11 13:05:14 crc kubenswrapper[4816]: E0311 13:05:14.684722 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cdfd849eb2bd6ad5fc15276a0cd1417dcdbfb90ed4dda4160f33d2166307fd7\": container with ID starting with 6cdfd849eb2bd6ad5fc15276a0cd1417dcdbfb90ed4dda4160f33d2166307fd7 not found: ID does not exist" containerID="6cdfd849eb2bd6ad5fc15276a0cd1417dcdbfb90ed4dda4160f33d2166307fd7" Mar 11 13:05:14 crc kubenswrapper[4816]: I0311 13:05:14.684805 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cdfd849eb2bd6ad5fc15276a0cd1417dcdbfb90ed4dda4160f33d2166307fd7"} err="failed to get container status \"6cdfd849eb2bd6ad5fc15276a0cd1417dcdbfb90ed4dda4160f33d2166307fd7\": rpc error: code = NotFound desc = could not find container 
\"6cdfd849eb2bd6ad5fc15276a0cd1417dcdbfb90ed4dda4160f33d2166307fd7\": container with ID starting with 6cdfd849eb2bd6ad5fc15276a0cd1417dcdbfb90ed4dda4160f33d2166307fd7 not found: ID does not exist" Mar 11 13:05:14 crc kubenswrapper[4816]: I0311 13:05:14.684848 4816 scope.go:117] "RemoveContainer" containerID="d73d66abd522eb4e8ef558f84ce5c9a47ba7b97c75d5b7458411c4ff70e78569" Mar 11 13:05:14 crc kubenswrapper[4816]: E0311 13:05:14.685800 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d73d66abd522eb4e8ef558f84ce5c9a47ba7b97c75d5b7458411c4ff70e78569\": container with ID starting with d73d66abd522eb4e8ef558f84ce5c9a47ba7b97c75d5b7458411c4ff70e78569 not found: ID does not exist" containerID="d73d66abd522eb4e8ef558f84ce5c9a47ba7b97c75d5b7458411c4ff70e78569" Mar 11 13:05:14 crc kubenswrapper[4816]: I0311 13:05:14.685839 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d73d66abd522eb4e8ef558f84ce5c9a47ba7b97c75d5b7458411c4ff70e78569"} err="failed to get container status \"d73d66abd522eb4e8ef558f84ce5c9a47ba7b97c75d5b7458411c4ff70e78569\": rpc error: code = NotFound desc = could not find container \"d73d66abd522eb4e8ef558f84ce5c9a47ba7b97c75d5b7458411c4ff70e78569\": container with ID starting with d73d66abd522eb4e8ef558f84ce5c9a47ba7b97c75d5b7458411c4ff70e78569 not found: ID does not exist" Mar 11 13:05:14 crc kubenswrapper[4816]: I0311 13:05:14.685864 4816 scope.go:117] "RemoveContainer" containerID="c3b94d11cef2e239aec27c0d91fa2b19432a2a1cb9bf4dab3a136ed8f9e9ea36" Mar 11 13:05:14 crc kubenswrapper[4816]: E0311 13:05:14.686166 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3b94d11cef2e239aec27c0d91fa2b19432a2a1cb9bf4dab3a136ed8f9e9ea36\": container with ID starting with c3b94d11cef2e239aec27c0d91fa2b19432a2a1cb9bf4dab3a136ed8f9e9ea36 not found: ID does not exist" 
containerID="c3b94d11cef2e239aec27c0d91fa2b19432a2a1cb9bf4dab3a136ed8f9e9ea36" Mar 11 13:05:14 crc kubenswrapper[4816]: I0311 13:05:14.686195 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3b94d11cef2e239aec27c0d91fa2b19432a2a1cb9bf4dab3a136ed8f9e9ea36"} err="failed to get container status \"c3b94d11cef2e239aec27c0d91fa2b19432a2a1cb9bf4dab3a136ed8f9e9ea36\": rpc error: code = NotFound desc = could not find container \"c3b94d11cef2e239aec27c0d91fa2b19432a2a1cb9bf4dab3a136ed8f9e9ea36\": container with ID starting with c3b94d11cef2e239aec27c0d91fa2b19432a2a1cb9bf4dab3a136ed8f9e9ea36 not found: ID does not exist" Mar 11 13:05:16 crc kubenswrapper[4816]: I0311 13:05:16.154105 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52bf5aed-b67d-4fe6-8564-5756e640aa5d" path="/var/lib/kubelet/pods/52bf5aed-b67d-4fe6-8564-5756e640aa5d/volumes" Mar 11 13:06:00 crc kubenswrapper[4816]: I0311 13:06:00.164135 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553906-4wxn9"] Mar 11 13:06:00 crc kubenswrapper[4816]: E0311 13:06:00.165091 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52bf5aed-b67d-4fe6-8564-5756e640aa5d" containerName="extract-content" Mar 11 13:06:00 crc kubenswrapper[4816]: I0311 13:06:00.165107 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="52bf5aed-b67d-4fe6-8564-5756e640aa5d" containerName="extract-content" Mar 11 13:06:00 crc kubenswrapper[4816]: E0311 13:06:00.165137 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52bf5aed-b67d-4fe6-8564-5756e640aa5d" containerName="registry-server" Mar 11 13:06:00 crc kubenswrapper[4816]: I0311 13:06:00.165144 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="52bf5aed-b67d-4fe6-8564-5756e640aa5d" containerName="registry-server" Mar 11 13:06:00 crc kubenswrapper[4816]: E0311 13:06:00.165159 4816 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="52bf5aed-b67d-4fe6-8564-5756e640aa5d" containerName="extract-utilities" Mar 11 13:06:00 crc kubenswrapper[4816]: I0311 13:06:00.165166 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="52bf5aed-b67d-4fe6-8564-5756e640aa5d" containerName="extract-utilities" Mar 11 13:06:00 crc kubenswrapper[4816]: I0311 13:06:00.165332 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="52bf5aed-b67d-4fe6-8564-5756e640aa5d" containerName="registry-server" Mar 11 13:06:00 crc kubenswrapper[4816]: I0311 13:06:00.166020 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553906-4wxn9" Mar 11 13:06:00 crc kubenswrapper[4816]: I0311 13:06:00.169184 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 13:06:00 crc kubenswrapper[4816]: I0311 13:06:00.169597 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 13:06:00 crc kubenswrapper[4816]: I0311 13:06:00.169773 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 13:06:00 crc kubenswrapper[4816]: I0311 13:06:00.170966 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553906-4wxn9"] Mar 11 13:06:00 crc kubenswrapper[4816]: I0311 13:06:00.253152 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64vx2\" (UniqueName: \"kubernetes.io/projected/4566b1c1-9d53-4e1f-8406-ff9c89aaf8cf-kube-api-access-64vx2\") pod \"auto-csr-approver-29553906-4wxn9\" (UID: \"4566b1c1-9d53-4e1f-8406-ff9c89aaf8cf\") " pod="openshift-infra/auto-csr-approver-29553906-4wxn9" Mar 11 13:06:00 crc kubenswrapper[4816]: I0311 13:06:00.354454 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-64vx2\" (UniqueName: \"kubernetes.io/projected/4566b1c1-9d53-4e1f-8406-ff9c89aaf8cf-kube-api-access-64vx2\") pod \"auto-csr-approver-29553906-4wxn9\" (UID: \"4566b1c1-9d53-4e1f-8406-ff9c89aaf8cf\") " pod="openshift-infra/auto-csr-approver-29553906-4wxn9" Mar 11 13:06:00 crc kubenswrapper[4816]: I0311 13:06:00.385468 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64vx2\" (UniqueName: \"kubernetes.io/projected/4566b1c1-9d53-4e1f-8406-ff9c89aaf8cf-kube-api-access-64vx2\") pod \"auto-csr-approver-29553906-4wxn9\" (UID: \"4566b1c1-9d53-4e1f-8406-ff9c89aaf8cf\") " pod="openshift-infra/auto-csr-approver-29553906-4wxn9" Mar 11 13:06:00 crc kubenswrapper[4816]: I0311 13:06:00.515046 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553906-4wxn9" Mar 11 13:06:00 crc kubenswrapper[4816]: I0311 13:06:00.813631 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553906-4wxn9"] Mar 11 13:06:01 crc kubenswrapper[4816]: I0311 13:06:01.647146 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553906-4wxn9" event={"ID":"4566b1c1-9d53-4e1f-8406-ff9c89aaf8cf","Type":"ContainerStarted","Data":"3e7f4363ecc635bb29a64b19cb4b42e426db73749bd9c0629fc81823afe7412f"} Mar 11 13:06:02 crc kubenswrapper[4816]: I0311 13:06:02.656496 4816 generic.go:334] "Generic (PLEG): container finished" podID="4566b1c1-9d53-4e1f-8406-ff9c89aaf8cf" containerID="3049a692892071c6574e8ee18347abb47ed4c1ed532d21f9dde8bcb07555460f" exitCode=0 Mar 11 13:06:02 crc kubenswrapper[4816]: I0311 13:06:02.656572 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553906-4wxn9" event={"ID":"4566b1c1-9d53-4e1f-8406-ff9c89aaf8cf","Type":"ContainerDied","Data":"3049a692892071c6574e8ee18347abb47ed4c1ed532d21f9dde8bcb07555460f"} Mar 11 13:06:04 crc kubenswrapper[4816]: I0311 
13:06:04.009110 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553906-4wxn9" Mar 11 13:06:04 crc kubenswrapper[4816]: I0311 13:06:04.116444 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64vx2\" (UniqueName: \"kubernetes.io/projected/4566b1c1-9d53-4e1f-8406-ff9c89aaf8cf-kube-api-access-64vx2\") pod \"4566b1c1-9d53-4e1f-8406-ff9c89aaf8cf\" (UID: \"4566b1c1-9d53-4e1f-8406-ff9c89aaf8cf\") " Mar 11 13:06:04 crc kubenswrapper[4816]: I0311 13:06:04.126719 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4566b1c1-9d53-4e1f-8406-ff9c89aaf8cf-kube-api-access-64vx2" (OuterVolumeSpecName: "kube-api-access-64vx2") pod "4566b1c1-9d53-4e1f-8406-ff9c89aaf8cf" (UID: "4566b1c1-9d53-4e1f-8406-ff9c89aaf8cf"). InnerVolumeSpecName "kube-api-access-64vx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 13:06:04 crc kubenswrapper[4816]: I0311 13:06:04.219686 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64vx2\" (UniqueName: \"kubernetes.io/projected/4566b1c1-9d53-4e1f-8406-ff9c89aaf8cf-kube-api-access-64vx2\") on node \"crc\" DevicePath \"\"" Mar 11 13:06:04 crc kubenswrapper[4816]: I0311 13:06:04.676687 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553906-4wxn9" event={"ID":"4566b1c1-9d53-4e1f-8406-ff9c89aaf8cf","Type":"ContainerDied","Data":"3e7f4363ecc635bb29a64b19cb4b42e426db73749bd9c0629fc81823afe7412f"} Mar 11 13:06:04 crc kubenswrapper[4816]: I0311 13:06:04.676744 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e7f4363ecc635bb29a64b19cb4b42e426db73749bd9c0629fc81823afe7412f" Mar 11 13:06:04 crc kubenswrapper[4816]: I0311 13:06:04.676816 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553906-4wxn9" Mar 11 13:06:05 crc kubenswrapper[4816]: I0311 13:06:05.120552 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553900-h76nw"] Mar 11 13:06:05 crc kubenswrapper[4816]: I0311 13:06:05.133093 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553900-h76nw"] Mar 11 13:06:06 crc kubenswrapper[4816]: I0311 13:06:06.141844 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d463a78-830d-4b86-830a-e70345993927" path="/var/lib/kubelet/pods/9d463a78-830d-4b86-830a-e70345993927/volumes" Mar 11 13:06:49 crc kubenswrapper[4816]: I0311 13:06:49.547770 4816 scope.go:117] "RemoveContainer" containerID="02f79ceb28719ec9aa00f051068012e5f7850ccf8b02f5d8f4ecbb73c01a94f5" Mar 11 13:07:09 crc kubenswrapper[4816]: I0311 13:07:09.514895 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 13:07:09 crc kubenswrapper[4816]: I0311 13:07:09.515807 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 13:07:39 crc kubenswrapper[4816]: I0311 13:07:39.515339 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 13:07:39 crc kubenswrapper[4816]: 
I0311 13:07:39.516279 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 13:07:42 crc kubenswrapper[4816]: I0311 13:07:42.714914 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sxvrq"] Mar 11 13:07:42 crc kubenswrapper[4816]: E0311 13:07:42.716232 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4566b1c1-9d53-4e1f-8406-ff9c89aaf8cf" containerName="oc" Mar 11 13:07:42 crc kubenswrapper[4816]: I0311 13:07:42.716269 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="4566b1c1-9d53-4e1f-8406-ff9c89aaf8cf" containerName="oc" Mar 11 13:07:42 crc kubenswrapper[4816]: I0311 13:07:42.716650 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="4566b1c1-9d53-4e1f-8406-ff9c89aaf8cf" containerName="oc" Mar 11 13:07:42 crc kubenswrapper[4816]: I0311 13:07:42.719182 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sxvrq" Mar 11 13:07:42 crc kubenswrapper[4816]: I0311 13:07:42.728065 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sxvrq"] Mar 11 13:07:42 crc kubenswrapper[4816]: I0311 13:07:42.815925 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxqlq\" (UniqueName: \"kubernetes.io/projected/eff46de2-0d75-4bb7-a269-d703a8621c8e-kube-api-access-vxqlq\") pod \"community-operators-sxvrq\" (UID: \"eff46de2-0d75-4bb7-a269-d703a8621c8e\") " pod="openshift-marketplace/community-operators-sxvrq" Mar 11 13:07:42 crc kubenswrapper[4816]: I0311 13:07:42.816001 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eff46de2-0d75-4bb7-a269-d703a8621c8e-utilities\") pod \"community-operators-sxvrq\" (UID: \"eff46de2-0d75-4bb7-a269-d703a8621c8e\") " pod="openshift-marketplace/community-operators-sxvrq" Mar 11 13:07:42 crc kubenswrapper[4816]: I0311 13:07:42.817648 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eff46de2-0d75-4bb7-a269-d703a8621c8e-catalog-content\") pod \"community-operators-sxvrq\" (UID: \"eff46de2-0d75-4bb7-a269-d703a8621c8e\") " pod="openshift-marketplace/community-operators-sxvrq" Mar 11 13:07:42 crc kubenswrapper[4816]: I0311 13:07:42.919495 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eff46de2-0d75-4bb7-a269-d703a8621c8e-catalog-content\") pod \"community-operators-sxvrq\" (UID: \"eff46de2-0d75-4bb7-a269-d703a8621c8e\") " pod="openshift-marketplace/community-operators-sxvrq" Mar 11 13:07:42 crc kubenswrapper[4816]: I0311 13:07:42.919591 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vxqlq\" (UniqueName: \"kubernetes.io/projected/eff46de2-0d75-4bb7-a269-d703a8621c8e-kube-api-access-vxqlq\") pod \"community-operators-sxvrq\" (UID: \"eff46de2-0d75-4bb7-a269-d703a8621c8e\") " pod="openshift-marketplace/community-operators-sxvrq" Mar 11 13:07:42 crc kubenswrapper[4816]: I0311 13:07:42.919642 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eff46de2-0d75-4bb7-a269-d703a8621c8e-utilities\") pod \"community-operators-sxvrq\" (UID: \"eff46de2-0d75-4bb7-a269-d703a8621c8e\") " pod="openshift-marketplace/community-operators-sxvrq" Mar 11 13:07:42 crc kubenswrapper[4816]: I0311 13:07:42.920474 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eff46de2-0d75-4bb7-a269-d703a8621c8e-utilities\") pod \"community-operators-sxvrq\" (UID: \"eff46de2-0d75-4bb7-a269-d703a8621c8e\") " pod="openshift-marketplace/community-operators-sxvrq" Mar 11 13:07:42 crc kubenswrapper[4816]: I0311 13:07:42.920477 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eff46de2-0d75-4bb7-a269-d703a8621c8e-catalog-content\") pod \"community-operators-sxvrq\" (UID: \"eff46de2-0d75-4bb7-a269-d703a8621c8e\") " pod="openshift-marketplace/community-operators-sxvrq" Mar 11 13:07:42 crc kubenswrapper[4816]: I0311 13:07:42.942555 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxqlq\" (UniqueName: \"kubernetes.io/projected/eff46de2-0d75-4bb7-a269-d703a8621c8e-kube-api-access-vxqlq\") pod \"community-operators-sxvrq\" (UID: \"eff46de2-0d75-4bb7-a269-d703a8621c8e\") " pod="openshift-marketplace/community-operators-sxvrq" Mar 11 13:07:43 crc kubenswrapper[4816]: I0311 13:07:43.061744 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sxvrq" Mar 11 13:07:43 crc kubenswrapper[4816]: I0311 13:07:43.465978 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sxvrq"] Mar 11 13:07:43 crc kubenswrapper[4816]: I0311 13:07:43.647371 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxvrq" event={"ID":"eff46de2-0d75-4bb7-a269-d703a8621c8e","Type":"ContainerStarted","Data":"cad263e674abfaddb7974f4e9fbd5b072dec296365d06831a0d43b45bf2cbfe3"} Mar 11 13:07:43 crc kubenswrapper[4816]: I0311 13:07:43.647968 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxvrq" event={"ID":"eff46de2-0d75-4bb7-a269-d703a8621c8e","Type":"ContainerStarted","Data":"77d681d03c90044ec1e22b8547e0a03ed16a8ae23c9744a79518e8bb6fc6b1ff"} Mar 11 13:07:44 crc kubenswrapper[4816]: I0311 13:07:44.669885 4816 generic.go:334] "Generic (PLEG): container finished" podID="eff46de2-0d75-4bb7-a269-d703a8621c8e" containerID="cad263e674abfaddb7974f4e9fbd5b072dec296365d06831a0d43b45bf2cbfe3" exitCode=0 Mar 11 13:07:44 crc kubenswrapper[4816]: I0311 13:07:44.669956 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxvrq" event={"ID":"eff46de2-0d75-4bb7-a269-d703a8621c8e","Type":"ContainerDied","Data":"cad263e674abfaddb7974f4e9fbd5b072dec296365d06831a0d43b45bf2cbfe3"} Mar 11 13:07:46 crc kubenswrapper[4816]: I0311 13:07:46.698469 4816 generic.go:334] "Generic (PLEG): container finished" podID="eff46de2-0d75-4bb7-a269-d703a8621c8e" containerID="a5f903c936c84171813a5fa2e6d049fb8022544e414bfe1fd09a9d584785d869" exitCode=0 Mar 11 13:07:46 crc kubenswrapper[4816]: I0311 13:07:46.698549 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxvrq" 
event={"ID":"eff46de2-0d75-4bb7-a269-d703a8621c8e","Type":"ContainerDied","Data":"a5f903c936c84171813a5fa2e6d049fb8022544e414bfe1fd09a9d584785d869"} Mar 11 13:07:47 crc kubenswrapper[4816]: I0311 13:07:47.711276 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxvrq" event={"ID":"eff46de2-0d75-4bb7-a269-d703a8621c8e","Type":"ContainerStarted","Data":"b0c2bd7ab4e927fad91dd806abc61894a6708832ad4f1fbbfdfaf08bcb49b3a6"} Mar 11 13:07:47 crc kubenswrapper[4816]: I0311 13:07:47.749115 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sxvrq" podStartSLOduration=3.279799417 podStartE2EDuration="5.749080258s" podCreationTimestamp="2026-03-11 13:07:42 +0000 UTC" firstStartedPulling="2026-03-11 13:07:44.675092027 +0000 UTC m=+4151.266356034" lastFinishedPulling="2026-03-11 13:07:47.144372868 +0000 UTC m=+4153.735636875" observedRunningTime="2026-03-11 13:07:47.738561598 +0000 UTC m=+4154.329825595" watchObservedRunningTime="2026-03-11 13:07:47.749080258 +0000 UTC m=+4154.340344235" Mar 11 13:07:53 crc kubenswrapper[4816]: I0311 13:07:53.061930 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sxvrq" Mar 11 13:07:53 crc kubenswrapper[4816]: I0311 13:07:53.063000 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sxvrq" Mar 11 13:07:53 crc kubenswrapper[4816]: I0311 13:07:53.114436 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sxvrq" Mar 11 13:07:53 crc kubenswrapper[4816]: I0311 13:07:53.824087 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sxvrq" Mar 11 13:07:53 crc kubenswrapper[4816]: I0311 13:07:53.895367 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-sxvrq"] Mar 11 13:07:55 crc kubenswrapper[4816]: I0311 13:07:55.786694 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sxvrq" podUID="eff46de2-0d75-4bb7-a269-d703a8621c8e" containerName="registry-server" containerID="cri-o://b0c2bd7ab4e927fad91dd806abc61894a6708832ad4f1fbbfdfaf08bcb49b3a6" gracePeriod=2 Mar 11 13:07:56 crc kubenswrapper[4816]: I0311 13:07:56.282347 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sxvrq" Mar 11 13:07:56 crc kubenswrapper[4816]: I0311 13:07:56.450519 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxqlq\" (UniqueName: \"kubernetes.io/projected/eff46de2-0d75-4bb7-a269-d703a8621c8e-kube-api-access-vxqlq\") pod \"eff46de2-0d75-4bb7-a269-d703a8621c8e\" (UID: \"eff46de2-0d75-4bb7-a269-d703a8621c8e\") " Mar 11 13:07:56 crc kubenswrapper[4816]: I0311 13:07:56.450587 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eff46de2-0d75-4bb7-a269-d703a8621c8e-utilities\") pod \"eff46de2-0d75-4bb7-a269-d703a8621c8e\" (UID: \"eff46de2-0d75-4bb7-a269-d703a8621c8e\") " Mar 11 13:07:56 crc kubenswrapper[4816]: I0311 13:07:56.450715 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eff46de2-0d75-4bb7-a269-d703a8621c8e-catalog-content\") pod \"eff46de2-0d75-4bb7-a269-d703a8621c8e\" (UID: \"eff46de2-0d75-4bb7-a269-d703a8621c8e\") " Mar 11 13:07:56 crc kubenswrapper[4816]: I0311 13:07:56.451732 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eff46de2-0d75-4bb7-a269-d703a8621c8e-utilities" (OuterVolumeSpecName: "utilities") pod "eff46de2-0d75-4bb7-a269-d703a8621c8e" (UID: 
"eff46de2-0d75-4bb7-a269-d703a8621c8e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 13:07:56 crc kubenswrapper[4816]: I0311 13:07:56.459122 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eff46de2-0d75-4bb7-a269-d703a8621c8e-kube-api-access-vxqlq" (OuterVolumeSpecName: "kube-api-access-vxqlq") pod "eff46de2-0d75-4bb7-a269-d703a8621c8e" (UID: "eff46de2-0d75-4bb7-a269-d703a8621c8e"). InnerVolumeSpecName "kube-api-access-vxqlq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 13:07:56 crc kubenswrapper[4816]: I0311 13:07:56.553339 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxqlq\" (UniqueName: \"kubernetes.io/projected/eff46de2-0d75-4bb7-a269-d703a8621c8e-kube-api-access-vxqlq\") on node \"crc\" DevicePath \"\"" Mar 11 13:07:56 crc kubenswrapper[4816]: I0311 13:07:56.553838 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eff46de2-0d75-4bb7-a269-d703a8621c8e-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 13:07:56 crc kubenswrapper[4816]: I0311 13:07:56.746535 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eff46de2-0d75-4bb7-a269-d703a8621c8e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eff46de2-0d75-4bb7-a269-d703a8621c8e" (UID: "eff46de2-0d75-4bb7-a269-d703a8621c8e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 13:07:56 crc kubenswrapper[4816]: I0311 13:07:56.757478 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eff46de2-0d75-4bb7-a269-d703a8621c8e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 13:07:56 crc kubenswrapper[4816]: I0311 13:07:56.803467 4816 generic.go:334] "Generic (PLEG): container finished" podID="eff46de2-0d75-4bb7-a269-d703a8621c8e" containerID="b0c2bd7ab4e927fad91dd806abc61894a6708832ad4f1fbbfdfaf08bcb49b3a6" exitCode=0 Mar 11 13:07:56 crc kubenswrapper[4816]: I0311 13:07:56.803619 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sxvrq" Mar 11 13:07:56 crc kubenswrapper[4816]: I0311 13:07:56.803618 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxvrq" event={"ID":"eff46de2-0d75-4bb7-a269-d703a8621c8e","Type":"ContainerDied","Data":"b0c2bd7ab4e927fad91dd806abc61894a6708832ad4f1fbbfdfaf08bcb49b3a6"} Mar 11 13:07:56 crc kubenswrapper[4816]: I0311 13:07:56.805484 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxvrq" event={"ID":"eff46de2-0d75-4bb7-a269-d703a8621c8e","Type":"ContainerDied","Data":"77d681d03c90044ec1e22b8547e0a03ed16a8ae23c9744a79518e8bb6fc6b1ff"} Mar 11 13:07:56 crc kubenswrapper[4816]: I0311 13:07:56.805540 4816 scope.go:117] "RemoveContainer" containerID="b0c2bd7ab4e927fad91dd806abc61894a6708832ad4f1fbbfdfaf08bcb49b3a6" Mar 11 13:07:56 crc kubenswrapper[4816]: I0311 13:07:56.845400 4816 scope.go:117] "RemoveContainer" containerID="a5f903c936c84171813a5fa2e6d049fb8022544e414bfe1fd09a9d584785d869" Mar 11 13:07:56 crc kubenswrapper[4816]: I0311 13:07:56.881513 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sxvrq"] Mar 11 13:07:56 crc kubenswrapper[4816]: 
I0311 13:07:56.891323 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sxvrq"] Mar 11 13:07:56 crc kubenswrapper[4816]: I0311 13:07:56.894293 4816 scope.go:117] "RemoveContainer" containerID="cad263e674abfaddb7974f4e9fbd5b072dec296365d06831a0d43b45bf2cbfe3" Mar 11 13:07:56 crc kubenswrapper[4816]: I0311 13:07:56.924522 4816 scope.go:117] "RemoveContainer" containerID="b0c2bd7ab4e927fad91dd806abc61894a6708832ad4f1fbbfdfaf08bcb49b3a6" Mar 11 13:07:56 crc kubenswrapper[4816]: E0311 13:07:56.925803 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0c2bd7ab4e927fad91dd806abc61894a6708832ad4f1fbbfdfaf08bcb49b3a6\": container with ID starting with b0c2bd7ab4e927fad91dd806abc61894a6708832ad4f1fbbfdfaf08bcb49b3a6 not found: ID does not exist" containerID="b0c2bd7ab4e927fad91dd806abc61894a6708832ad4f1fbbfdfaf08bcb49b3a6" Mar 11 13:07:56 crc kubenswrapper[4816]: I0311 13:07:56.925849 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0c2bd7ab4e927fad91dd806abc61894a6708832ad4f1fbbfdfaf08bcb49b3a6"} err="failed to get container status \"b0c2bd7ab4e927fad91dd806abc61894a6708832ad4f1fbbfdfaf08bcb49b3a6\": rpc error: code = NotFound desc = could not find container \"b0c2bd7ab4e927fad91dd806abc61894a6708832ad4f1fbbfdfaf08bcb49b3a6\": container with ID starting with b0c2bd7ab4e927fad91dd806abc61894a6708832ad4f1fbbfdfaf08bcb49b3a6 not found: ID does not exist" Mar 11 13:07:56 crc kubenswrapper[4816]: I0311 13:07:56.925884 4816 scope.go:117] "RemoveContainer" containerID="a5f903c936c84171813a5fa2e6d049fb8022544e414bfe1fd09a9d584785d869" Mar 11 13:07:56 crc kubenswrapper[4816]: E0311 13:07:56.926658 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5f903c936c84171813a5fa2e6d049fb8022544e414bfe1fd09a9d584785d869\": container 
with ID starting with a5f903c936c84171813a5fa2e6d049fb8022544e414bfe1fd09a9d584785d869 not found: ID does not exist" containerID="a5f903c936c84171813a5fa2e6d049fb8022544e414bfe1fd09a9d584785d869" Mar 11 13:07:56 crc kubenswrapper[4816]: I0311 13:07:56.926736 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5f903c936c84171813a5fa2e6d049fb8022544e414bfe1fd09a9d584785d869"} err="failed to get container status \"a5f903c936c84171813a5fa2e6d049fb8022544e414bfe1fd09a9d584785d869\": rpc error: code = NotFound desc = could not find container \"a5f903c936c84171813a5fa2e6d049fb8022544e414bfe1fd09a9d584785d869\": container with ID starting with a5f903c936c84171813a5fa2e6d049fb8022544e414bfe1fd09a9d584785d869 not found: ID does not exist" Mar 11 13:07:56 crc kubenswrapper[4816]: I0311 13:07:56.926790 4816 scope.go:117] "RemoveContainer" containerID="cad263e674abfaddb7974f4e9fbd5b072dec296365d06831a0d43b45bf2cbfe3" Mar 11 13:07:56 crc kubenswrapper[4816]: E0311 13:07:56.927471 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cad263e674abfaddb7974f4e9fbd5b072dec296365d06831a0d43b45bf2cbfe3\": container with ID starting with cad263e674abfaddb7974f4e9fbd5b072dec296365d06831a0d43b45bf2cbfe3 not found: ID does not exist" containerID="cad263e674abfaddb7974f4e9fbd5b072dec296365d06831a0d43b45bf2cbfe3" Mar 11 13:07:56 crc kubenswrapper[4816]: I0311 13:07:56.927512 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cad263e674abfaddb7974f4e9fbd5b072dec296365d06831a0d43b45bf2cbfe3"} err="failed to get container status \"cad263e674abfaddb7974f4e9fbd5b072dec296365d06831a0d43b45bf2cbfe3\": rpc error: code = NotFound desc = could not find container \"cad263e674abfaddb7974f4e9fbd5b072dec296365d06831a0d43b45bf2cbfe3\": container with ID starting with cad263e674abfaddb7974f4e9fbd5b072dec296365d06831a0d43b45bf2cbfe3 not 
found: ID does not exist" Mar 11 13:07:58 crc kubenswrapper[4816]: I0311 13:07:58.148934 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eff46de2-0d75-4bb7-a269-d703a8621c8e" path="/var/lib/kubelet/pods/eff46de2-0d75-4bb7-a269-d703a8621c8e/volumes" Mar 11 13:08:00 crc kubenswrapper[4816]: I0311 13:08:00.173930 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553908-7sdtd"] Mar 11 13:08:00 crc kubenswrapper[4816]: E0311 13:08:00.174985 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eff46de2-0d75-4bb7-a269-d703a8621c8e" containerName="extract-utilities" Mar 11 13:08:00 crc kubenswrapper[4816]: I0311 13:08:00.175011 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="eff46de2-0d75-4bb7-a269-d703a8621c8e" containerName="extract-utilities" Mar 11 13:08:00 crc kubenswrapper[4816]: E0311 13:08:00.175039 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eff46de2-0d75-4bb7-a269-d703a8621c8e" containerName="registry-server" Mar 11 13:08:00 crc kubenswrapper[4816]: I0311 13:08:00.175050 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="eff46de2-0d75-4bb7-a269-d703a8621c8e" containerName="registry-server" Mar 11 13:08:00 crc kubenswrapper[4816]: E0311 13:08:00.175071 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eff46de2-0d75-4bb7-a269-d703a8621c8e" containerName="extract-content" Mar 11 13:08:00 crc kubenswrapper[4816]: I0311 13:08:00.175083 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="eff46de2-0d75-4bb7-a269-d703a8621c8e" containerName="extract-content" Mar 11 13:08:00 crc kubenswrapper[4816]: I0311 13:08:00.175369 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="eff46de2-0d75-4bb7-a269-d703a8621c8e" containerName="registry-server" Mar 11 13:08:00 crc kubenswrapper[4816]: I0311 13:08:00.176174 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553908-7sdtd" Mar 11 13:08:00 crc kubenswrapper[4816]: I0311 13:08:00.181996 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 13:08:00 crc kubenswrapper[4816]: I0311 13:08:00.182310 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 13:08:00 crc kubenswrapper[4816]: I0311 13:08:00.182476 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 13:08:00 crc kubenswrapper[4816]: I0311 13:08:00.182124 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553908-7sdtd"] Mar 11 13:08:00 crc kubenswrapper[4816]: I0311 13:08:00.326014 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbmvv\" (UniqueName: \"kubernetes.io/projected/9dc717b0-ae5a-46c4-9fea-48dcab17a9c8-kube-api-access-sbmvv\") pod \"auto-csr-approver-29553908-7sdtd\" (UID: \"9dc717b0-ae5a-46c4-9fea-48dcab17a9c8\") " pod="openshift-infra/auto-csr-approver-29553908-7sdtd" Mar 11 13:08:00 crc kubenswrapper[4816]: I0311 13:08:00.428172 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbmvv\" (UniqueName: \"kubernetes.io/projected/9dc717b0-ae5a-46c4-9fea-48dcab17a9c8-kube-api-access-sbmvv\") pod \"auto-csr-approver-29553908-7sdtd\" (UID: \"9dc717b0-ae5a-46c4-9fea-48dcab17a9c8\") " pod="openshift-infra/auto-csr-approver-29553908-7sdtd" Mar 11 13:08:00 crc kubenswrapper[4816]: I0311 13:08:00.459970 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbmvv\" (UniqueName: \"kubernetes.io/projected/9dc717b0-ae5a-46c4-9fea-48dcab17a9c8-kube-api-access-sbmvv\") pod \"auto-csr-approver-29553908-7sdtd\" (UID: \"9dc717b0-ae5a-46c4-9fea-48dcab17a9c8\") " 
pod="openshift-infra/auto-csr-approver-29553908-7sdtd" Mar 11 13:08:00 crc kubenswrapper[4816]: I0311 13:08:00.521732 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553908-7sdtd" Mar 11 13:08:00 crc kubenswrapper[4816]: I0311 13:08:00.801495 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553908-7sdtd"] Mar 11 13:08:00 crc kubenswrapper[4816]: W0311 13:08:00.812609 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9dc717b0_ae5a_46c4_9fea_48dcab17a9c8.slice/crio-4b2a804e3091f1bc1028213099574573fabfe400673c93ed97d0c58eed843d2d WatchSource:0}: Error finding container 4b2a804e3091f1bc1028213099574573fabfe400673c93ed97d0c58eed843d2d: Status 404 returned error can't find the container with id 4b2a804e3091f1bc1028213099574573fabfe400673c93ed97d0c58eed843d2d Mar 11 13:08:00 crc kubenswrapper[4816]: I0311 13:08:00.855459 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553908-7sdtd" event={"ID":"9dc717b0-ae5a-46c4-9fea-48dcab17a9c8","Type":"ContainerStarted","Data":"4b2a804e3091f1bc1028213099574573fabfe400673c93ed97d0c58eed843d2d"} Mar 11 13:08:02 crc kubenswrapper[4816]: I0311 13:08:02.878633 4816 generic.go:334] "Generic (PLEG): container finished" podID="9dc717b0-ae5a-46c4-9fea-48dcab17a9c8" containerID="8f956a9fd47ed00082f55e7f0e6d344e63382a72a835af03fd720051a5e8b801" exitCode=0 Mar 11 13:08:02 crc kubenswrapper[4816]: I0311 13:08:02.878729 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553908-7sdtd" event={"ID":"9dc717b0-ae5a-46c4-9fea-48dcab17a9c8","Type":"ContainerDied","Data":"8f956a9fd47ed00082f55e7f0e6d344e63382a72a835af03fd720051a5e8b801"} Mar 11 13:08:04 crc kubenswrapper[4816]: I0311 13:08:04.352497 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553908-7sdtd" Mar 11 13:08:04 crc kubenswrapper[4816]: I0311 13:08:04.505805 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbmvv\" (UniqueName: \"kubernetes.io/projected/9dc717b0-ae5a-46c4-9fea-48dcab17a9c8-kube-api-access-sbmvv\") pod \"9dc717b0-ae5a-46c4-9fea-48dcab17a9c8\" (UID: \"9dc717b0-ae5a-46c4-9fea-48dcab17a9c8\") " Mar 11 13:08:04 crc kubenswrapper[4816]: I0311 13:08:04.514332 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dc717b0-ae5a-46c4-9fea-48dcab17a9c8-kube-api-access-sbmvv" (OuterVolumeSpecName: "kube-api-access-sbmvv") pod "9dc717b0-ae5a-46c4-9fea-48dcab17a9c8" (UID: "9dc717b0-ae5a-46c4-9fea-48dcab17a9c8"). InnerVolumeSpecName "kube-api-access-sbmvv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 13:08:04 crc kubenswrapper[4816]: I0311 13:08:04.608699 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbmvv\" (UniqueName: \"kubernetes.io/projected/9dc717b0-ae5a-46c4-9fea-48dcab17a9c8-kube-api-access-sbmvv\") on node \"crc\" DevicePath \"\"" Mar 11 13:08:04 crc kubenswrapper[4816]: I0311 13:08:04.904340 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553908-7sdtd" event={"ID":"9dc717b0-ae5a-46c4-9fea-48dcab17a9c8","Type":"ContainerDied","Data":"4b2a804e3091f1bc1028213099574573fabfe400673c93ed97d0c58eed843d2d"} Mar 11 13:08:04 crc kubenswrapper[4816]: I0311 13:08:04.904406 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b2a804e3091f1bc1028213099574573fabfe400673c93ed97d0c58eed843d2d" Mar 11 13:08:04 crc kubenswrapper[4816]: I0311 13:08:04.904461 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553908-7sdtd" Mar 11 13:08:05 crc kubenswrapper[4816]: I0311 13:08:05.464172 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553902-lffcf"] Mar 11 13:08:05 crc kubenswrapper[4816]: I0311 13:08:05.476182 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553902-lffcf"] Mar 11 13:08:06 crc kubenswrapper[4816]: I0311 13:08:06.143120 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a325766-41a7-415f-88ad-698627f015c1" path="/var/lib/kubelet/pods/6a325766-41a7-415f-88ad-698627f015c1/volumes" Mar 11 13:08:09 crc kubenswrapper[4816]: I0311 13:08:09.515376 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 13:08:09 crc kubenswrapper[4816]: I0311 13:08:09.517314 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 13:08:09 crc kubenswrapper[4816]: I0311 13:08:09.517543 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 13:08:09 crc kubenswrapper[4816]: I0311 13:08:09.518724 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"93c58f402ba486c6c006c97994ab202bfd22495eff9729c60fbdfcbe918d3c5f"} pod="openshift-machine-config-operator/machine-config-daemon-b4v82" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 13:08:09 crc kubenswrapper[4816]: I0311 13:08:09.519163 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" containerID="cri-o://93c58f402ba486c6c006c97994ab202bfd22495eff9729c60fbdfcbe918d3c5f" gracePeriod=600 Mar 11 13:08:10 crc kubenswrapper[4816]: I0311 13:08:10.002341 4816 generic.go:334] "Generic (PLEG): container finished" podID="7fdff21c-644f-4443-a268-f98c91ea120a" containerID="93c58f402ba486c6c006c97994ab202bfd22495eff9729c60fbdfcbe918d3c5f" exitCode=0 Mar 11 13:08:10 crc kubenswrapper[4816]: I0311 13:08:10.002465 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerDied","Data":"93c58f402ba486c6c006c97994ab202bfd22495eff9729c60fbdfcbe918d3c5f"} Mar 11 13:08:10 crc kubenswrapper[4816]: I0311 13:08:10.002978 4816 scope.go:117] "RemoveContainer" containerID="64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5" Mar 11 13:08:11 crc kubenswrapper[4816]: I0311 13:08:11.022300 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerStarted","Data":"feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e"} Mar 11 13:08:49 crc kubenswrapper[4816]: I0311 13:08:49.655394 4816 scope.go:117] "RemoveContainer" containerID="6b272c0cd2cf4fb57145d2f34bc9f76d7316747da7af06ee61d93b20ed09cce5" Mar 11 13:09:06 crc kubenswrapper[4816]: I0311 13:09:06.971867 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4vkqz"] Mar 11 13:09:06 crc kubenswrapper[4816]: E0311 
13:09:06.972959 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dc717b0-ae5a-46c4-9fea-48dcab17a9c8" containerName="oc" Mar 11 13:09:06 crc kubenswrapper[4816]: I0311 13:09:06.972977 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dc717b0-ae5a-46c4-9fea-48dcab17a9c8" containerName="oc" Mar 11 13:09:06 crc kubenswrapper[4816]: I0311 13:09:06.973172 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dc717b0-ae5a-46c4-9fea-48dcab17a9c8" containerName="oc" Mar 11 13:09:06 crc kubenswrapper[4816]: I0311 13:09:06.974604 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4vkqz" Mar 11 13:09:06 crc kubenswrapper[4816]: I0311 13:09:06.989850 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4vkqz"] Mar 11 13:09:07 crc kubenswrapper[4816]: I0311 13:09:07.175749 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc2b0115-a0c2-49f3-b371-41dab6a785d9-catalog-content\") pod \"redhat-marketplace-4vkqz\" (UID: \"bc2b0115-a0c2-49f3-b371-41dab6a785d9\") " pod="openshift-marketplace/redhat-marketplace-4vkqz" Mar 11 13:09:07 crc kubenswrapper[4816]: I0311 13:09:07.175846 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc2b0115-a0c2-49f3-b371-41dab6a785d9-utilities\") pod \"redhat-marketplace-4vkqz\" (UID: \"bc2b0115-a0c2-49f3-b371-41dab6a785d9\") " pod="openshift-marketplace/redhat-marketplace-4vkqz" Mar 11 13:09:07 crc kubenswrapper[4816]: I0311 13:09:07.175880 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4jnt\" (UniqueName: \"kubernetes.io/projected/bc2b0115-a0c2-49f3-b371-41dab6a785d9-kube-api-access-p4jnt\") pod 
\"redhat-marketplace-4vkqz\" (UID: \"bc2b0115-a0c2-49f3-b371-41dab6a785d9\") " pod="openshift-marketplace/redhat-marketplace-4vkqz" Mar 11 13:09:07 crc kubenswrapper[4816]: I0311 13:09:07.277683 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc2b0115-a0c2-49f3-b371-41dab6a785d9-catalog-content\") pod \"redhat-marketplace-4vkqz\" (UID: \"bc2b0115-a0c2-49f3-b371-41dab6a785d9\") " pod="openshift-marketplace/redhat-marketplace-4vkqz" Mar 11 13:09:07 crc kubenswrapper[4816]: I0311 13:09:07.277809 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc2b0115-a0c2-49f3-b371-41dab6a785d9-utilities\") pod \"redhat-marketplace-4vkqz\" (UID: \"bc2b0115-a0c2-49f3-b371-41dab6a785d9\") " pod="openshift-marketplace/redhat-marketplace-4vkqz" Mar 11 13:09:07 crc kubenswrapper[4816]: I0311 13:09:07.277863 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4jnt\" (UniqueName: \"kubernetes.io/projected/bc2b0115-a0c2-49f3-b371-41dab6a785d9-kube-api-access-p4jnt\") pod \"redhat-marketplace-4vkqz\" (UID: \"bc2b0115-a0c2-49f3-b371-41dab6a785d9\") " pod="openshift-marketplace/redhat-marketplace-4vkqz" Mar 11 13:09:07 crc kubenswrapper[4816]: I0311 13:09:07.278679 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc2b0115-a0c2-49f3-b371-41dab6a785d9-catalog-content\") pod \"redhat-marketplace-4vkqz\" (UID: \"bc2b0115-a0c2-49f3-b371-41dab6a785d9\") " pod="openshift-marketplace/redhat-marketplace-4vkqz" Mar 11 13:09:07 crc kubenswrapper[4816]: I0311 13:09:07.278738 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc2b0115-a0c2-49f3-b371-41dab6a785d9-utilities\") pod \"redhat-marketplace-4vkqz\" (UID: 
\"bc2b0115-a0c2-49f3-b371-41dab6a785d9\") " pod="openshift-marketplace/redhat-marketplace-4vkqz" Mar 11 13:09:07 crc kubenswrapper[4816]: I0311 13:09:07.302704 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4jnt\" (UniqueName: \"kubernetes.io/projected/bc2b0115-a0c2-49f3-b371-41dab6a785d9-kube-api-access-p4jnt\") pod \"redhat-marketplace-4vkqz\" (UID: \"bc2b0115-a0c2-49f3-b371-41dab6a785d9\") " pod="openshift-marketplace/redhat-marketplace-4vkqz" Mar 11 13:09:07 crc kubenswrapper[4816]: I0311 13:09:07.313776 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4vkqz" Mar 11 13:09:07 crc kubenswrapper[4816]: I0311 13:09:07.775992 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4vkqz"] Mar 11 13:09:08 crc kubenswrapper[4816]: I0311 13:09:08.666500 4816 generic.go:334] "Generic (PLEG): container finished" podID="bc2b0115-a0c2-49f3-b371-41dab6a785d9" containerID="e2897f156738485f6534dccbb57309b212ce8b23650fd9d0fa77ee31641e76d7" exitCode=0 Mar 11 13:09:08 crc kubenswrapper[4816]: I0311 13:09:08.666606 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4vkqz" event={"ID":"bc2b0115-a0c2-49f3-b371-41dab6a785d9","Type":"ContainerDied","Data":"e2897f156738485f6534dccbb57309b212ce8b23650fd9d0fa77ee31641e76d7"} Mar 11 13:09:08 crc kubenswrapper[4816]: I0311 13:09:08.666986 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4vkqz" event={"ID":"bc2b0115-a0c2-49f3-b371-41dab6a785d9","Type":"ContainerStarted","Data":"dfd5f229efde5cc7964f1d4a40897e75908c215ed7813b1fd48bb7c0b8cb6acb"} Mar 11 13:09:08 crc kubenswrapper[4816]: I0311 13:09:08.670968 4816 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 13:09:09 crc kubenswrapper[4816]: I0311 13:09:09.679517 4816 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4vkqz" event={"ID":"bc2b0115-a0c2-49f3-b371-41dab6a785d9","Type":"ContainerStarted","Data":"3df1bf3eb00f84df126d7b442d84242040019a6e3a9832bfefebf0d12258ad6a"} Mar 11 13:09:10 crc kubenswrapper[4816]: I0311 13:09:10.710564 4816 generic.go:334] "Generic (PLEG): container finished" podID="bc2b0115-a0c2-49f3-b371-41dab6a785d9" containerID="3df1bf3eb00f84df126d7b442d84242040019a6e3a9832bfefebf0d12258ad6a" exitCode=0 Mar 11 13:09:10 crc kubenswrapper[4816]: I0311 13:09:10.710661 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4vkqz" event={"ID":"bc2b0115-a0c2-49f3-b371-41dab6a785d9","Type":"ContainerDied","Data":"3df1bf3eb00f84df126d7b442d84242040019a6e3a9832bfefebf0d12258ad6a"} Mar 11 13:09:12 crc kubenswrapper[4816]: I0311 13:09:12.734415 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4vkqz" event={"ID":"bc2b0115-a0c2-49f3-b371-41dab6a785d9","Type":"ContainerStarted","Data":"58def574d8980054044e984e5f7d87fe3d19586ad083bc2d90dca2bfdc3dd0a2"} Mar 11 13:09:12 crc kubenswrapper[4816]: I0311 13:09:12.766427 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4vkqz" podStartSLOduration=4.312527904 podStartE2EDuration="6.766405485s" podCreationTimestamp="2026-03-11 13:09:06 +0000 UTC" firstStartedPulling="2026-03-11 13:09:08.670458024 +0000 UTC m=+4235.261722021" lastFinishedPulling="2026-03-11 13:09:11.124335595 +0000 UTC m=+4237.715599602" observedRunningTime="2026-03-11 13:09:12.757948173 +0000 UTC m=+4239.349212150" watchObservedRunningTime="2026-03-11 13:09:12.766405485 +0000 UTC m=+4239.357669462" Mar 11 13:09:17 crc kubenswrapper[4816]: I0311 13:09:17.314727 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4vkqz" Mar 11 13:09:17 crc 
kubenswrapper[4816]: I0311 13:09:17.315601 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4vkqz" Mar 11 13:09:17 crc kubenswrapper[4816]: I0311 13:09:17.388387 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4vkqz" Mar 11 13:09:17 crc kubenswrapper[4816]: I0311 13:09:17.865217 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4vkqz" Mar 11 13:09:17 crc kubenswrapper[4816]: I0311 13:09:17.935563 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4vkqz"] Mar 11 13:09:19 crc kubenswrapper[4816]: I0311 13:09:19.805305 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4vkqz" podUID="bc2b0115-a0c2-49f3-b371-41dab6a785d9" containerName="registry-server" containerID="cri-o://58def574d8980054044e984e5f7d87fe3d19586ad083bc2d90dca2bfdc3dd0a2" gracePeriod=2 Mar 11 13:09:20 crc kubenswrapper[4816]: I0311 13:09:20.328016 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4vkqz" Mar 11 13:09:20 crc kubenswrapper[4816]: I0311 13:09:20.442789 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc2b0115-a0c2-49f3-b371-41dab6a785d9-utilities\") pod \"bc2b0115-a0c2-49f3-b371-41dab6a785d9\" (UID: \"bc2b0115-a0c2-49f3-b371-41dab6a785d9\") " Mar 11 13:09:20 crc kubenswrapper[4816]: I0311 13:09:20.442936 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4jnt\" (UniqueName: \"kubernetes.io/projected/bc2b0115-a0c2-49f3-b371-41dab6a785d9-kube-api-access-p4jnt\") pod \"bc2b0115-a0c2-49f3-b371-41dab6a785d9\" (UID: \"bc2b0115-a0c2-49f3-b371-41dab6a785d9\") " Mar 11 13:09:20 crc kubenswrapper[4816]: I0311 13:09:20.443128 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc2b0115-a0c2-49f3-b371-41dab6a785d9-catalog-content\") pod \"bc2b0115-a0c2-49f3-b371-41dab6a785d9\" (UID: \"bc2b0115-a0c2-49f3-b371-41dab6a785d9\") " Mar 11 13:09:20 crc kubenswrapper[4816]: I0311 13:09:20.443994 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc2b0115-a0c2-49f3-b371-41dab6a785d9-utilities" (OuterVolumeSpecName: "utilities") pod "bc2b0115-a0c2-49f3-b371-41dab6a785d9" (UID: "bc2b0115-a0c2-49f3-b371-41dab6a785d9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 13:09:20 crc kubenswrapper[4816]: I0311 13:09:20.453819 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc2b0115-a0c2-49f3-b371-41dab6a785d9-kube-api-access-p4jnt" (OuterVolumeSpecName: "kube-api-access-p4jnt") pod "bc2b0115-a0c2-49f3-b371-41dab6a785d9" (UID: "bc2b0115-a0c2-49f3-b371-41dab6a785d9"). InnerVolumeSpecName "kube-api-access-p4jnt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 13:09:20 crc kubenswrapper[4816]: I0311 13:09:20.496577 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc2b0115-a0c2-49f3-b371-41dab6a785d9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bc2b0115-a0c2-49f3-b371-41dab6a785d9" (UID: "bc2b0115-a0c2-49f3-b371-41dab6a785d9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 13:09:20 crc kubenswrapper[4816]: I0311 13:09:20.546312 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4jnt\" (UniqueName: \"kubernetes.io/projected/bc2b0115-a0c2-49f3-b371-41dab6a785d9-kube-api-access-p4jnt\") on node \"crc\" DevicePath \"\"" Mar 11 13:09:20 crc kubenswrapper[4816]: I0311 13:09:20.546644 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc2b0115-a0c2-49f3-b371-41dab6a785d9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 13:09:20 crc kubenswrapper[4816]: I0311 13:09:20.546859 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc2b0115-a0c2-49f3-b371-41dab6a785d9-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 13:09:20 crc kubenswrapper[4816]: I0311 13:09:20.819436 4816 generic.go:334] "Generic (PLEG): container finished" podID="bc2b0115-a0c2-49f3-b371-41dab6a785d9" containerID="58def574d8980054044e984e5f7d87fe3d19586ad083bc2d90dca2bfdc3dd0a2" exitCode=0 Mar 11 13:09:20 crc kubenswrapper[4816]: I0311 13:09:20.819522 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4vkqz" event={"ID":"bc2b0115-a0c2-49f3-b371-41dab6a785d9","Type":"ContainerDied","Data":"58def574d8980054044e984e5f7d87fe3d19586ad083bc2d90dca2bfdc3dd0a2"} Mar 11 13:09:20 crc kubenswrapper[4816]: I0311 13:09:20.820025 4816 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-4vkqz" event={"ID":"bc2b0115-a0c2-49f3-b371-41dab6a785d9","Type":"ContainerDied","Data":"dfd5f229efde5cc7964f1d4a40897e75908c215ed7813b1fd48bb7c0b8cb6acb"} Mar 11 13:09:20 crc kubenswrapper[4816]: I0311 13:09:20.820062 4816 scope.go:117] "RemoveContainer" containerID="58def574d8980054044e984e5f7d87fe3d19586ad083bc2d90dca2bfdc3dd0a2" Mar 11 13:09:20 crc kubenswrapper[4816]: I0311 13:09:20.819642 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4vkqz" Mar 11 13:09:20 crc kubenswrapper[4816]: I0311 13:09:20.879384 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4vkqz"] Mar 11 13:09:20 crc kubenswrapper[4816]: I0311 13:09:20.881462 4816 scope.go:117] "RemoveContainer" containerID="3df1bf3eb00f84df126d7b442d84242040019a6e3a9832bfefebf0d12258ad6a" Mar 11 13:09:20 crc kubenswrapper[4816]: I0311 13:09:20.887842 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4vkqz"] Mar 11 13:09:20 crc kubenswrapper[4816]: I0311 13:09:20.910616 4816 scope.go:117] "RemoveContainer" containerID="e2897f156738485f6534dccbb57309b212ce8b23650fd9d0fa77ee31641e76d7" Mar 11 13:09:20 crc kubenswrapper[4816]: I0311 13:09:20.954988 4816 scope.go:117] "RemoveContainer" containerID="58def574d8980054044e984e5f7d87fe3d19586ad083bc2d90dca2bfdc3dd0a2" Mar 11 13:09:20 crc kubenswrapper[4816]: E0311 13:09:20.955802 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58def574d8980054044e984e5f7d87fe3d19586ad083bc2d90dca2bfdc3dd0a2\": container with ID starting with 58def574d8980054044e984e5f7d87fe3d19586ad083bc2d90dca2bfdc3dd0a2 not found: ID does not exist" containerID="58def574d8980054044e984e5f7d87fe3d19586ad083bc2d90dca2bfdc3dd0a2" Mar 11 13:09:20 crc kubenswrapper[4816]: I0311 13:09:20.955866 4816 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58def574d8980054044e984e5f7d87fe3d19586ad083bc2d90dca2bfdc3dd0a2"} err="failed to get container status \"58def574d8980054044e984e5f7d87fe3d19586ad083bc2d90dca2bfdc3dd0a2\": rpc error: code = NotFound desc = could not find container \"58def574d8980054044e984e5f7d87fe3d19586ad083bc2d90dca2bfdc3dd0a2\": container with ID starting with 58def574d8980054044e984e5f7d87fe3d19586ad083bc2d90dca2bfdc3dd0a2 not found: ID does not exist" Mar 11 13:09:20 crc kubenswrapper[4816]: I0311 13:09:20.955909 4816 scope.go:117] "RemoveContainer" containerID="3df1bf3eb00f84df126d7b442d84242040019a6e3a9832bfefebf0d12258ad6a" Mar 11 13:09:20 crc kubenswrapper[4816]: E0311 13:09:20.956631 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3df1bf3eb00f84df126d7b442d84242040019a6e3a9832bfefebf0d12258ad6a\": container with ID starting with 3df1bf3eb00f84df126d7b442d84242040019a6e3a9832bfefebf0d12258ad6a not found: ID does not exist" containerID="3df1bf3eb00f84df126d7b442d84242040019a6e3a9832bfefebf0d12258ad6a" Mar 11 13:09:20 crc kubenswrapper[4816]: I0311 13:09:20.956741 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3df1bf3eb00f84df126d7b442d84242040019a6e3a9832bfefebf0d12258ad6a"} err="failed to get container status \"3df1bf3eb00f84df126d7b442d84242040019a6e3a9832bfefebf0d12258ad6a\": rpc error: code = NotFound desc = could not find container \"3df1bf3eb00f84df126d7b442d84242040019a6e3a9832bfefebf0d12258ad6a\": container with ID starting with 3df1bf3eb00f84df126d7b442d84242040019a6e3a9832bfefebf0d12258ad6a not found: ID does not exist" Mar 11 13:09:20 crc kubenswrapper[4816]: I0311 13:09:20.956771 4816 scope.go:117] "RemoveContainer" containerID="e2897f156738485f6534dccbb57309b212ce8b23650fd9d0fa77ee31641e76d7" Mar 11 13:09:20 crc kubenswrapper[4816]: E0311 
13:09:20.957406 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2897f156738485f6534dccbb57309b212ce8b23650fd9d0fa77ee31641e76d7\": container with ID starting with e2897f156738485f6534dccbb57309b212ce8b23650fd9d0fa77ee31641e76d7 not found: ID does not exist" containerID="e2897f156738485f6534dccbb57309b212ce8b23650fd9d0fa77ee31641e76d7" Mar 11 13:09:20 crc kubenswrapper[4816]: I0311 13:09:20.957619 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2897f156738485f6534dccbb57309b212ce8b23650fd9d0fa77ee31641e76d7"} err="failed to get container status \"e2897f156738485f6534dccbb57309b212ce8b23650fd9d0fa77ee31641e76d7\": rpc error: code = NotFound desc = could not find container \"e2897f156738485f6534dccbb57309b212ce8b23650fd9d0fa77ee31641e76d7\": container with ID starting with e2897f156738485f6534dccbb57309b212ce8b23650fd9d0fa77ee31641e76d7 not found: ID does not exist" Mar 11 13:09:22 crc kubenswrapper[4816]: I0311 13:09:22.150578 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc2b0115-a0c2-49f3-b371-41dab6a785d9" path="/var/lib/kubelet/pods/bc2b0115-a0c2-49f3-b371-41dab6a785d9/volumes" Mar 11 13:10:00 crc kubenswrapper[4816]: I0311 13:10:00.189463 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553910-xrjfk"] Mar 11 13:10:00 crc kubenswrapper[4816]: E0311 13:10:00.191132 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc2b0115-a0c2-49f3-b371-41dab6a785d9" containerName="registry-server" Mar 11 13:10:00 crc kubenswrapper[4816]: I0311 13:10:00.191157 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc2b0115-a0c2-49f3-b371-41dab6a785d9" containerName="registry-server" Mar 11 13:10:00 crc kubenswrapper[4816]: E0311 13:10:00.191180 4816 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bc2b0115-a0c2-49f3-b371-41dab6a785d9" containerName="extract-utilities" Mar 11 13:10:00 crc kubenswrapper[4816]: I0311 13:10:00.191195 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc2b0115-a0c2-49f3-b371-41dab6a785d9" containerName="extract-utilities" Mar 11 13:10:00 crc kubenswrapper[4816]: E0311 13:10:00.191217 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc2b0115-a0c2-49f3-b371-41dab6a785d9" containerName="extract-content" Mar 11 13:10:00 crc kubenswrapper[4816]: I0311 13:10:00.191227 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc2b0115-a0c2-49f3-b371-41dab6a785d9" containerName="extract-content" Mar 11 13:10:00 crc kubenswrapper[4816]: I0311 13:10:00.191668 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc2b0115-a0c2-49f3-b371-41dab6a785d9" containerName="registry-server" Mar 11 13:10:00 crc kubenswrapper[4816]: I0311 13:10:00.192689 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553910-xrjfk" Mar 11 13:10:00 crc kubenswrapper[4816]: I0311 13:10:00.197156 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 13:10:00 crc kubenswrapper[4816]: I0311 13:10:00.202984 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 13:10:00 crc kubenswrapper[4816]: I0311 13:10:00.203029 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 13:10:00 crc kubenswrapper[4816]: I0311 13:10:00.204487 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553910-xrjfk"] Mar 11 13:10:00 crc kubenswrapper[4816]: I0311 13:10:00.302879 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kqwr\" (UniqueName: 
\"kubernetes.io/projected/4ee2e218-36ee-47c0-9bca-f2f6affd5b02-kube-api-access-7kqwr\") pod \"auto-csr-approver-29553910-xrjfk\" (UID: \"4ee2e218-36ee-47c0-9bca-f2f6affd5b02\") " pod="openshift-infra/auto-csr-approver-29553910-xrjfk" Mar 11 13:10:00 crc kubenswrapper[4816]: I0311 13:10:00.405050 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kqwr\" (UniqueName: \"kubernetes.io/projected/4ee2e218-36ee-47c0-9bca-f2f6affd5b02-kube-api-access-7kqwr\") pod \"auto-csr-approver-29553910-xrjfk\" (UID: \"4ee2e218-36ee-47c0-9bca-f2f6affd5b02\") " pod="openshift-infra/auto-csr-approver-29553910-xrjfk" Mar 11 13:10:00 crc kubenswrapper[4816]: I0311 13:10:00.443562 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kqwr\" (UniqueName: \"kubernetes.io/projected/4ee2e218-36ee-47c0-9bca-f2f6affd5b02-kube-api-access-7kqwr\") pod \"auto-csr-approver-29553910-xrjfk\" (UID: \"4ee2e218-36ee-47c0-9bca-f2f6affd5b02\") " pod="openshift-infra/auto-csr-approver-29553910-xrjfk" Mar 11 13:10:00 crc kubenswrapper[4816]: I0311 13:10:00.553062 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553910-xrjfk" Mar 11 13:10:00 crc kubenswrapper[4816]: I0311 13:10:00.847808 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553910-xrjfk"] Mar 11 13:10:01 crc kubenswrapper[4816]: I0311 13:10:01.240478 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553910-xrjfk" event={"ID":"4ee2e218-36ee-47c0-9bca-f2f6affd5b02","Type":"ContainerStarted","Data":"e719823490853e7798213dd08945ce63d536796333fa822e405ae33b26d6d66d"} Mar 11 13:10:03 crc kubenswrapper[4816]: I0311 13:10:03.263527 4816 generic.go:334] "Generic (PLEG): container finished" podID="4ee2e218-36ee-47c0-9bca-f2f6affd5b02" containerID="13136e90ba59855de085b0d87fba900a964c210d6db5608d7bd773e44d7b1505" exitCode=0 Mar 11 13:10:03 crc kubenswrapper[4816]: I0311 13:10:03.263613 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553910-xrjfk" event={"ID":"4ee2e218-36ee-47c0-9bca-f2f6affd5b02","Type":"ContainerDied","Data":"13136e90ba59855de085b0d87fba900a964c210d6db5608d7bd773e44d7b1505"} Mar 11 13:10:04 crc kubenswrapper[4816]: I0311 13:10:04.606054 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553910-xrjfk" Mar 11 13:10:04 crc kubenswrapper[4816]: I0311 13:10:04.789674 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kqwr\" (UniqueName: \"kubernetes.io/projected/4ee2e218-36ee-47c0-9bca-f2f6affd5b02-kube-api-access-7kqwr\") pod \"4ee2e218-36ee-47c0-9bca-f2f6affd5b02\" (UID: \"4ee2e218-36ee-47c0-9bca-f2f6affd5b02\") " Mar 11 13:10:04 crc kubenswrapper[4816]: I0311 13:10:04.799314 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ee2e218-36ee-47c0-9bca-f2f6affd5b02-kube-api-access-7kqwr" (OuterVolumeSpecName: "kube-api-access-7kqwr") pod "4ee2e218-36ee-47c0-9bca-f2f6affd5b02" (UID: "4ee2e218-36ee-47c0-9bca-f2f6affd5b02"). InnerVolumeSpecName "kube-api-access-7kqwr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 13:10:04 crc kubenswrapper[4816]: I0311 13:10:04.892767 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7kqwr\" (UniqueName: \"kubernetes.io/projected/4ee2e218-36ee-47c0-9bca-f2f6affd5b02-kube-api-access-7kqwr\") on node \"crc\" DevicePath \"\"" Mar 11 13:10:05 crc kubenswrapper[4816]: I0311 13:10:05.286927 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553910-xrjfk" event={"ID":"4ee2e218-36ee-47c0-9bca-f2f6affd5b02","Type":"ContainerDied","Data":"e719823490853e7798213dd08945ce63d536796333fa822e405ae33b26d6d66d"} Mar 11 13:10:05 crc kubenswrapper[4816]: I0311 13:10:05.287404 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e719823490853e7798213dd08945ce63d536796333fa822e405ae33b26d6d66d" Mar 11 13:10:05 crc kubenswrapper[4816]: I0311 13:10:05.287096 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553910-xrjfk" Mar 11 13:10:05 crc kubenswrapper[4816]: I0311 13:10:05.712920 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553904-2bx7k"] Mar 11 13:10:05 crc kubenswrapper[4816]: I0311 13:10:05.723521 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553904-2bx7k"] Mar 11 13:10:06 crc kubenswrapper[4816]: I0311 13:10:06.145103 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f176ec9f-47de-4710-a5f3-078403bb4bfb" path="/var/lib/kubelet/pods/f176ec9f-47de-4710-a5f3-078403bb4bfb/volumes" Mar 11 13:10:39 crc kubenswrapper[4816]: I0311 13:10:39.515732 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 13:10:39 crc kubenswrapper[4816]: I0311 13:10:39.516689 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 13:10:49 crc kubenswrapper[4816]: I0311 13:10:49.816603 4816 scope.go:117] "RemoveContainer" containerID="c0c120d96d0731c58ebb4a66094eed03724800f299fa6a22258f239a945115e0" Mar 11 13:11:09 crc kubenswrapper[4816]: I0311 13:11:09.515035 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 13:11:09 crc kubenswrapper[4816]: 
I0311 13:11:09.515989 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 13:11:39 crc kubenswrapper[4816]: I0311 13:11:39.515759 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 13:11:39 crc kubenswrapper[4816]: I0311 13:11:39.516471 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 13:11:39 crc kubenswrapper[4816]: I0311 13:11:39.516539 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 13:11:39 crc kubenswrapper[4816]: I0311 13:11:39.517153 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e"} pod="openshift-machine-config-operator/machine-config-daemon-b4v82" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 13:11:39 crc kubenswrapper[4816]: I0311 13:11:39.517226 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" 
containerName="machine-config-daemon" containerID="cri-o://feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e" gracePeriod=600 Mar 11 13:11:39 crc kubenswrapper[4816]: E0311 13:11:39.663332 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:11:40 crc kubenswrapper[4816]: I0311 13:11:40.205168 4816 generic.go:334] "Generic (PLEG): container finished" podID="7fdff21c-644f-4443-a268-f98c91ea120a" containerID="feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e" exitCode=0 Mar 11 13:11:40 crc kubenswrapper[4816]: I0311 13:11:40.205228 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerDied","Data":"feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e"} Mar 11 13:11:40 crc kubenswrapper[4816]: I0311 13:11:40.205311 4816 scope.go:117] "RemoveContainer" containerID="93c58f402ba486c6c006c97994ab202bfd22495eff9729c60fbdfcbe918d3c5f" Mar 11 13:11:40 crc kubenswrapper[4816]: I0311 13:11:40.206149 4816 scope.go:117] "RemoveContainer" containerID="feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e" Mar 11 13:11:40 crc kubenswrapper[4816]: E0311 13:11:40.206728 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:11:54 crc kubenswrapper[4816]: I0311 13:11:54.139851 4816 scope.go:117] "RemoveContainer" containerID="feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e" Mar 11 13:11:54 crc kubenswrapper[4816]: E0311 13:11:54.141285 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:12:00 crc kubenswrapper[4816]: I0311 13:12:00.178519 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553912-dgw9l"] Mar 11 13:12:00 crc kubenswrapper[4816]: E0311 13:12:00.179910 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ee2e218-36ee-47c0-9bca-f2f6affd5b02" containerName="oc" Mar 11 13:12:00 crc kubenswrapper[4816]: I0311 13:12:00.179936 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ee2e218-36ee-47c0-9bca-f2f6affd5b02" containerName="oc" Mar 11 13:12:00 crc kubenswrapper[4816]: I0311 13:12:00.180213 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ee2e218-36ee-47c0-9bca-f2f6affd5b02" containerName="oc" Mar 11 13:12:00 crc kubenswrapper[4816]: I0311 13:12:00.181091 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553912-dgw9l" Mar 11 13:12:00 crc kubenswrapper[4816]: I0311 13:12:00.185105 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-st8rr\" (UniqueName: \"kubernetes.io/projected/0cc88fac-43d8-4178-9b36-fc5bd4b04818-kube-api-access-st8rr\") pod \"auto-csr-approver-29553912-dgw9l\" (UID: \"0cc88fac-43d8-4178-9b36-fc5bd4b04818\") " pod="openshift-infra/auto-csr-approver-29553912-dgw9l" Mar 11 13:12:00 crc kubenswrapper[4816]: I0311 13:12:00.185543 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 13:12:00 crc kubenswrapper[4816]: I0311 13:12:00.189465 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 13:12:00 crc kubenswrapper[4816]: I0311 13:12:00.194652 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 13:12:00 crc kubenswrapper[4816]: I0311 13:12:00.197992 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553912-dgw9l"] Mar 11 13:12:00 crc kubenswrapper[4816]: I0311 13:12:00.287625 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-st8rr\" (UniqueName: \"kubernetes.io/projected/0cc88fac-43d8-4178-9b36-fc5bd4b04818-kube-api-access-st8rr\") pod \"auto-csr-approver-29553912-dgw9l\" (UID: \"0cc88fac-43d8-4178-9b36-fc5bd4b04818\") " pod="openshift-infra/auto-csr-approver-29553912-dgw9l" Mar 11 13:12:00 crc kubenswrapper[4816]: I0311 13:12:00.319606 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-st8rr\" (UniqueName: \"kubernetes.io/projected/0cc88fac-43d8-4178-9b36-fc5bd4b04818-kube-api-access-st8rr\") pod \"auto-csr-approver-29553912-dgw9l\" (UID: \"0cc88fac-43d8-4178-9b36-fc5bd4b04818\") " 
pod="openshift-infra/auto-csr-approver-29553912-dgw9l" Mar 11 13:12:00 crc kubenswrapper[4816]: I0311 13:12:00.509134 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553912-dgw9l" Mar 11 13:12:00 crc kubenswrapper[4816]: I0311 13:12:00.992213 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553912-dgw9l"] Mar 11 13:12:01 crc kubenswrapper[4816]: I0311 13:12:01.413544 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553912-dgw9l" event={"ID":"0cc88fac-43d8-4178-9b36-fc5bd4b04818","Type":"ContainerStarted","Data":"8d920cabe9bb9dc8224b3ba1049bb6821a23c6fcb76f2925fc6fafc3d3fa8a92"} Mar 11 13:12:03 crc kubenswrapper[4816]: I0311 13:12:03.434231 4816 generic.go:334] "Generic (PLEG): container finished" podID="0cc88fac-43d8-4178-9b36-fc5bd4b04818" containerID="148ded4a02efdc34a61cfc1e6b248706834d114bbcd8c2d3fc0a1082e7f112b8" exitCode=0 Mar 11 13:12:03 crc kubenswrapper[4816]: I0311 13:12:03.434342 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553912-dgw9l" event={"ID":"0cc88fac-43d8-4178-9b36-fc5bd4b04818","Type":"ContainerDied","Data":"148ded4a02efdc34a61cfc1e6b248706834d114bbcd8c2d3fc0a1082e7f112b8"} Mar 11 13:12:04 crc kubenswrapper[4816]: I0311 13:12:04.795346 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553912-dgw9l" Mar 11 13:12:04 crc kubenswrapper[4816]: I0311 13:12:04.975800 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-st8rr\" (UniqueName: \"kubernetes.io/projected/0cc88fac-43d8-4178-9b36-fc5bd4b04818-kube-api-access-st8rr\") pod \"0cc88fac-43d8-4178-9b36-fc5bd4b04818\" (UID: \"0cc88fac-43d8-4178-9b36-fc5bd4b04818\") " Mar 11 13:12:04 crc kubenswrapper[4816]: I0311 13:12:04.984365 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cc88fac-43d8-4178-9b36-fc5bd4b04818-kube-api-access-st8rr" (OuterVolumeSpecName: "kube-api-access-st8rr") pod "0cc88fac-43d8-4178-9b36-fc5bd4b04818" (UID: "0cc88fac-43d8-4178-9b36-fc5bd4b04818"). InnerVolumeSpecName "kube-api-access-st8rr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 13:12:05 crc kubenswrapper[4816]: I0311 13:12:05.077900 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-st8rr\" (UniqueName: \"kubernetes.io/projected/0cc88fac-43d8-4178-9b36-fc5bd4b04818-kube-api-access-st8rr\") on node \"crc\" DevicePath \"\"" Mar 11 13:12:05 crc kubenswrapper[4816]: I0311 13:12:05.458293 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553912-dgw9l" event={"ID":"0cc88fac-43d8-4178-9b36-fc5bd4b04818","Type":"ContainerDied","Data":"8d920cabe9bb9dc8224b3ba1049bb6821a23c6fcb76f2925fc6fafc3d3fa8a92"} Mar 11 13:12:05 crc kubenswrapper[4816]: I0311 13:12:05.458353 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d920cabe9bb9dc8224b3ba1049bb6821a23c6fcb76f2925fc6fafc3d3fa8a92" Mar 11 13:12:05 crc kubenswrapper[4816]: I0311 13:12:05.458400 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553912-dgw9l" Mar 11 13:12:05 crc kubenswrapper[4816]: I0311 13:12:05.892223 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553906-4wxn9"] Mar 11 13:12:05 crc kubenswrapper[4816]: I0311 13:12:05.898632 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553906-4wxn9"] Mar 11 13:12:06 crc kubenswrapper[4816]: I0311 13:12:06.141025 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4566b1c1-9d53-4e1f-8406-ff9c89aaf8cf" path="/var/lib/kubelet/pods/4566b1c1-9d53-4e1f-8406-ff9c89aaf8cf/volumes" Mar 11 13:12:09 crc kubenswrapper[4816]: I0311 13:12:09.130496 4816 scope.go:117] "RemoveContainer" containerID="feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e" Mar 11 13:12:09 crc kubenswrapper[4816]: E0311 13:12:09.132972 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:12:23 crc kubenswrapper[4816]: I0311 13:12:23.131167 4816 scope.go:117] "RemoveContainer" containerID="feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e" Mar 11 13:12:23 crc kubenswrapper[4816]: E0311 13:12:23.132372 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" 
podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:12:37 crc kubenswrapper[4816]: I0311 13:12:37.130821 4816 scope.go:117] "RemoveContainer" containerID="feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e" Mar 11 13:12:37 crc kubenswrapper[4816]: E0311 13:12:37.131950 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:12:48 crc kubenswrapper[4816]: I0311 13:12:48.131929 4816 scope.go:117] "RemoveContainer" containerID="feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e" Mar 11 13:12:48 crc kubenswrapper[4816]: E0311 13:12:48.133151 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:12:49 crc kubenswrapper[4816]: I0311 13:12:49.908217 4816 scope.go:117] "RemoveContainer" containerID="3049a692892071c6574e8ee18347abb47ed4c1ed532d21f9dde8bcb07555460f" Mar 11 13:13:01 crc kubenswrapper[4816]: I0311 13:13:01.131234 4816 scope.go:117] "RemoveContainer" containerID="feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e" Mar 11 13:13:01 crc kubenswrapper[4816]: E0311 13:13:01.133034 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:13:13 crc kubenswrapper[4816]: I0311 13:13:13.130914 4816 scope.go:117] "RemoveContainer" containerID="feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e" Mar 11 13:13:13 crc kubenswrapper[4816]: E0311 13:13:13.132176 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:13:26 crc kubenswrapper[4816]: I0311 13:13:26.131534 4816 scope.go:117] "RemoveContainer" containerID="feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e" Mar 11 13:13:26 crc kubenswrapper[4816]: E0311 13:13:26.132691 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:13:39 crc kubenswrapper[4816]: I0311 13:13:39.130884 4816 scope.go:117] "RemoveContainer" containerID="feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e" Mar 11 13:13:39 crc kubenswrapper[4816]: E0311 13:13:39.132169 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:13:54 crc kubenswrapper[4816]: I0311 13:13:54.139052 4816 scope.go:117] "RemoveContainer" containerID="feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e" Mar 11 13:13:54 crc kubenswrapper[4816]: E0311 13:13:54.140530 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:14:00 crc kubenswrapper[4816]: I0311 13:14:00.176330 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553914-vtpr2"] Mar 11 13:14:00 crc kubenswrapper[4816]: E0311 13:14:00.177562 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cc88fac-43d8-4178-9b36-fc5bd4b04818" containerName="oc" Mar 11 13:14:00 crc kubenswrapper[4816]: I0311 13:14:00.177579 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cc88fac-43d8-4178-9b36-fc5bd4b04818" containerName="oc" Mar 11 13:14:00 crc kubenswrapper[4816]: I0311 13:14:00.177773 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cc88fac-43d8-4178-9b36-fc5bd4b04818" containerName="oc" Mar 11 13:14:00 crc kubenswrapper[4816]: I0311 13:14:00.178444 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553914-vtpr2" Mar 11 13:14:00 crc kubenswrapper[4816]: I0311 13:14:00.186392 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 13:14:00 crc kubenswrapper[4816]: I0311 13:14:00.186895 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 13:14:00 crc kubenswrapper[4816]: I0311 13:14:00.187313 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 13:14:00 crc kubenswrapper[4816]: I0311 13:14:00.198156 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553914-vtpr2"] Mar 11 13:14:00 crc kubenswrapper[4816]: I0311 13:14:00.284135 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtwkt\" (UniqueName: \"kubernetes.io/projected/32b79556-cf6a-450f-9214-70d0854dc630-kube-api-access-gtwkt\") pod \"auto-csr-approver-29553914-vtpr2\" (UID: \"32b79556-cf6a-450f-9214-70d0854dc630\") " pod="openshift-infra/auto-csr-approver-29553914-vtpr2" Mar 11 13:14:00 crc kubenswrapper[4816]: I0311 13:14:00.386650 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtwkt\" (UniqueName: \"kubernetes.io/projected/32b79556-cf6a-450f-9214-70d0854dc630-kube-api-access-gtwkt\") pod \"auto-csr-approver-29553914-vtpr2\" (UID: \"32b79556-cf6a-450f-9214-70d0854dc630\") " pod="openshift-infra/auto-csr-approver-29553914-vtpr2" Mar 11 13:14:00 crc kubenswrapper[4816]: I0311 13:14:00.433005 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtwkt\" (UniqueName: \"kubernetes.io/projected/32b79556-cf6a-450f-9214-70d0854dc630-kube-api-access-gtwkt\") pod \"auto-csr-approver-29553914-vtpr2\" (UID: \"32b79556-cf6a-450f-9214-70d0854dc630\") " 
pod="openshift-infra/auto-csr-approver-29553914-vtpr2" Mar 11 13:14:00 crc kubenswrapper[4816]: I0311 13:14:00.521779 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553914-vtpr2" Mar 11 13:14:01 crc kubenswrapper[4816]: I0311 13:14:01.022127 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553914-vtpr2"] Mar 11 13:14:01 crc kubenswrapper[4816]: I0311 13:14:01.622816 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553914-vtpr2" event={"ID":"32b79556-cf6a-450f-9214-70d0854dc630","Type":"ContainerStarted","Data":"686ff9ffa6818ed33837cab95d1d4a16f7050819c4a25812e70e0cd6dc4bb0fd"} Mar 11 13:14:03 crc kubenswrapper[4816]: I0311 13:14:03.643950 4816 generic.go:334] "Generic (PLEG): container finished" podID="32b79556-cf6a-450f-9214-70d0854dc630" containerID="8e7758cfa0d68340bf0bfe400a0bcdda434a161dca369cd6a56c8194d33e640d" exitCode=0 Mar 11 13:14:03 crc kubenswrapper[4816]: I0311 13:14:03.644758 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553914-vtpr2" event={"ID":"32b79556-cf6a-450f-9214-70d0854dc630","Type":"ContainerDied","Data":"8e7758cfa0d68340bf0bfe400a0bcdda434a161dca369cd6a56c8194d33e640d"} Mar 11 13:14:05 crc kubenswrapper[4816]: I0311 13:14:05.114438 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553914-vtpr2" Mar 11 13:14:05 crc kubenswrapper[4816]: I0311 13:14:05.197837 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtwkt\" (UniqueName: \"kubernetes.io/projected/32b79556-cf6a-450f-9214-70d0854dc630-kube-api-access-gtwkt\") pod \"32b79556-cf6a-450f-9214-70d0854dc630\" (UID: \"32b79556-cf6a-450f-9214-70d0854dc630\") " Mar 11 13:14:05 crc kubenswrapper[4816]: I0311 13:14:05.205495 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32b79556-cf6a-450f-9214-70d0854dc630-kube-api-access-gtwkt" (OuterVolumeSpecName: "kube-api-access-gtwkt") pod "32b79556-cf6a-450f-9214-70d0854dc630" (UID: "32b79556-cf6a-450f-9214-70d0854dc630"). InnerVolumeSpecName "kube-api-access-gtwkt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 13:14:05 crc kubenswrapper[4816]: I0311 13:14:05.300345 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtwkt\" (UniqueName: \"kubernetes.io/projected/32b79556-cf6a-450f-9214-70d0854dc630-kube-api-access-gtwkt\") on node \"crc\" DevicePath \"\"" Mar 11 13:14:05 crc kubenswrapper[4816]: I0311 13:14:05.673879 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553914-vtpr2" event={"ID":"32b79556-cf6a-450f-9214-70d0854dc630","Type":"ContainerDied","Data":"686ff9ffa6818ed33837cab95d1d4a16f7050819c4a25812e70e0cd6dc4bb0fd"} Mar 11 13:14:05 crc kubenswrapper[4816]: I0311 13:14:05.673935 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="686ff9ffa6818ed33837cab95d1d4a16f7050819c4a25812e70e0cd6dc4bb0fd" Mar 11 13:14:05 crc kubenswrapper[4816]: I0311 13:14:05.673945 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553914-vtpr2" Mar 11 13:14:06 crc kubenswrapper[4816]: I0311 13:14:06.201775 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553908-7sdtd"] Mar 11 13:14:06 crc kubenswrapper[4816]: I0311 13:14:06.208856 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553908-7sdtd"] Mar 11 13:14:07 crc kubenswrapper[4816]: I0311 13:14:07.130843 4816 scope.go:117] "RemoveContainer" containerID="feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e" Mar 11 13:14:07 crc kubenswrapper[4816]: E0311 13:14:07.131193 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:14:08 crc kubenswrapper[4816]: I0311 13:14:08.145982 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9dc717b0-ae5a-46c4-9fea-48dcab17a9c8" path="/var/lib/kubelet/pods/9dc717b0-ae5a-46c4-9fea-48dcab17a9c8/volumes" Mar 11 13:14:19 crc kubenswrapper[4816]: I0311 13:14:19.131073 4816 scope.go:117] "RemoveContainer" containerID="feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e" Mar 11 13:14:19 crc kubenswrapper[4816]: E0311 13:14:19.132360 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" 
podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:14:30 crc kubenswrapper[4816]: I0311 13:14:30.131052 4816 scope.go:117] "RemoveContainer" containerID="feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e" Mar 11 13:14:30 crc kubenswrapper[4816]: E0311 13:14:30.132346 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:14:42 crc kubenswrapper[4816]: I0311 13:14:42.130512 4816 scope.go:117] "RemoveContainer" containerID="feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e" Mar 11 13:14:42 crc kubenswrapper[4816]: E0311 13:14:42.131646 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:14:50 crc kubenswrapper[4816]: I0311 13:14:50.028666 4816 scope.go:117] "RemoveContainer" containerID="8f956a9fd47ed00082f55e7f0e6d344e63382a72a835af03fd720051a5e8b801" Mar 11 13:14:55 crc kubenswrapper[4816]: I0311 13:14:55.133146 4816 scope.go:117] "RemoveContainer" containerID="feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e" Mar 11 13:14:55 crc kubenswrapper[4816]: E0311 13:14:55.135106 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:15:00 crc kubenswrapper[4816]: I0311 13:15:00.188513 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553915-wvm5d"] Mar 11 13:15:00 crc kubenswrapper[4816]: E0311 13:15:00.189491 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32b79556-cf6a-450f-9214-70d0854dc630" containerName="oc" Mar 11 13:15:00 crc kubenswrapper[4816]: I0311 13:15:00.189516 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="32b79556-cf6a-450f-9214-70d0854dc630" containerName="oc" Mar 11 13:15:00 crc kubenswrapper[4816]: I0311 13:15:00.189722 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="32b79556-cf6a-450f-9214-70d0854dc630" containerName="oc" Mar 11 13:15:00 crc kubenswrapper[4816]: I0311 13:15:00.190464 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553915-wvm5d" Mar 11 13:15:00 crc kubenswrapper[4816]: I0311 13:15:00.192821 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 11 13:15:00 crc kubenswrapper[4816]: I0311 13:15:00.192931 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 11 13:15:00 crc kubenswrapper[4816]: I0311 13:15:00.198280 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553915-wvm5d"] Mar 11 13:15:00 crc kubenswrapper[4816]: I0311 13:15:00.368085 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm2lv\" (UniqueName: \"kubernetes.io/projected/5bfbb073-59ae-4e0f-9b46-4f27865d35dd-kube-api-access-vm2lv\") pod \"collect-profiles-29553915-wvm5d\" (UID: \"5bfbb073-59ae-4e0f-9b46-4f27865d35dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553915-wvm5d" Mar 11 13:15:00 crc kubenswrapper[4816]: I0311 13:15:00.369055 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5bfbb073-59ae-4e0f-9b46-4f27865d35dd-config-volume\") pod \"collect-profiles-29553915-wvm5d\" (UID: \"5bfbb073-59ae-4e0f-9b46-4f27865d35dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553915-wvm5d" Mar 11 13:15:00 crc kubenswrapper[4816]: I0311 13:15:00.384675 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5bfbb073-59ae-4e0f-9b46-4f27865d35dd-secret-volume\") pod \"collect-profiles-29553915-wvm5d\" (UID: \"5bfbb073-59ae-4e0f-9b46-4f27865d35dd\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29553915-wvm5d" Mar 11 13:15:00 crc kubenswrapper[4816]: I0311 13:15:00.485999 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5bfbb073-59ae-4e0f-9b46-4f27865d35dd-secret-volume\") pod \"collect-profiles-29553915-wvm5d\" (UID: \"5bfbb073-59ae-4e0f-9b46-4f27865d35dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553915-wvm5d" Mar 11 13:15:00 crc kubenswrapper[4816]: I0311 13:15:00.486132 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vm2lv\" (UniqueName: \"kubernetes.io/projected/5bfbb073-59ae-4e0f-9b46-4f27865d35dd-kube-api-access-vm2lv\") pod \"collect-profiles-29553915-wvm5d\" (UID: \"5bfbb073-59ae-4e0f-9b46-4f27865d35dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553915-wvm5d" Mar 11 13:15:00 crc kubenswrapper[4816]: I0311 13:15:00.486182 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5bfbb073-59ae-4e0f-9b46-4f27865d35dd-config-volume\") pod \"collect-profiles-29553915-wvm5d\" (UID: \"5bfbb073-59ae-4e0f-9b46-4f27865d35dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553915-wvm5d" Mar 11 13:15:00 crc kubenswrapper[4816]: I0311 13:15:00.488039 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5bfbb073-59ae-4e0f-9b46-4f27865d35dd-config-volume\") pod \"collect-profiles-29553915-wvm5d\" (UID: \"5bfbb073-59ae-4e0f-9b46-4f27865d35dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553915-wvm5d" Mar 11 13:15:00 crc kubenswrapper[4816]: I0311 13:15:00.507712 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/5bfbb073-59ae-4e0f-9b46-4f27865d35dd-secret-volume\") pod \"collect-profiles-29553915-wvm5d\" (UID: \"5bfbb073-59ae-4e0f-9b46-4f27865d35dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553915-wvm5d" Mar 11 13:15:00 crc kubenswrapper[4816]: I0311 13:15:00.509160 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vm2lv\" (UniqueName: \"kubernetes.io/projected/5bfbb073-59ae-4e0f-9b46-4f27865d35dd-kube-api-access-vm2lv\") pod \"collect-profiles-29553915-wvm5d\" (UID: \"5bfbb073-59ae-4e0f-9b46-4f27865d35dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553915-wvm5d" Mar 11 13:15:00 crc kubenswrapper[4816]: I0311 13:15:00.559556 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553915-wvm5d" Mar 11 13:15:01 crc kubenswrapper[4816]: I0311 13:15:01.054596 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553915-wvm5d"] Mar 11 13:15:01 crc kubenswrapper[4816]: I0311 13:15:01.406320 4816 generic.go:334] "Generic (PLEG): container finished" podID="5bfbb073-59ae-4e0f-9b46-4f27865d35dd" containerID="f6b3186ed4fea575de80650b31d2a10afb3b6a19453228e0dffc9785f41cc4e3" exitCode=0 Mar 11 13:15:01 crc kubenswrapper[4816]: I0311 13:15:01.406379 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553915-wvm5d" event={"ID":"5bfbb073-59ae-4e0f-9b46-4f27865d35dd","Type":"ContainerDied","Data":"f6b3186ed4fea575de80650b31d2a10afb3b6a19453228e0dffc9785f41cc4e3"} Mar 11 13:15:01 crc kubenswrapper[4816]: I0311 13:15:01.406444 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553915-wvm5d" 
event={"ID":"5bfbb073-59ae-4e0f-9b46-4f27865d35dd","Type":"ContainerStarted","Data":"6c8d88f488341419dac3054536326536c7f997ffe41c41a21191b1dcd393ffd5"} Mar 11 13:15:02 crc kubenswrapper[4816]: I0311 13:15:02.447452 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4xkfb"] Mar 11 13:15:02 crc kubenswrapper[4816]: I0311 13:15:02.450618 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4xkfb" Mar 11 13:15:02 crc kubenswrapper[4816]: I0311 13:15:02.470981 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4xkfb"] Mar 11 13:15:02 crc kubenswrapper[4816]: I0311 13:15:02.617635 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d9acb13-e6a9-4833-8cfe-3801fd85e2a5-utilities\") pod \"certified-operators-4xkfb\" (UID: \"2d9acb13-e6a9-4833-8cfe-3801fd85e2a5\") " pod="openshift-marketplace/certified-operators-4xkfb" Mar 11 13:15:02 crc kubenswrapper[4816]: I0311 13:15:02.617692 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w48bm\" (UniqueName: \"kubernetes.io/projected/2d9acb13-e6a9-4833-8cfe-3801fd85e2a5-kube-api-access-w48bm\") pod \"certified-operators-4xkfb\" (UID: \"2d9acb13-e6a9-4833-8cfe-3801fd85e2a5\") " pod="openshift-marketplace/certified-operators-4xkfb" Mar 11 13:15:02 crc kubenswrapper[4816]: I0311 13:15:02.617862 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d9acb13-e6a9-4833-8cfe-3801fd85e2a5-catalog-content\") pod \"certified-operators-4xkfb\" (UID: \"2d9acb13-e6a9-4833-8cfe-3801fd85e2a5\") " pod="openshift-marketplace/certified-operators-4xkfb" Mar 11 13:15:02 crc kubenswrapper[4816]: I0311 13:15:02.718371 
4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d9acb13-e6a9-4833-8cfe-3801fd85e2a5-utilities\") pod \"certified-operators-4xkfb\" (UID: \"2d9acb13-e6a9-4833-8cfe-3801fd85e2a5\") " pod="openshift-marketplace/certified-operators-4xkfb" Mar 11 13:15:02 crc kubenswrapper[4816]: I0311 13:15:02.718420 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w48bm\" (UniqueName: \"kubernetes.io/projected/2d9acb13-e6a9-4833-8cfe-3801fd85e2a5-kube-api-access-w48bm\") pod \"certified-operators-4xkfb\" (UID: \"2d9acb13-e6a9-4833-8cfe-3801fd85e2a5\") " pod="openshift-marketplace/certified-operators-4xkfb" Mar 11 13:15:02 crc kubenswrapper[4816]: I0311 13:15:02.718479 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d9acb13-e6a9-4833-8cfe-3801fd85e2a5-catalog-content\") pod \"certified-operators-4xkfb\" (UID: \"2d9acb13-e6a9-4833-8cfe-3801fd85e2a5\") " pod="openshift-marketplace/certified-operators-4xkfb" Mar 11 13:15:02 crc kubenswrapper[4816]: I0311 13:15:02.719479 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d9acb13-e6a9-4833-8cfe-3801fd85e2a5-catalog-content\") pod \"certified-operators-4xkfb\" (UID: \"2d9acb13-e6a9-4833-8cfe-3801fd85e2a5\") " pod="openshift-marketplace/certified-operators-4xkfb" Mar 11 13:15:02 crc kubenswrapper[4816]: I0311 13:15:02.719514 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d9acb13-e6a9-4833-8cfe-3801fd85e2a5-utilities\") pod \"certified-operators-4xkfb\" (UID: \"2d9acb13-e6a9-4833-8cfe-3801fd85e2a5\") " pod="openshift-marketplace/certified-operators-4xkfb" Mar 11 13:15:02 crc kubenswrapper[4816]: I0311 13:15:02.752086 4816 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-w48bm\" (UniqueName: \"kubernetes.io/projected/2d9acb13-e6a9-4833-8cfe-3801fd85e2a5-kube-api-access-w48bm\") pod \"certified-operators-4xkfb\" (UID: \"2d9acb13-e6a9-4833-8cfe-3801fd85e2a5\") " pod="openshift-marketplace/certified-operators-4xkfb" Mar 11 13:15:02 crc kubenswrapper[4816]: I0311 13:15:02.797509 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4xkfb" Mar 11 13:15:02 crc kubenswrapper[4816]: I0311 13:15:02.800692 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553915-wvm5d" Mar 11 13:15:02 crc kubenswrapper[4816]: I0311 13:15:02.920988 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5bfbb073-59ae-4e0f-9b46-4f27865d35dd-config-volume\") pod \"5bfbb073-59ae-4e0f-9b46-4f27865d35dd\" (UID: \"5bfbb073-59ae-4e0f-9b46-4f27865d35dd\") " Mar 11 13:15:02 crc kubenswrapper[4816]: I0311 13:15:02.921073 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vm2lv\" (UniqueName: \"kubernetes.io/projected/5bfbb073-59ae-4e0f-9b46-4f27865d35dd-kube-api-access-vm2lv\") pod \"5bfbb073-59ae-4e0f-9b46-4f27865d35dd\" (UID: \"5bfbb073-59ae-4e0f-9b46-4f27865d35dd\") " Mar 11 13:15:02 crc kubenswrapper[4816]: I0311 13:15:02.921137 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5bfbb073-59ae-4e0f-9b46-4f27865d35dd-secret-volume\") pod \"5bfbb073-59ae-4e0f-9b46-4f27865d35dd\" (UID: \"5bfbb073-59ae-4e0f-9b46-4f27865d35dd\") " Mar 11 13:15:02 crc kubenswrapper[4816]: I0311 13:15:02.923205 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/5bfbb073-59ae-4e0f-9b46-4f27865d35dd-config-volume" (OuterVolumeSpecName: "config-volume") pod "5bfbb073-59ae-4e0f-9b46-4f27865d35dd" (UID: "5bfbb073-59ae-4e0f-9b46-4f27865d35dd"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 13:15:02 crc kubenswrapper[4816]: I0311 13:15:02.924198 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bfbb073-59ae-4e0f-9b46-4f27865d35dd-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5bfbb073-59ae-4e0f-9b46-4f27865d35dd" (UID: "5bfbb073-59ae-4e0f-9b46-4f27865d35dd"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 13:15:02 crc kubenswrapper[4816]: I0311 13:15:02.924588 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bfbb073-59ae-4e0f-9b46-4f27865d35dd-kube-api-access-vm2lv" (OuterVolumeSpecName: "kube-api-access-vm2lv") pod "5bfbb073-59ae-4e0f-9b46-4f27865d35dd" (UID: "5bfbb073-59ae-4e0f-9b46-4f27865d35dd"). InnerVolumeSpecName "kube-api-access-vm2lv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 13:15:03 crc kubenswrapper[4816]: I0311 13:15:03.022724 4816 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5bfbb073-59ae-4e0f-9b46-4f27865d35dd-config-volume\") on node \"crc\" DevicePath \"\"" Mar 11 13:15:03 crc kubenswrapper[4816]: I0311 13:15:03.022767 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vm2lv\" (UniqueName: \"kubernetes.io/projected/5bfbb073-59ae-4e0f-9b46-4f27865d35dd-kube-api-access-vm2lv\") on node \"crc\" DevicePath \"\"" Mar 11 13:15:03 crc kubenswrapper[4816]: I0311 13:15:03.022782 4816 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5bfbb073-59ae-4e0f-9b46-4f27865d35dd-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 11 13:15:03 crc kubenswrapper[4816]: I0311 13:15:03.039940 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4xkfb"] Mar 11 13:15:03 crc kubenswrapper[4816]: W0311 13:15:03.046216 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d9acb13_e6a9_4833_8cfe_3801fd85e2a5.slice/crio-870dbd348776e9d3256ca77f7c640c08bd1a93d1453b571cc8e66d309b98de54 WatchSource:0}: Error finding container 870dbd348776e9d3256ca77f7c640c08bd1a93d1453b571cc8e66d309b98de54: Status 404 returned error can't find the container with id 870dbd348776e9d3256ca77f7c640c08bd1a93d1453b571cc8e66d309b98de54 Mar 11 13:15:03 crc kubenswrapper[4816]: I0311 13:15:03.424741 4816 generic.go:334] "Generic (PLEG): container finished" podID="2d9acb13-e6a9-4833-8cfe-3801fd85e2a5" containerID="18faad00734422c09e735a824453d3a0eff856a3fb7f98a4cee7c2b6dfd9470b" exitCode=0 Mar 11 13:15:03 crc kubenswrapper[4816]: I0311 13:15:03.424976 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-4xkfb" event={"ID":"2d9acb13-e6a9-4833-8cfe-3801fd85e2a5","Type":"ContainerDied","Data":"18faad00734422c09e735a824453d3a0eff856a3fb7f98a4cee7c2b6dfd9470b"} Mar 11 13:15:03 crc kubenswrapper[4816]: I0311 13:15:03.425004 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4xkfb" event={"ID":"2d9acb13-e6a9-4833-8cfe-3801fd85e2a5","Type":"ContainerStarted","Data":"870dbd348776e9d3256ca77f7c640c08bd1a93d1453b571cc8e66d309b98de54"} Mar 11 13:15:03 crc kubenswrapper[4816]: I0311 13:15:03.426530 4816 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 13:15:03 crc kubenswrapper[4816]: I0311 13:15:03.427881 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553915-wvm5d" event={"ID":"5bfbb073-59ae-4e0f-9b46-4f27865d35dd","Type":"ContainerDied","Data":"6c8d88f488341419dac3054536326536c7f997ffe41c41a21191b1dcd393ffd5"} Mar 11 13:15:03 crc kubenswrapper[4816]: I0311 13:15:03.427909 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c8d88f488341419dac3054536326536c7f997ffe41c41a21191b1dcd393ffd5" Mar 11 13:15:03 crc kubenswrapper[4816]: I0311 13:15:03.427945 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553915-wvm5d" Mar 11 13:15:03 crc kubenswrapper[4816]: I0311 13:15:03.922507 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553870-n6l8j"] Mar 11 13:15:03 crc kubenswrapper[4816]: I0311 13:15:03.928888 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553870-n6l8j"] Mar 11 13:15:04 crc kubenswrapper[4816]: I0311 13:15:04.164972 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af7ed2ea-fc1c-4a1a-bf16-50a9817aac81" path="/var/lib/kubelet/pods/af7ed2ea-fc1c-4a1a-bf16-50a9817aac81/volumes" Mar 11 13:15:05 crc kubenswrapper[4816]: I0311 13:15:05.447538 4816 generic.go:334] "Generic (PLEG): container finished" podID="2d9acb13-e6a9-4833-8cfe-3801fd85e2a5" containerID="cc0f26349c021ed4fc19985f60273ecddcabfeaafbafbeb643a67d00711326f9" exitCode=0 Mar 11 13:15:05 crc kubenswrapper[4816]: I0311 13:15:05.447630 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4xkfb" event={"ID":"2d9acb13-e6a9-4833-8cfe-3801fd85e2a5","Type":"ContainerDied","Data":"cc0f26349c021ed4fc19985f60273ecddcabfeaafbafbeb643a67d00711326f9"} Mar 11 13:15:06 crc kubenswrapper[4816]: I0311 13:15:06.458121 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4xkfb" event={"ID":"2d9acb13-e6a9-4833-8cfe-3801fd85e2a5","Type":"ContainerStarted","Data":"dfb7c1256be8ddf989f914bdb5615fd3a330627f3cfe8c0184c8f30ec02d72cf"} Mar 11 13:15:06 crc kubenswrapper[4816]: I0311 13:15:06.501745 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4xkfb" podStartSLOduration=1.9507583560000001 podStartE2EDuration="4.501713162s" podCreationTimestamp="2026-03-11 13:15:02 +0000 UTC" firstStartedPulling="2026-03-11 
13:15:03.426322761 +0000 UTC m=+4590.017586728" lastFinishedPulling="2026-03-11 13:15:05.977277537 +0000 UTC m=+4592.568541534" observedRunningTime="2026-03-11 13:15:06.486783785 +0000 UTC m=+4593.078047752" watchObservedRunningTime="2026-03-11 13:15:06.501713162 +0000 UTC m=+4593.092977169" Mar 11 13:15:09 crc kubenswrapper[4816]: I0311 13:15:09.130991 4816 scope.go:117] "RemoveContainer" containerID="feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e" Mar 11 13:15:09 crc kubenswrapper[4816]: E0311 13:15:09.133659 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:15:12 crc kubenswrapper[4816]: I0311 13:15:12.798854 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4xkfb" Mar 11 13:15:12 crc kubenswrapper[4816]: I0311 13:15:12.799452 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4xkfb" Mar 11 13:15:12 crc kubenswrapper[4816]: I0311 13:15:12.889852 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4xkfb" Mar 11 13:15:13 crc kubenswrapper[4816]: I0311 13:15:13.604921 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4xkfb" Mar 11 13:15:13 crc kubenswrapper[4816]: I0311 13:15:13.664443 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4xkfb"] Mar 11 13:15:15 crc kubenswrapper[4816]: I0311 13:15:15.549446 4816 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-marketplace/certified-operators-4xkfb" podUID="2d9acb13-e6a9-4833-8cfe-3801fd85e2a5" containerName="registry-server" containerID="cri-o://dfb7c1256be8ddf989f914bdb5615fd3a330627f3cfe8c0184c8f30ec02d72cf" gracePeriod=2 Mar 11 13:15:16 crc kubenswrapper[4816]: I0311 13:15:16.455162 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4xkfb" Mar 11 13:15:16 crc kubenswrapper[4816]: I0311 13:15:16.557823 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d9acb13-e6a9-4833-8cfe-3801fd85e2a5-catalog-content\") pod \"2d9acb13-e6a9-4833-8cfe-3801fd85e2a5\" (UID: \"2d9acb13-e6a9-4833-8cfe-3801fd85e2a5\") " Mar 11 13:15:16 crc kubenswrapper[4816]: I0311 13:15:16.557898 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d9acb13-e6a9-4833-8cfe-3801fd85e2a5-utilities\") pod \"2d9acb13-e6a9-4833-8cfe-3801fd85e2a5\" (UID: \"2d9acb13-e6a9-4833-8cfe-3801fd85e2a5\") " Mar 11 13:15:16 crc kubenswrapper[4816]: I0311 13:15:16.557964 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w48bm\" (UniqueName: \"kubernetes.io/projected/2d9acb13-e6a9-4833-8cfe-3801fd85e2a5-kube-api-access-w48bm\") pod \"2d9acb13-e6a9-4833-8cfe-3801fd85e2a5\" (UID: \"2d9acb13-e6a9-4833-8cfe-3801fd85e2a5\") " Mar 11 13:15:16 crc kubenswrapper[4816]: I0311 13:15:16.560117 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d9acb13-e6a9-4833-8cfe-3801fd85e2a5-utilities" (OuterVolumeSpecName: "utilities") pod "2d9acb13-e6a9-4833-8cfe-3801fd85e2a5" (UID: "2d9acb13-e6a9-4833-8cfe-3801fd85e2a5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 13:15:16 crc kubenswrapper[4816]: I0311 13:15:16.561961 4816 generic.go:334] "Generic (PLEG): container finished" podID="2d9acb13-e6a9-4833-8cfe-3801fd85e2a5" containerID="dfb7c1256be8ddf989f914bdb5615fd3a330627f3cfe8c0184c8f30ec02d72cf" exitCode=0 Mar 11 13:15:16 crc kubenswrapper[4816]: I0311 13:15:16.562025 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4xkfb" event={"ID":"2d9acb13-e6a9-4833-8cfe-3801fd85e2a5","Type":"ContainerDied","Data":"dfb7c1256be8ddf989f914bdb5615fd3a330627f3cfe8c0184c8f30ec02d72cf"} Mar 11 13:15:16 crc kubenswrapper[4816]: I0311 13:15:16.562067 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4xkfb" event={"ID":"2d9acb13-e6a9-4833-8cfe-3801fd85e2a5","Type":"ContainerDied","Data":"870dbd348776e9d3256ca77f7c640c08bd1a93d1453b571cc8e66d309b98de54"} Mar 11 13:15:16 crc kubenswrapper[4816]: I0311 13:15:16.562090 4816 scope.go:117] "RemoveContainer" containerID="dfb7c1256be8ddf989f914bdb5615fd3a330627f3cfe8c0184c8f30ec02d72cf" Mar 11 13:15:16 crc kubenswrapper[4816]: I0311 13:15:16.562216 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4xkfb" Mar 11 13:15:16 crc kubenswrapper[4816]: I0311 13:15:16.567731 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d9acb13-e6a9-4833-8cfe-3801fd85e2a5-kube-api-access-w48bm" (OuterVolumeSpecName: "kube-api-access-w48bm") pod "2d9acb13-e6a9-4833-8cfe-3801fd85e2a5" (UID: "2d9acb13-e6a9-4833-8cfe-3801fd85e2a5"). InnerVolumeSpecName "kube-api-access-w48bm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 13:15:16 crc kubenswrapper[4816]: I0311 13:15:16.619155 4816 scope.go:117] "RemoveContainer" containerID="cc0f26349c021ed4fc19985f60273ecddcabfeaafbafbeb643a67d00711326f9" Mar 11 13:15:16 crc kubenswrapper[4816]: I0311 13:15:16.629937 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d9acb13-e6a9-4833-8cfe-3801fd85e2a5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2d9acb13-e6a9-4833-8cfe-3801fd85e2a5" (UID: "2d9acb13-e6a9-4833-8cfe-3801fd85e2a5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 13:15:16 crc kubenswrapper[4816]: I0311 13:15:16.638973 4816 scope.go:117] "RemoveContainer" containerID="18faad00734422c09e735a824453d3a0eff856a3fb7f98a4cee7c2b6dfd9470b" Mar 11 13:15:16 crc kubenswrapper[4816]: I0311 13:15:16.660015 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w48bm\" (UniqueName: \"kubernetes.io/projected/2d9acb13-e6a9-4833-8cfe-3801fd85e2a5-kube-api-access-w48bm\") on node \"crc\" DevicePath \"\"" Mar 11 13:15:16 crc kubenswrapper[4816]: I0311 13:15:16.660051 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d9acb13-e6a9-4833-8cfe-3801fd85e2a5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 13:15:16 crc kubenswrapper[4816]: I0311 13:15:16.660061 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d9acb13-e6a9-4833-8cfe-3801fd85e2a5-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 13:15:16 crc kubenswrapper[4816]: I0311 13:15:16.679761 4816 scope.go:117] "RemoveContainer" containerID="dfb7c1256be8ddf989f914bdb5615fd3a330627f3cfe8c0184c8f30ec02d72cf" Mar 11 13:15:16 crc kubenswrapper[4816]: E0311 13:15:16.680716 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"dfb7c1256be8ddf989f914bdb5615fd3a330627f3cfe8c0184c8f30ec02d72cf\": container with ID starting with dfb7c1256be8ddf989f914bdb5615fd3a330627f3cfe8c0184c8f30ec02d72cf not found: ID does not exist" containerID="dfb7c1256be8ddf989f914bdb5615fd3a330627f3cfe8c0184c8f30ec02d72cf" Mar 11 13:15:16 crc kubenswrapper[4816]: I0311 13:15:16.680816 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfb7c1256be8ddf989f914bdb5615fd3a330627f3cfe8c0184c8f30ec02d72cf"} err="failed to get container status \"dfb7c1256be8ddf989f914bdb5615fd3a330627f3cfe8c0184c8f30ec02d72cf\": rpc error: code = NotFound desc = could not find container \"dfb7c1256be8ddf989f914bdb5615fd3a330627f3cfe8c0184c8f30ec02d72cf\": container with ID starting with dfb7c1256be8ddf989f914bdb5615fd3a330627f3cfe8c0184c8f30ec02d72cf not found: ID does not exist" Mar 11 13:15:16 crc kubenswrapper[4816]: I0311 13:15:16.680874 4816 scope.go:117] "RemoveContainer" containerID="cc0f26349c021ed4fc19985f60273ecddcabfeaafbafbeb643a67d00711326f9" Mar 11 13:15:16 crc kubenswrapper[4816]: E0311 13:15:16.681351 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc0f26349c021ed4fc19985f60273ecddcabfeaafbafbeb643a67d00711326f9\": container with ID starting with cc0f26349c021ed4fc19985f60273ecddcabfeaafbafbeb643a67d00711326f9 not found: ID does not exist" containerID="cc0f26349c021ed4fc19985f60273ecddcabfeaafbafbeb643a67d00711326f9" Mar 11 13:15:16 crc kubenswrapper[4816]: I0311 13:15:16.681605 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc0f26349c021ed4fc19985f60273ecddcabfeaafbafbeb643a67d00711326f9"} err="failed to get container status \"cc0f26349c021ed4fc19985f60273ecddcabfeaafbafbeb643a67d00711326f9\": rpc error: code = NotFound desc = could not find container 
\"cc0f26349c021ed4fc19985f60273ecddcabfeaafbafbeb643a67d00711326f9\": container with ID starting with cc0f26349c021ed4fc19985f60273ecddcabfeaafbafbeb643a67d00711326f9 not found: ID does not exist" Mar 11 13:15:16 crc kubenswrapper[4816]: I0311 13:15:16.681660 4816 scope.go:117] "RemoveContainer" containerID="18faad00734422c09e735a824453d3a0eff856a3fb7f98a4cee7c2b6dfd9470b" Mar 11 13:15:16 crc kubenswrapper[4816]: E0311 13:15:16.682101 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18faad00734422c09e735a824453d3a0eff856a3fb7f98a4cee7c2b6dfd9470b\": container with ID starting with 18faad00734422c09e735a824453d3a0eff856a3fb7f98a4cee7c2b6dfd9470b not found: ID does not exist" containerID="18faad00734422c09e735a824453d3a0eff856a3fb7f98a4cee7c2b6dfd9470b" Mar 11 13:15:16 crc kubenswrapper[4816]: I0311 13:15:16.682172 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18faad00734422c09e735a824453d3a0eff856a3fb7f98a4cee7c2b6dfd9470b"} err="failed to get container status \"18faad00734422c09e735a824453d3a0eff856a3fb7f98a4cee7c2b6dfd9470b\": rpc error: code = NotFound desc = could not find container \"18faad00734422c09e735a824453d3a0eff856a3fb7f98a4cee7c2b6dfd9470b\": container with ID starting with 18faad00734422c09e735a824453d3a0eff856a3fb7f98a4cee7c2b6dfd9470b not found: ID does not exist" Mar 11 13:15:16 crc kubenswrapper[4816]: I0311 13:15:16.896500 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4xkfb"] Mar 11 13:15:16 crc kubenswrapper[4816]: I0311 13:15:16.905110 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4xkfb"] Mar 11 13:15:18 crc kubenswrapper[4816]: I0311 13:15:18.146447 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d9acb13-e6a9-4833-8cfe-3801fd85e2a5" 
path="/var/lib/kubelet/pods/2d9acb13-e6a9-4833-8cfe-3801fd85e2a5/volumes" Mar 11 13:15:18 crc kubenswrapper[4816]: I0311 13:15:18.218173 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tbw5k"] Mar 11 13:15:18 crc kubenswrapper[4816]: E0311 13:15:18.219191 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d9acb13-e6a9-4833-8cfe-3801fd85e2a5" containerName="registry-server" Mar 11 13:15:18 crc kubenswrapper[4816]: I0311 13:15:18.219224 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d9acb13-e6a9-4833-8cfe-3801fd85e2a5" containerName="registry-server" Mar 11 13:15:18 crc kubenswrapper[4816]: E0311 13:15:18.219300 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d9acb13-e6a9-4833-8cfe-3801fd85e2a5" containerName="extract-content" Mar 11 13:15:18 crc kubenswrapper[4816]: I0311 13:15:18.219314 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d9acb13-e6a9-4833-8cfe-3801fd85e2a5" containerName="extract-content" Mar 11 13:15:18 crc kubenswrapper[4816]: E0311 13:15:18.219332 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d9acb13-e6a9-4833-8cfe-3801fd85e2a5" containerName="extract-utilities" Mar 11 13:15:18 crc kubenswrapper[4816]: I0311 13:15:18.219346 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d9acb13-e6a9-4833-8cfe-3801fd85e2a5" containerName="extract-utilities" Mar 11 13:15:18 crc kubenswrapper[4816]: E0311 13:15:18.219372 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bfbb073-59ae-4e0f-9b46-4f27865d35dd" containerName="collect-profiles" Mar 11 13:15:18 crc kubenswrapper[4816]: I0311 13:15:18.219384 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bfbb073-59ae-4e0f-9b46-4f27865d35dd" containerName="collect-profiles" Mar 11 13:15:18 crc kubenswrapper[4816]: I0311 13:15:18.219700 4816 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="2d9acb13-e6a9-4833-8cfe-3801fd85e2a5" containerName="registry-server" Mar 11 13:15:18 crc kubenswrapper[4816]: I0311 13:15:18.219742 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bfbb073-59ae-4e0f-9b46-4f27865d35dd" containerName="collect-profiles" Mar 11 13:15:18 crc kubenswrapper[4816]: I0311 13:15:18.225885 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tbw5k" Mar 11 13:15:18 crc kubenswrapper[4816]: I0311 13:15:18.243344 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tbw5k"] Mar 11 13:15:18 crc kubenswrapper[4816]: I0311 13:15:18.288727 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f414bfb-3cb5-4b0c-a92b-7333284def08-catalog-content\") pod \"redhat-operators-tbw5k\" (UID: \"1f414bfb-3cb5-4b0c-a92b-7333284def08\") " pod="openshift-marketplace/redhat-operators-tbw5k" Mar 11 13:15:18 crc kubenswrapper[4816]: I0311 13:15:18.288871 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f414bfb-3cb5-4b0c-a92b-7333284def08-utilities\") pod \"redhat-operators-tbw5k\" (UID: \"1f414bfb-3cb5-4b0c-a92b-7333284def08\") " pod="openshift-marketplace/redhat-operators-tbw5k" Mar 11 13:15:18 crc kubenswrapper[4816]: I0311 13:15:18.289012 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnmg6\" (UniqueName: \"kubernetes.io/projected/1f414bfb-3cb5-4b0c-a92b-7333284def08-kube-api-access-tnmg6\") pod \"redhat-operators-tbw5k\" (UID: \"1f414bfb-3cb5-4b0c-a92b-7333284def08\") " pod="openshift-marketplace/redhat-operators-tbw5k" Mar 11 13:15:18 crc kubenswrapper[4816]: I0311 13:15:18.390977 4816 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f414bfb-3cb5-4b0c-a92b-7333284def08-catalog-content\") pod \"redhat-operators-tbw5k\" (UID: \"1f414bfb-3cb5-4b0c-a92b-7333284def08\") " pod="openshift-marketplace/redhat-operators-tbw5k" Mar 11 13:15:18 crc kubenswrapper[4816]: I0311 13:15:18.391072 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f414bfb-3cb5-4b0c-a92b-7333284def08-utilities\") pod \"redhat-operators-tbw5k\" (UID: \"1f414bfb-3cb5-4b0c-a92b-7333284def08\") " pod="openshift-marketplace/redhat-operators-tbw5k" Mar 11 13:15:18 crc kubenswrapper[4816]: I0311 13:15:18.391177 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnmg6\" (UniqueName: \"kubernetes.io/projected/1f414bfb-3cb5-4b0c-a92b-7333284def08-kube-api-access-tnmg6\") pod \"redhat-operators-tbw5k\" (UID: \"1f414bfb-3cb5-4b0c-a92b-7333284def08\") " pod="openshift-marketplace/redhat-operators-tbw5k" Mar 11 13:15:18 crc kubenswrapper[4816]: I0311 13:15:18.391596 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f414bfb-3cb5-4b0c-a92b-7333284def08-catalog-content\") pod \"redhat-operators-tbw5k\" (UID: \"1f414bfb-3cb5-4b0c-a92b-7333284def08\") " pod="openshift-marketplace/redhat-operators-tbw5k" Mar 11 13:15:18 crc kubenswrapper[4816]: I0311 13:15:18.391841 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f414bfb-3cb5-4b0c-a92b-7333284def08-utilities\") pod \"redhat-operators-tbw5k\" (UID: \"1f414bfb-3cb5-4b0c-a92b-7333284def08\") " pod="openshift-marketplace/redhat-operators-tbw5k" Mar 11 13:15:18 crc kubenswrapper[4816]: I0311 13:15:18.416806 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnmg6\" (UniqueName: 
\"kubernetes.io/projected/1f414bfb-3cb5-4b0c-a92b-7333284def08-kube-api-access-tnmg6\") pod \"redhat-operators-tbw5k\" (UID: \"1f414bfb-3cb5-4b0c-a92b-7333284def08\") " pod="openshift-marketplace/redhat-operators-tbw5k" Mar 11 13:15:18 crc kubenswrapper[4816]: I0311 13:15:18.563967 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tbw5k" Mar 11 13:15:19 crc kubenswrapper[4816]: I0311 13:15:19.087126 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tbw5k"] Mar 11 13:15:19 crc kubenswrapper[4816]: I0311 13:15:19.589300 4816 generic.go:334] "Generic (PLEG): container finished" podID="1f414bfb-3cb5-4b0c-a92b-7333284def08" containerID="38be92a25193d2df70640917dba9f5574c25884773c482af1df25f1c9a24d3bb" exitCode=0 Mar 11 13:15:19 crc kubenswrapper[4816]: I0311 13:15:19.589366 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tbw5k" event={"ID":"1f414bfb-3cb5-4b0c-a92b-7333284def08","Type":"ContainerDied","Data":"38be92a25193d2df70640917dba9f5574c25884773c482af1df25f1c9a24d3bb"} Mar 11 13:15:19 crc kubenswrapper[4816]: I0311 13:15:19.589403 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tbw5k" event={"ID":"1f414bfb-3cb5-4b0c-a92b-7333284def08","Type":"ContainerStarted","Data":"464df353d128096eb123c7f056ec9513c5bfab8441350c87f677c6c358b07739"} Mar 11 13:15:21 crc kubenswrapper[4816]: I0311 13:15:21.130635 4816 scope.go:117] "RemoveContainer" containerID="feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e" Mar 11 13:15:21 crc kubenswrapper[4816]: E0311 13:15:21.131272 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:15:21 crc kubenswrapper[4816]: I0311 13:15:21.613538 4816 generic.go:334] "Generic (PLEG): container finished" podID="1f414bfb-3cb5-4b0c-a92b-7333284def08" containerID="18d80acf5fce19393a180d78cdfbb470440af3691ce9d9fcda49c882505f2e7a" exitCode=0 Mar 11 13:15:21 crc kubenswrapper[4816]: I0311 13:15:21.613607 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tbw5k" event={"ID":"1f414bfb-3cb5-4b0c-a92b-7333284def08","Type":"ContainerDied","Data":"18d80acf5fce19393a180d78cdfbb470440af3691ce9d9fcda49c882505f2e7a"} Mar 11 13:15:22 crc kubenswrapper[4816]: I0311 13:15:22.626027 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tbw5k" event={"ID":"1f414bfb-3cb5-4b0c-a92b-7333284def08","Type":"ContainerStarted","Data":"57410d09ea2552fbcb89c29f2789bb81c1c1b32ab7e7ea7ea9fa27e066764750"} Mar 11 13:15:22 crc kubenswrapper[4816]: I0311 13:15:22.668621 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tbw5k" podStartSLOduration=2.160828175 podStartE2EDuration="4.668577746s" podCreationTimestamp="2026-03-11 13:15:18 +0000 UTC" firstStartedPulling="2026-03-11 13:15:19.591204768 +0000 UTC m=+4606.182468735" lastFinishedPulling="2026-03-11 13:15:22.098954329 +0000 UTC m=+4608.690218306" observedRunningTime="2026-03-11 13:15:22.655701948 +0000 UTC m=+4609.246965935" watchObservedRunningTime="2026-03-11 13:15:22.668577746 +0000 UTC m=+4609.259841763" Mar 11 13:15:28 crc kubenswrapper[4816]: I0311 13:15:28.564771 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tbw5k" Mar 11 13:15:28 crc kubenswrapper[4816]: I0311 13:15:28.566289 4816 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tbw5k" Mar 11 13:15:29 crc kubenswrapper[4816]: I0311 13:15:29.630153 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tbw5k" podUID="1f414bfb-3cb5-4b0c-a92b-7333284def08" containerName="registry-server" probeResult="failure" output=< Mar 11 13:15:29 crc kubenswrapper[4816]: timeout: failed to connect service ":50051" within 1s Mar 11 13:15:29 crc kubenswrapper[4816]: > Mar 11 13:15:36 crc kubenswrapper[4816]: I0311 13:15:36.131302 4816 scope.go:117] "RemoveContainer" containerID="feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e" Mar 11 13:15:36 crc kubenswrapper[4816]: E0311 13:15:36.132131 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:15:38 crc kubenswrapper[4816]: I0311 13:15:38.635703 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tbw5k" Mar 11 13:15:38 crc kubenswrapper[4816]: I0311 13:15:38.702880 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tbw5k" Mar 11 13:15:38 crc kubenswrapper[4816]: I0311 13:15:38.887237 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tbw5k"] Mar 11 13:15:39 crc kubenswrapper[4816]: I0311 13:15:39.774818 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tbw5k" podUID="1f414bfb-3cb5-4b0c-a92b-7333284def08" 
containerName="registry-server" containerID="cri-o://57410d09ea2552fbcb89c29f2789bb81c1c1b32ab7e7ea7ea9fa27e066764750" gracePeriod=2 Mar 11 13:15:40 crc kubenswrapper[4816]: I0311 13:15:40.208011 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tbw5k" Mar 11 13:15:40 crc kubenswrapper[4816]: I0311 13:15:40.357430 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f414bfb-3cb5-4b0c-a92b-7333284def08-catalog-content\") pod \"1f414bfb-3cb5-4b0c-a92b-7333284def08\" (UID: \"1f414bfb-3cb5-4b0c-a92b-7333284def08\") " Mar 11 13:15:40 crc kubenswrapper[4816]: I0311 13:15:40.357493 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnmg6\" (UniqueName: \"kubernetes.io/projected/1f414bfb-3cb5-4b0c-a92b-7333284def08-kube-api-access-tnmg6\") pod \"1f414bfb-3cb5-4b0c-a92b-7333284def08\" (UID: \"1f414bfb-3cb5-4b0c-a92b-7333284def08\") " Mar 11 13:15:40 crc kubenswrapper[4816]: I0311 13:15:40.357628 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f414bfb-3cb5-4b0c-a92b-7333284def08-utilities\") pod \"1f414bfb-3cb5-4b0c-a92b-7333284def08\" (UID: \"1f414bfb-3cb5-4b0c-a92b-7333284def08\") " Mar 11 13:15:40 crc kubenswrapper[4816]: I0311 13:15:40.358543 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f414bfb-3cb5-4b0c-a92b-7333284def08-utilities" (OuterVolumeSpecName: "utilities") pod "1f414bfb-3cb5-4b0c-a92b-7333284def08" (UID: "1f414bfb-3cb5-4b0c-a92b-7333284def08"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 13:15:40 crc kubenswrapper[4816]: I0311 13:15:40.369677 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f414bfb-3cb5-4b0c-a92b-7333284def08-kube-api-access-tnmg6" (OuterVolumeSpecName: "kube-api-access-tnmg6") pod "1f414bfb-3cb5-4b0c-a92b-7333284def08" (UID: "1f414bfb-3cb5-4b0c-a92b-7333284def08"). InnerVolumeSpecName "kube-api-access-tnmg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 13:15:40 crc kubenswrapper[4816]: I0311 13:15:40.459800 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnmg6\" (UniqueName: \"kubernetes.io/projected/1f414bfb-3cb5-4b0c-a92b-7333284def08-kube-api-access-tnmg6\") on node \"crc\" DevicePath \"\"" Mar 11 13:15:40 crc kubenswrapper[4816]: I0311 13:15:40.459834 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f414bfb-3cb5-4b0c-a92b-7333284def08-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 13:15:40 crc kubenswrapper[4816]: I0311 13:15:40.504997 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f414bfb-3cb5-4b0c-a92b-7333284def08-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1f414bfb-3cb5-4b0c-a92b-7333284def08" (UID: "1f414bfb-3cb5-4b0c-a92b-7333284def08"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 13:15:40 crc kubenswrapper[4816]: I0311 13:15:40.561312 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f414bfb-3cb5-4b0c-a92b-7333284def08-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 13:15:40 crc kubenswrapper[4816]: I0311 13:15:40.786292 4816 generic.go:334] "Generic (PLEG): container finished" podID="1f414bfb-3cb5-4b0c-a92b-7333284def08" containerID="57410d09ea2552fbcb89c29f2789bb81c1c1b32ab7e7ea7ea9fa27e066764750" exitCode=0 Mar 11 13:15:40 crc kubenswrapper[4816]: I0311 13:15:40.786361 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tbw5k" event={"ID":"1f414bfb-3cb5-4b0c-a92b-7333284def08","Type":"ContainerDied","Data":"57410d09ea2552fbcb89c29f2789bb81c1c1b32ab7e7ea7ea9fa27e066764750"} Mar 11 13:15:40 crc kubenswrapper[4816]: I0311 13:15:40.786404 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tbw5k" event={"ID":"1f414bfb-3cb5-4b0c-a92b-7333284def08","Type":"ContainerDied","Data":"464df353d128096eb123c7f056ec9513c5bfab8441350c87f677c6c358b07739"} Mar 11 13:15:40 crc kubenswrapper[4816]: I0311 13:15:40.786432 4816 scope.go:117] "RemoveContainer" containerID="57410d09ea2552fbcb89c29f2789bb81c1c1b32ab7e7ea7ea9fa27e066764750" Mar 11 13:15:40 crc kubenswrapper[4816]: I0311 13:15:40.786650 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tbw5k" Mar 11 13:15:40 crc kubenswrapper[4816]: I0311 13:15:40.815410 4816 scope.go:117] "RemoveContainer" containerID="18d80acf5fce19393a180d78cdfbb470440af3691ce9d9fcda49c882505f2e7a" Mar 11 13:15:40 crc kubenswrapper[4816]: I0311 13:15:40.838752 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tbw5k"] Mar 11 13:15:40 crc kubenswrapper[4816]: I0311 13:15:40.849512 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tbw5k"] Mar 11 13:15:40 crc kubenswrapper[4816]: I0311 13:15:40.867442 4816 scope.go:117] "RemoveContainer" containerID="38be92a25193d2df70640917dba9f5574c25884773c482af1df25f1c9a24d3bb" Mar 11 13:15:40 crc kubenswrapper[4816]: I0311 13:15:40.908415 4816 scope.go:117] "RemoveContainer" containerID="57410d09ea2552fbcb89c29f2789bb81c1c1b32ab7e7ea7ea9fa27e066764750" Mar 11 13:15:40 crc kubenswrapper[4816]: E0311 13:15:40.909354 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57410d09ea2552fbcb89c29f2789bb81c1c1b32ab7e7ea7ea9fa27e066764750\": container with ID starting with 57410d09ea2552fbcb89c29f2789bb81c1c1b32ab7e7ea7ea9fa27e066764750 not found: ID does not exist" containerID="57410d09ea2552fbcb89c29f2789bb81c1c1b32ab7e7ea7ea9fa27e066764750" Mar 11 13:15:40 crc kubenswrapper[4816]: I0311 13:15:40.909494 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57410d09ea2552fbcb89c29f2789bb81c1c1b32ab7e7ea7ea9fa27e066764750"} err="failed to get container status \"57410d09ea2552fbcb89c29f2789bb81c1c1b32ab7e7ea7ea9fa27e066764750\": rpc error: code = NotFound desc = could not find container \"57410d09ea2552fbcb89c29f2789bb81c1c1b32ab7e7ea7ea9fa27e066764750\": container with ID starting with 57410d09ea2552fbcb89c29f2789bb81c1c1b32ab7e7ea7ea9fa27e066764750 not found: ID does 
not exist" Mar 11 13:15:40 crc kubenswrapper[4816]: I0311 13:15:40.909539 4816 scope.go:117] "RemoveContainer" containerID="18d80acf5fce19393a180d78cdfbb470440af3691ce9d9fcda49c882505f2e7a" Mar 11 13:15:40 crc kubenswrapper[4816]: E0311 13:15:40.910107 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18d80acf5fce19393a180d78cdfbb470440af3691ce9d9fcda49c882505f2e7a\": container with ID starting with 18d80acf5fce19393a180d78cdfbb470440af3691ce9d9fcda49c882505f2e7a not found: ID does not exist" containerID="18d80acf5fce19393a180d78cdfbb470440af3691ce9d9fcda49c882505f2e7a" Mar 11 13:15:40 crc kubenswrapper[4816]: I0311 13:15:40.910240 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18d80acf5fce19393a180d78cdfbb470440af3691ce9d9fcda49c882505f2e7a"} err="failed to get container status \"18d80acf5fce19393a180d78cdfbb470440af3691ce9d9fcda49c882505f2e7a\": rpc error: code = NotFound desc = could not find container \"18d80acf5fce19393a180d78cdfbb470440af3691ce9d9fcda49c882505f2e7a\": container with ID starting with 18d80acf5fce19393a180d78cdfbb470440af3691ce9d9fcda49c882505f2e7a not found: ID does not exist" Mar 11 13:15:40 crc kubenswrapper[4816]: I0311 13:15:40.910309 4816 scope.go:117] "RemoveContainer" containerID="38be92a25193d2df70640917dba9f5574c25884773c482af1df25f1c9a24d3bb" Mar 11 13:15:40 crc kubenswrapper[4816]: E0311 13:15:40.910782 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38be92a25193d2df70640917dba9f5574c25884773c482af1df25f1c9a24d3bb\": container with ID starting with 38be92a25193d2df70640917dba9f5574c25884773c482af1df25f1c9a24d3bb not found: ID does not exist" containerID="38be92a25193d2df70640917dba9f5574c25884773c482af1df25f1c9a24d3bb" Mar 11 13:15:40 crc kubenswrapper[4816]: I0311 13:15:40.910929 4816 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38be92a25193d2df70640917dba9f5574c25884773c482af1df25f1c9a24d3bb"} err="failed to get container status \"38be92a25193d2df70640917dba9f5574c25884773c482af1df25f1c9a24d3bb\": rpc error: code = NotFound desc = could not find container \"38be92a25193d2df70640917dba9f5574c25884773c482af1df25f1c9a24d3bb\": container with ID starting with 38be92a25193d2df70640917dba9f5574c25884773c482af1df25f1c9a24d3bb not found: ID does not exist" Mar 11 13:15:42 crc kubenswrapper[4816]: I0311 13:15:42.141269 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f414bfb-3cb5-4b0c-a92b-7333284def08" path="/var/lib/kubelet/pods/1f414bfb-3cb5-4b0c-a92b-7333284def08/volumes" Mar 11 13:15:50 crc kubenswrapper[4816]: I0311 13:15:50.124550 4816 scope.go:117] "RemoveContainer" containerID="df1d35d17e400d5b7e626c6af7307f8e5a96cbf6b1e197b1b3bcbb3209f59864" Mar 11 13:15:51 crc kubenswrapper[4816]: I0311 13:15:51.130776 4816 scope.go:117] "RemoveContainer" containerID="feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e" Mar 11 13:15:51 crc kubenswrapper[4816]: E0311 13:15:51.131337 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:16:00 crc kubenswrapper[4816]: I0311 13:16:00.169780 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553916-n8r8p"] Mar 11 13:16:00 crc kubenswrapper[4816]: E0311 13:16:00.171085 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f414bfb-3cb5-4b0c-a92b-7333284def08" containerName="extract-content" Mar 11 13:16:00 crc 
kubenswrapper[4816]: I0311 13:16:00.171108 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f414bfb-3cb5-4b0c-a92b-7333284def08" containerName="extract-content" Mar 11 13:16:00 crc kubenswrapper[4816]: E0311 13:16:00.171128 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f414bfb-3cb5-4b0c-a92b-7333284def08" containerName="registry-server" Mar 11 13:16:00 crc kubenswrapper[4816]: I0311 13:16:00.171141 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f414bfb-3cb5-4b0c-a92b-7333284def08" containerName="registry-server" Mar 11 13:16:00 crc kubenswrapper[4816]: E0311 13:16:00.171179 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f414bfb-3cb5-4b0c-a92b-7333284def08" containerName="extract-utilities" Mar 11 13:16:00 crc kubenswrapper[4816]: I0311 13:16:00.171192 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f414bfb-3cb5-4b0c-a92b-7333284def08" containerName="extract-utilities" Mar 11 13:16:00 crc kubenswrapper[4816]: I0311 13:16:00.171456 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f414bfb-3cb5-4b0c-a92b-7333284def08" containerName="registry-server" Mar 11 13:16:00 crc kubenswrapper[4816]: I0311 13:16:00.172236 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553916-n8r8p" Mar 11 13:16:00 crc kubenswrapper[4816]: I0311 13:16:00.181142 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 13:16:00 crc kubenswrapper[4816]: I0311 13:16:00.181301 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 13:16:00 crc kubenswrapper[4816]: I0311 13:16:00.181706 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 13:16:00 crc kubenswrapper[4816]: I0311 13:16:00.189859 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk2ml\" (UniqueName: \"kubernetes.io/projected/ae7a49f0-ca01-4ad5-a353-5ac125523d95-kube-api-access-jk2ml\") pod \"auto-csr-approver-29553916-n8r8p\" (UID: \"ae7a49f0-ca01-4ad5-a353-5ac125523d95\") " pod="openshift-infra/auto-csr-approver-29553916-n8r8p" Mar 11 13:16:00 crc kubenswrapper[4816]: I0311 13:16:00.196047 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553916-n8r8p"] Mar 11 13:16:00 crc kubenswrapper[4816]: I0311 13:16:00.294045 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jk2ml\" (UniqueName: \"kubernetes.io/projected/ae7a49f0-ca01-4ad5-a353-5ac125523d95-kube-api-access-jk2ml\") pod \"auto-csr-approver-29553916-n8r8p\" (UID: \"ae7a49f0-ca01-4ad5-a353-5ac125523d95\") " pod="openshift-infra/auto-csr-approver-29553916-n8r8p" Mar 11 13:16:00 crc kubenswrapper[4816]: I0311 13:16:00.336301 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk2ml\" (UniqueName: \"kubernetes.io/projected/ae7a49f0-ca01-4ad5-a353-5ac125523d95-kube-api-access-jk2ml\") pod \"auto-csr-approver-29553916-n8r8p\" (UID: \"ae7a49f0-ca01-4ad5-a353-5ac125523d95\") " 
pod="openshift-infra/auto-csr-approver-29553916-n8r8p" Mar 11 13:16:00 crc kubenswrapper[4816]: I0311 13:16:00.528668 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553916-n8r8p" Mar 11 13:16:01 crc kubenswrapper[4816]: I0311 13:16:01.002714 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553916-n8r8p"] Mar 11 13:16:01 crc kubenswrapper[4816]: I0311 13:16:01.975557 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553916-n8r8p" event={"ID":"ae7a49f0-ca01-4ad5-a353-5ac125523d95","Type":"ContainerStarted","Data":"7edcf5e255ce59b048a2730a5399b98ab9872184268a4d92c2000b54fbb80e71"} Mar 11 13:16:02 crc kubenswrapper[4816]: I0311 13:16:02.988753 4816 generic.go:334] "Generic (PLEG): container finished" podID="ae7a49f0-ca01-4ad5-a353-5ac125523d95" containerID="e4b94bbef2f14a1e765d933fe579ccf92b49db99e68b93650802fa89e27f09ad" exitCode=0 Mar 11 13:16:02 crc kubenswrapper[4816]: I0311 13:16:02.988910 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553916-n8r8p" event={"ID":"ae7a49f0-ca01-4ad5-a353-5ac125523d95","Type":"ContainerDied","Data":"e4b94bbef2f14a1e765d933fe579ccf92b49db99e68b93650802fa89e27f09ad"} Mar 11 13:16:03 crc kubenswrapper[4816]: I0311 13:16:03.131076 4816 scope.go:117] "RemoveContainer" containerID="feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e" Mar 11 13:16:03 crc kubenswrapper[4816]: E0311 13:16:03.131381 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" 
Mar 11 13:16:04 crc kubenswrapper[4816]: I0311 13:16:04.310993 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553916-n8r8p" Mar 11 13:16:04 crc kubenswrapper[4816]: I0311 13:16:04.363145 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jk2ml\" (UniqueName: \"kubernetes.io/projected/ae7a49f0-ca01-4ad5-a353-5ac125523d95-kube-api-access-jk2ml\") pod \"ae7a49f0-ca01-4ad5-a353-5ac125523d95\" (UID: \"ae7a49f0-ca01-4ad5-a353-5ac125523d95\") " Mar 11 13:16:04 crc kubenswrapper[4816]: I0311 13:16:04.372258 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae7a49f0-ca01-4ad5-a353-5ac125523d95-kube-api-access-jk2ml" (OuterVolumeSpecName: "kube-api-access-jk2ml") pod "ae7a49f0-ca01-4ad5-a353-5ac125523d95" (UID: "ae7a49f0-ca01-4ad5-a353-5ac125523d95"). InnerVolumeSpecName "kube-api-access-jk2ml". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 13:16:04 crc kubenswrapper[4816]: I0311 13:16:04.464406 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jk2ml\" (UniqueName: \"kubernetes.io/projected/ae7a49f0-ca01-4ad5-a353-5ac125523d95-kube-api-access-jk2ml\") on node \"crc\" DevicePath \"\"" Mar 11 13:16:05 crc kubenswrapper[4816]: I0311 13:16:05.007691 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553916-n8r8p" event={"ID":"ae7a49f0-ca01-4ad5-a353-5ac125523d95","Type":"ContainerDied","Data":"7edcf5e255ce59b048a2730a5399b98ab9872184268a4d92c2000b54fbb80e71"} Mar 11 13:16:05 crc kubenswrapper[4816]: I0311 13:16:05.007732 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7edcf5e255ce59b048a2730a5399b98ab9872184268a4d92c2000b54fbb80e71" Mar 11 13:16:05 crc kubenswrapper[4816]: I0311 13:16:05.007764 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553916-n8r8p" Mar 11 13:16:05 crc kubenswrapper[4816]: I0311 13:16:05.406401 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553910-xrjfk"] Mar 11 13:16:05 crc kubenswrapper[4816]: I0311 13:16:05.416696 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553910-xrjfk"] Mar 11 13:16:06 crc kubenswrapper[4816]: I0311 13:16:06.148769 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ee2e218-36ee-47c0-9bca-f2f6affd5b02" path="/var/lib/kubelet/pods/4ee2e218-36ee-47c0-9bca-f2f6affd5b02/volumes" Mar 11 13:16:18 crc kubenswrapper[4816]: I0311 13:16:18.131397 4816 scope.go:117] "RemoveContainer" containerID="feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e" Mar 11 13:16:18 crc kubenswrapper[4816]: E0311 13:16:18.133019 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:16:31 crc kubenswrapper[4816]: I0311 13:16:31.130152 4816 scope.go:117] "RemoveContainer" containerID="feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e" Mar 11 13:16:31 crc kubenswrapper[4816]: E0311 13:16:31.131295 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" 
podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:16:43 crc kubenswrapper[4816]: I0311 13:16:43.130634 4816 scope.go:117] "RemoveContainer" containerID="feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e" Mar 11 13:16:44 crc kubenswrapper[4816]: I0311 13:16:44.405644 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerStarted","Data":"ddd6136328dc7ec62752abe3735d43f3f986aeada7e2653f4b4a88d5e086c6c4"} Mar 11 13:16:50 crc kubenswrapper[4816]: I0311 13:16:50.230966 4816 scope.go:117] "RemoveContainer" containerID="13136e90ba59855de085b0d87fba900a964c210d6db5608d7bd773e44d7b1505" Mar 11 13:18:00 crc kubenswrapper[4816]: I0311 13:18:00.173979 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553918-7rzs7"] Mar 11 13:18:00 crc kubenswrapper[4816]: E0311 13:18:00.175427 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae7a49f0-ca01-4ad5-a353-5ac125523d95" containerName="oc" Mar 11 13:18:00 crc kubenswrapper[4816]: I0311 13:18:00.175463 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae7a49f0-ca01-4ad5-a353-5ac125523d95" containerName="oc" Mar 11 13:18:00 crc kubenswrapper[4816]: I0311 13:18:00.175776 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae7a49f0-ca01-4ad5-a353-5ac125523d95" containerName="oc" Mar 11 13:18:00 crc kubenswrapper[4816]: I0311 13:18:00.176810 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553918-7rzs7" Mar 11 13:18:00 crc kubenswrapper[4816]: I0311 13:18:00.179709 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 13:18:00 crc kubenswrapper[4816]: I0311 13:18:00.180864 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 13:18:00 crc kubenswrapper[4816]: I0311 13:18:00.181520 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 13:18:00 crc kubenswrapper[4816]: I0311 13:18:00.183952 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553918-7rzs7"] Mar 11 13:18:00 crc kubenswrapper[4816]: I0311 13:18:00.302926 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7mbx\" (UniqueName: \"kubernetes.io/projected/d95b7b2b-acc3-47bd-b762-29e39ca68f93-kube-api-access-f7mbx\") pod \"auto-csr-approver-29553918-7rzs7\" (UID: \"d95b7b2b-acc3-47bd-b762-29e39ca68f93\") " pod="openshift-infra/auto-csr-approver-29553918-7rzs7" Mar 11 13:18:00 crc kubenswrapper[4816]: I0311 13:18:00.405149 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7mbx\" (UniqueName: \"kubernetes.io/projected/d95b7b2b-acc3-47bd-b762-29e39ca68f93-kube-api-access-f7mbx\") pod \"auto-csr-approver-29553918-7rzs7\" (UID: \"d95b7b2b-acc3-47bd-b762-29e39ca68f93\") " pod="openshift-infra/auto-csr-approver-29553918-7rzs7" Mar 11 13:18:00 crc kubenswrapper[4816]: I0311 13:18:00.432005 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7mbx\" (UniqueName: \"kubernetes.io/projected/d95b7b2b-acc3-47bd-b762-29e39ca68f93-kube-api-access-f7mbx\") pod \"auto-csr-approver-29553918-7rzs7\" (UID: \"d95b7b2b-acc3-47bd-b762-29e39ca68f93\") " 
pod="openshift-infra/auto-csr-approver-29553918-7rzs7" Mar 11 13:18:00 crc kubenswrapper[4816]: I0311 13:18:00.505003 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553918-7rzs7" Mar 11 13:18:01 crc kubenswrapper[4816]: I0311 13:18:01.017635 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553918-7rzs7"] Mar 11 13:18:01 crc kubenswrapper[4816]: I0311 13:18:01.175454 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553918-7rzs7" event={"ID":"d95b7b2b-acc3-47bd-b762-29e39ca68f93","Type":"ContainerStarted","Data":"4b6c1548264a5a5148c9addf189154139bb8f86400a8b334ab4e20ba6beb974e"} Mar 11 13:18:03 crc kubenswrapper[4816]: I0311 13:18:03.197302 4816 generic.go:334] "Generic (PLEG): container finished" podID="d95b7b2b-acc3-47bd-b762-29e39ca68f93" containerID="740acfe6fc04d23ba8749fd0de9541e5bd0ee02db427a2bd65a7b93925e05ec4" exitCode=0 Mar 11 13:18:03 crc kubenswrapper[4816]: I0311 13:18:03.197418 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553918-7rzs7" event={"ID":"d95b7b2b-acc3-47bd-b762-29e39ca68f93","Type":"ContainerDied","Data":"740acfe6fc04d23ba8749fd0de9541e5bd0ee02db427a2bd65a7b93925e05ec4"} Mar 11 13:18:04 crc kubenswrapper[4816]: I0311 13:18:04.578981 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553918-7rzs7" Mar 11 13:18:04 crc kubenswrapper[4816]: I0311 13:18:04.776019 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7mbx\" (UniqueName: \"kubernetes.io/projected/d95b7b2b-acc3-47bd-b762-29e39ca68f93-kube-api-access-f7mbx\") pod \"d95b7b2b-acc3-47bd-b762-29e39ca68f93\" (UID: \"d95b7b2b-acc3-47bd-b762-29e39ca68f93\") " Mar 11 13:18:04 crc kubenswrapper[4816]: I0311 13:18:04.783662 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d95b7b2b-acc3-47bd-b762-29e39ca68f93-kube-api-access-f7mbx" (OuterVolumeSpecName: "kube-api-access-f7mbx") pod "d95b7b2b-acc3-47bd-b762-29e39ca68f93" (UID: "d95b7b2b-acc3-47bd-b762-29e39ca68f93"). InnerVolumeSpecName "kube-api-access-f7mbx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 13:18:04 crc kubenswrapper[4816]: I0311 13:18:04.878294 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7mbx\" (UniqueName: \"kubernetes.io/projected/d95b7b2b-acc3-47bd-b762-29e39ca68f93-kube-api-access-f7mbx\") on node \"crc\" DevicePath \"\"" Mar 11 13:18:05 crc kubenswrapper[4816]: I0311 13:18:05.219067 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553918-7rzs7" event={"ID":"d95b7b2b-acc3-47bd-b762-29e39ca68f93","Type":"ContainerDied","Data":"4b6c1548264a5a5148c9addf189154139bb8f86400a8b334ab4e20ba6beb974e"} Mar 11 13:18:05 crc kubenswrapper[4816]: I0311 13:18:05.219134 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b6c1548264a5a5148c9addf189154139bb8f86400a8b334ab4e20ba6beb974e" Mar 11 13:18:05 crc kubenswrapper[4816]: I0311 13:18:05.219164 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553918-7rzs7" Mar 11 13:18:05 crc kubenswrapper[4816]: I0311 13:18:05.689475 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553912-dgw9l"] Mar 11 13:18:05 crc kubenswrapper[4816]: I0311 13:18:05.693556 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553912-dgw9l"] Mar 11 13:18:06 crc kubenswrapper[4816]: I0311 13:18:06.148077 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cc88fac-43d8-4178-9b36-fc5bd4b04818" path="/var/lib/kubelet/pods/0cc88fac-43d8-4178-9b36-fc5bd4b04818/volumes" Mar 11 13:18:20 crc kubenswrapper[4816]: I0311 13:18:20.588080 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2l2xd"] Mar 11 13:18:20 crc kubenswrapper[4816]: E0311 13:18:20.591672 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d95b7b2b-acc3-47bd-b762-29e39ca68f93" containerName="oc" Mar 11 13:18:20 crc kubenswrapper[4816]: I0311 13:18:20.591876 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="d95b7b2b-acc3-47bd-b762-29e39ca68f93" containerName="oc" Mar 11 13:18:20 crc kubenswrapper[4816]: I0311 13:18:20.592328 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="d95b7b2b-acc3-47bd-b762-29e39ca68f93" containerName="oc" Mar 11 13:18:20 crc kubenswrapper[4816]: I0311 13:18:20.594587 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2l2xd" Mar 11 13:18:20 crc kubenswrapper[4816]: I0311 13:18:20.607065 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2l2xd"] Mar 11 13:18:20 crc kubenswrapper[4816]: I0311 13:18:20.691684 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a13e0873-9c6c-46d7-b0bf-4ef50c40a918-catalog-content\") pod \"community-operators-2l2xd\" (UID: \"a13e0873-9c6c-46d7-b0bf-4ef50c40a918\") " pod="openshift-marketplace/community-operators-2l2xd" Mar 11 13:18:20 crc kubenswrapper[4816]: I0311 13:18:20.691746 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a13e0873-9c6c-46d7-b0bf-4ef50c40a918-utilities\") pod \"community-operators-2l2xd\" (UID: \"a13e0873-9c6c-46d7-b0bf-4ef50c40a918\") " pod="openshift-marketplace/community-operators-2l2xd" Mar 11 13:18:20 crc kubenswrapper[4816]: I0311 13:18:20.691779 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2l74\" (UniqueName: \"kubernetes.io/projected/a13e0873-9c6c-46d7-b0bf-4ef50c40a918-kube-api-access-j2l74\") pod \"community-operators-2l2xd\" (UID: \"a13e0873-9c6c-46d7-b0bf-4ef50c40a918\") " pod="openshift-marketplace/community-operators-2l2xd" Mar 11 13:18:20 crc kubenswrapper[4816]: I0311 13:18:20.792472 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a13e0873-9c6c-46d7-b0bf-4ef50c40a918-catalog-content\") pod \"community-operators-2l2xd\" (UID: \"a13e0873-9c6c-46d7-b0bf-4ef50c40a918\") " pod="openshift-marketplace/community-operators-2l2xd" Mar 11 13:18:20 crc kubenswrapper[4816]: I0311 13:18:20.792860 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a13e0873-9c6c-46d7-b0bf-4ef50c40a918-utilities\") pod \"community-operators-2l2xd\" (UID: \"a13e0873-9c6c-46d7-b0bf-4ef50c40a918\") " pod="openshift-marketplace/community-operators-2l2xd" Mar 11 13:18:20 crc kubenswrapper[4816]: I0311 13:18:20.793029 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2l74\" (UniqueName: \"kubernetes.io/projected/a13e0873-9c6c-46d7-b0bf-4ef50c40a918-kube-api-access-j2l74\") pod \"community-operators-2l2xd\" (UID: \"a13e0873-9c6c-46d7-b0bf-4ef50c40a918\") " pod="openshift-marketplace/community-operators-2l2xd" Mar 11 13:18:20 crc kubenswrapper[4816]: I0311 13:18:20.793224 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a13e0873-9c6c-46d7-b0bf-4ef50c40a918-catalog-content\") pod \"community-operators-2l2xd\" (UID: \"a13e0873-9c6c-46d7-b0bf-4ef50c40a918\") " pod="openshift-marketplace/community-operators-2l2xd" Mar 11 13:18:20 crc kubenswrapper[4816]: I0311 13:18:20.793459 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a13e0873-9c6c-46d7-b0bf-4ef50c40a918-utilities\") pod \"community-operators-2l2xd\" (UID: \"a13e0873-9c6c-46d7-b0bf-4ef50c40a918\") " pod="openshift-marketplace/community-operators-2l2xd" Mar 11 13:18:20 crc kubenswrapper[4816]: I0311 13:18:20.819686 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2l74\" (UniqueName: \"kubernetes.io/projected/a13e0873-9c6c-46d7-b0bf-4ef50c40a918-kube-api-access-j2l74\") pod \"community-operators-2l2xd\" (UID: \"a13e0873-9c6c-46d7-b0bf-4ef50c40a918\") " pod="openshift-marketplace/community-operators-2l2xd" Mar 11 13:18:20 crc kubenswrapper[4816]: I0311 13:18:20.932412 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2l2xd" Mar 11 13:18:21 crc kubenswrapper[4816]: I0311 13:18:21.478304 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2l2xd"] Mar 11 13:18:22 crc kubenswrapper[4816]: I0311 13:18:22.386443 4816 generic.go:334] "Generic (PLEG): container finished" podID="a13e0873-9c6c-46d7-b0bf-4ef50c40a918" containerID="283b0a0d57fd31208c4a0ce99b6815b3e12f2f8da3c601f5ae8cc6bcdd24a22b" exitCode=0 Mar 11 13:18:22 crc kubenswrapper[4816]: I0311 13:18:22.386511 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2l2xd" event={"ID":"a13e0873-9c6c-46d7-b0bf-4ef50c40a918","Type":"ContainerDied","Data":"283b0a0d57fd31208c4a0ce99b6815b3e12f2f8da3c601f5ae8cc6bcdd24a22b"} Mar 11 13:18:22 crc kubenswrapper[4816]: I0311 13:18:22.386759 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2l2xd" event={"ID":"a13e0873-9c6c-46d7-b0bf-4ef50c40a918","Type":"ContainerStarted","Data":"90e3952219eafce29019495f39895e1b333e7da922f783f7118925b2341c1489"} Mar 11 13:18:27 crc kubenswrapper[4816]: I0311 13:18:27.423572 4816 generic.go:334] "Generic (PLEG): container finished" podID="a13e0873-9c6c-46d7-b0bf-4ef50c40a918" containerID="5db91ea8179782dc441b36bffafd7a6d8f6d49078d5ae0b5acab10829f852071" exitCode=0 Mar 11 13:18:27 crc kubenswrapper[4816]: I0311 13:18:27.424179 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2l2xd" event={"ID":"a13e0873-9c6c-46d7-b0bf-4ef50c40a918","Type":"ContainerDied","Data":"5db91ea8179782dc441b36bffafd7a6d8f6d49078d5ae0b5acab10829f852071"} Mar 11 13:18:29 crc kubenswrapper[4816]: I0311 13:18:29.450874 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2l2xd" 
event={"ID":"a13e0873-9c6c-46d7-b0bf-4ef50c40a918","Type":"ContainerStarted","Data":"6c9b8a20bb7fcfdf0014b6b92108f77b550658cc551ad24c0d410cf73f181bed"} Mar 11 13:18:29 crc kubenswrapper[4816]: I0311 13:18:29.472224 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2l2xd" podStartSLOduration=3.071756606 podStartE2EDuration="9.472198267s" podCreationTimestamp="2026-03-11 13:18:20 +0000 UTC" firstStartedPulling="2026-03-11 13:18:22.388174401 +0000 UTC m=+4788.979438368" lastFinishedPulling="2026-03-11 13:18:28.788616062 +0000 UTC m=+4795.379880029" observedRunningTime="2026-03-11 13:18:29.466514615 +0000 UTC m=+4796.057778592" watchObservedRunningTime="2026-03-11 13:18:29.472198267 +0000 UTC m=+4796.063462234" Mar 11 13:18:30 crc kubenswrapper[4816]: I0311 13:18:30.932902 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2l2xd" Mar 11 13:18:30 crc kubenswrapper[4816]: I0311 13:18:30.933370 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2l2xd" Mar 11 13:18:32 crc kubenswrapper[4816]: I0311 13:18:32.002740 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-2l2xd" podUID="a13e0873-9c6c-46d7-b0bf-4ef50c40a918" containerName="registry-server" probeResult="failure" output=< Mar 11 13:18:32 crc kubenswrapper[4816]: timeout: failed to connect service ":50051" within 1s Mar 11 13:18:32 crc kubenswrapper[4816]: > Mar 11 13:18:41 crc kubenswrapper[4816]: I0311 13:18:41.008170 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2l2xd" Mar 11 13:18:41 crc kubenswrapper[4816]: I0311 13:18:41.076091 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2l2xd" Mar 11 13:18:41 crc 
kubenswrapper[4816]: I0311 13:18:41.161686 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2l2xd"] Mar 11 13:18:41 crc kubenswrapper[4816]: I0311 13:18:41.263024 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qb5pd"] Mar 11 13:18:41 crc kubenswrapper[4816]: I0311 13:18:41.263440 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qb5pd" podUID="963d27c0-f203-4997-aa60-ac73d2a54cc0" containerName="registry-server" containerID="cri-o://df036ebc1022629bd7df15b57ae8610b239015cc838a88645d459c84c864e336" gracePeriod=2 Mar 11 13:18:41 crc kubenswrapper[4816]: I0311 13:18:41.569544 4816 generic.go:334] "Generic (PLEG): container finished" podID="963d27c0-f203-4997-aa60-ac73d2a54cc0" containerID="df036ebc1022629bd7df15b57ae8610b239015cc838a88645d459c84c864e336" exitCode=0 Mar 11 13:18:41 crc kubenswrapper[4816]: I0311 13:18:41.569633 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qb5pd" event={"ID":"963d27c0-f203-4997-aa60-ac73d2a54cc0","Type":"ContainerDied","Data":"df036ebc1022629bd7df15b57ae8610b239015cc838a88645d459c84c864e336"} Mar 11 13:18:41 crc kubenswrapper[4816]: I0311 13:18:41.672842 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qb5pd" Mar 11 13:18:41 crc kubenswrapper[4816]: I0311 13:18:41.874110 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f44xj\" (UniqueName: \"kubernetes.io/projected/963d27c0-f203-4997-aa60-ac73d2a54cc0-kube-api-access-f44xj\") pod \"963d27c0-f203-4997-aa60-ac73d2a54cc0\" (UID: \"963d27c0-f203-4997-aa60-ac73d2a54cc0\") " Mar 11 13:18:41 crc kubenswrapper[4816]: I0311 13:18:41.874433 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/963d27c0-f203-4997-aa60-ac73d2a54cc0-utilities\") pod \"963d27c0-f203-4997-aa60-ac73d2a54cc0\" (UID: \"963d27c0-f203-4997-aa60-ac73d2a54cc0\") " Mar 11 13:18:41 crc kubenswrapper[4816]: I0311 13:18:41.874495 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/963d27c0-f203-4997-aa60-ac73d2a54cc0-catalog-content\") pod \"963d27c0-f203-4997-aa60-ac73d2a54cc0\" (UID: \"963d27c0-f203-4997-aa60-ac73d2a54cc0\") " Mar 11 13:18:41 crc kubenswrapper[4816]: I0311 13:18:41.874964 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/963d27c0-f203-4997-aa60-ac73d2a54cc0-utilities" (OuterVolumeSpecName: "utilities") pod "963d27c0-f203-4997-aa60-ac73d2a54cc0" (UID: "963d27c0-f203-4997-aa60-ac73d2a54cc0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 13:18:41 crc kubenswrapper[4816]: I0311 13:18:41.886993 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/963d27c0-f203-4997-aa60-ac73d2a54cc0-kube-api-access-f44xj" (OuterVolumeSpecName: "kube-api-access-f44xj") pod "963d27c0-f203-4997-aa60-ac73d2a54cc0" (UID: "963d27c0-f203-4997-aa60-ac73d2a54cc0"). InnerVolumeSpecName "kube-api-access-f44xj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 13:18:41 crc kubenswrapper[4816]: I0311 13:18:41.940306 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/963d27c0-f203-4997-aa60-ac73d2a54cc0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "963d27c0-f203-4997-aa60-ac73d2a54cc0" (UID: "963d27c0-f203-4997-aa60-ac73d2a54cc0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 13:18:41 crc kubenswrapper[4816]: I0311 13:18:41.975526 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/963d27c0-f203-4997-aa60-ac73d2a54cc0-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 13:18:41 crc kubenswrapper[4816]: I0311 13:18:41.975566 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/963d27c0-f203-4997-aa60-ac73d2a54cc0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 13:18:41 crc kubenswrapper[4816]: I0311 13:18:41.975578 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f44xj\" (UniqueName: \"kubernetes.io/projected/963d27c0-f203-4997-aa60-ac73d2a54cc0-kube-api-access-f44xj\") on node \"crc\" DevicePath \"\"" Mar 11 13:18:42 crc kubenswrapper[4816]: I0311 13:18:42.582534 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qb5pd" Mar 11 13:18:42 crc kubenswrapper[4816]: I0311 13:18:42.582531 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qb5pd" event={"ID":"963d27c0-f203-4997-aa60-ac73d2a54cc0","Type":"ContainerDied","Data":"5d89076f0fdd1a586d2d1d9d12f836502df9b389006d64897e25f5fabea5fa22"} Mar 11 13:18:42 crc kubenswrapper[4816]: I0311 13:18:42.582610 4816 scope.go:117] "RemoveContainer" containerID="df036ebc1022629bd7df15b57ae8610b239015cc838a88645d459c84c864e336" Mar 11 13:18:42 crc kubenswrapper[4816]: I0311 13:18:42.610730 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qb5pd"] Mar 11 13:18:42 crc kubenswrapper[4816]: I0311 13:18:42.623679 4816 scope.go:117] "RemoveContainer" containerID="8c235b1052133359e398ac00a2eee490f7a085338a2901f71eac6e872bda6cbf" Mar 11 13:18:42 crc kubenswrapper[4816]: I0311 13:18:42.628357 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qb5pd"] Mar 11 13:18:42 crc kubenswrapper[4816]: I0311 13:18:42.663738 4816 scope.go:117] "RemoveContainer" containerID="6e58f19a27ae3010beb47e8be328d7c7ee7c8f14b5f34d2213706b6f25097290" Mar 11 13:18:44 crc kubenswrapper[4816]: I0311 13:18:44.147048 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="963d27c0-f203-4997-aa60-ac73d2a54cc0" path="/var/lib/kubelet/pods/963d27c0-f203-4997-aa60-ac73d2a54cc0/volumes" Mar 11 13:18:50 crc kubenswrapper[4816]: I0311 13:18:50.367549 4816 scope.go:117] "RemoveContainer" containerID="148ded4a02efdc34a61cfc1e6b248706834d114bbcd8c2d3fc0a1082e7f112b8" Mar 11 13:19:09 crc kubenswrapper[4816]: I0311 13:19:09.515448 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 13:19:09 crc kubenswrapper[4816]: I0311 13:19:09.516111 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 13:19:23 crc kubenswrapper[4816]: I0311 13:19:23.992892 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-69hv5"] Mar 11 13:19:24 crc kubenswrapper[4816]: I0311 13:19:24.005173 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-69hv5"] Mar 11 13:19:24 crc kubenswrapper[4816]: I0311 13:19:24.143993 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7462073d-1852-4032-87bc-e0a4b973f92f" path="/var/lib/kubelet/pods/7462073d-1852-4032-87bc-e0a4b973f92f/volumes" Mar 11 13:19:24 crc kubenswrapper[4816]: I0311 13:19:24.152978 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-gjqlz"] Mar 11 13:19:24 crc kubenswrapper[4816]: E0311 13:19:24.158699 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="963d27c0-f203-4997-aa60-ac73d2a54cc0" containerName="registry-server" Mar 11 13:19:24 crc kubenswrapper[4816]: I0311 13:19:24.158727 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="963d27c0-f203-4997-aa60-ac73d2a54cc0" containerName="registry-server" Mar 11 13:19:24 crc kubenswrapper[4816]: E0311 13:19:24.158741 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="963d27c0-f203-4997-aa60-ac73d2a54cc0" containerName="extract-utilities" Mar 11 13:19:24 crc kubenswrapper[4816]: I0311 13:19:24.158749 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="963d27c0-f203-4997-aa60-ac73d2a54cc0" containerName="extract-utilities" Mar 11 13:19:24 
crc kubenswrapper[4816]: E0311 13:19:24.158763 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="963d27c0-f203-4997-aa60-ac73d2a54cc0" containerName="extract-content" Mar 11 13:19:24 crc kubenswrapper[4816]: I0311 13:19:24.158772 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="963d27c0-f203-4997-aa60-ac73d2a54cc0" containerName="extract-content" Mar 11 13:19:24 crc kubenswrapper[4816]: I0311 13:19:24.158988 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="963d27c0-f203-4997-aa60-ac73d2a54cc0" containerName="registry-server" Mar 11 13:19:24 crc kubenswrapper[4816]: I0311 13:19:24.160105 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-gjqlz"] Mar 11 13:19:24 crc kubenswrapper[4816]: I0311 13:19:24.160198 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-gjqlz" Mar 11 13:19:24 crc kubenswrapper[4816]: I0311 13:19:24.171287 4816 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-zmgc9" Mar 11 13:19:24 crc kubenswrapper[4816]: I0311 13:19:24.171375 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Mar 11 13:19:24 crc kubenswrapper[4816]: I0311 13:19:24.171432 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Mar 11 13:19:24 crc kubenswrapper[4816]: I0311 13:19:24.171434 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Mar 11 13:19:24 crc kubenswrapper[4816]: I0311 13:19:24.251841 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3-crc-storage\") pod \"crc-storage-crc-gjqlz\" (UID: \"3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3\") " pod="crc-storage/crc-storage-crc-gjqlz" Mar 11 
13:19:24 crc kubenswrapper[4816]: I0311 13:19:24.251895 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dqpb\" (UniqueName: \"kubernetes.io/projected/3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3-kube-api-access-7dqpb\") pod \"crc-storage-crc-gjqlz\" (UID: \"3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3\") " pod="crc-storage/crc-storage-crc-gjqlz" Mar 11 13:19:24 crc kubenswrapper[4816]: I0311 13:19:24.251930 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3-node-mnt\") pod \"crc-storage-crc-gjqlz\" (UID: \"3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3\") " pod="crc-storage/crc-storage-crc-gjqlz" Mar 11 13:19:24 crc kubenswrapper[4816]: I0311 13:19:24.354117 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3-crc-storage\") pod \"crc-storage-crc-gjqlz\" (UID: \"3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3\") " pod="crc-storage/crc-storage-crc-gjqlz" Mar 11 13:19:24 crc kubenswrapper[4816]: I0311 13:19:24.354766 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dqpb\" (UniqueName: \"kubernetes.io/projected/3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3-kube-api-access-7dqpb\") pod \"crc-storage-crc-gjqlz\" (UID: \"3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3\") " pod="crc-storage/crc-storage-crc-gjqlz" Mar 11 13:19:24 crc kubenswrapper[4816]: I0311 13:19:24.354928 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3-node-mnt\") pod \"crc-storage-crc-gjqlz\" (UID: \"3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3\") " pod="crc-storage/crc-storage-crc-gjqlz" Mar 11 13:19:24 crc kubenswrapper[4816]: I0311 13:19:24.355362 4816 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3-node-mnt\") pod \"crc-storage-crc-gjqlz\" (UID: \"3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3\") " pod="crc-storage/crc-storage-crc-gjqlz" Mar 11 13:19:24 crc kubenswrapper[4816]: I0311 13:19:24.355636 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3-crc-storage\") pod \"crc-storage-crc-gjqlz\" (UID: \"3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3\") " pod="crc-storage/crc-storage-crc-gjqlz" Mar 11 13:19:24 crc kubenswrapper[4816]: I0311 13:19:24.391287 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dqpb\" (UniqueName: \"kubernetes.io/projected/3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3-kube-api-access-7dqpb\") pod \"crc-storage-crc-gjqlz\" (UID: \"3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3\") " pod="crc-storage/crc-storage-crc-gjqlz" Mar 11 13:19:24 crc kubenswrapper[4816]: I0311 13:19:24.487197 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-gjqlz" Mar 11 13:19:24 crc kubenswrapper[4816]: I0311 13:19:24.779234 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-gjqlz"] Mar 11 13:19:24 crc kubenswrapper[4816]: W0311 13:19:24.782001 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3fdf8d9b_a464_4d2f_a2e5_a4854b7f9ab3.slice/crio-e9fc92572a812a6a0a93fb11d768dee3bd44fc98e9449d2ffae9276055dc4124 WatchSource:0}: Error finding container e9fc92572a812a6a0a93fb11d768dee3bd44fc98e9449d2ffae9276055dc4124: Status 404 returned error can't find the container with id e9fc92572a812a6a0a93fb11d768dee3bd44fc98e9449d2ffae9276055dc4124 Mar 11 13:19:25 crc kubenswrapper[4816]: I0311 13:19:25.034480 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-gjqlz" event={"ID":"3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3","Type":"ContainerStarted","Data":"e9fc92572a812a6a0a93fb11d768dee3bd44fc98e9449d2ffae9276055dc4124"} Mar 11 13:19:26 crc kubenswrapper[4816]: I0311 13:19:26.046156 4816 generic.go:334] "Generic (PLEG): container finished" podID="3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3" containerID="3c47893cabbfc635edaea2ea48266ffc815a61e1e094878326c38fe6119ee1b9" exitCode=0 Mar 11 13:19:26 crc kubenswrapper[4816]: I0311 13:19:26.046237 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-gjqlz" event={"ID":"3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3","Type":"ContainerDied","Data":"3c47893cabbfc635edaea2ea48266ffc815a61e1e094878326c38fe6119ee1b9"} Mar 11 13:19:27 crc kubenswrapper[4816]: I0311 13:19:27.420479 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-gjqlz" Mar 11 13:19:27 crc kubenswrapper[4816]: I0311 13:19:27.527470 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3-crc-storage\") pod \"3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3\" (UID: \"3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3\") " Mar 11 13:19:27 crc kubenswrapper[4816]: I0311 13:19:27.527739 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3-node-mnt\") pod \"3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3\" (UID: \"3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3\") " Mar 11 13:19:27 crc kubenswrapper[4816]: I0311 13:19:27.527844 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3" (UID: "3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 13:19:27 crc kubenswrapper[4816]: I0311 13:19:27.528027 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dqpb\" (UniqueName: \"kubernetes.io/projected/3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3-kube-api-access-7dqpb\") pod \"3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3\" (UID: \"3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3\") " Mar 11 13:19:27 crc kubenswrapper[4816]: I0311 13:19:27.528456 4816 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3-node-mnt\") on node \"crc\" DevicePath \"\"" Mar 11 13:19:27 crc kubenswrapper[4816]: I0311 13:19:27.536913 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3-kube-api-access-7dqpb" (OuterVolumeSpecName: "kube-api-access-7dqpb") pod "3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3" (UID: "3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3"). InnerVolumeSpecName "kube-api-access-7dqpb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 13:19:27 crc kubenswrapper[4816]: I0311 13:19:27.553708 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3" (UID: "3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 13:19:27 crc kubenswrapper[4816]: I0311 13:19:27.629828 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dqpb\" (UniqueName: \"kubernetes.io/projected/3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3-kube-api-access-7dqpb\") on node \"crc\" DevicePath \"\"" Mar 11 13:19:27 crc kubenswrapper[4816]: I0311 13:19:27.629872 4816 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3-crc-storage\") on node \"crc\" DevicePath \"\"" Mar 11 13:19:28 crc kubenswrapper[4816]: I0311 13:19:28.068638 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-gjqlz" event={"ID":"3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3","Type":"ContainerDied","Data":"e9fc92572a812a6a0a93fb11d768dee3bd44fc98e9449d2ffae9276055dc4124"} Mar 11 13:19:28 crc kubenswrapper[4816]: I0311 13:19:28.069128 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9fc92572a812a6a0a93fb11d768dee3bd44fc98e9449d2ffae9276055dc4124" Mar 11 13:19:28 crc kubenswrapper[4816]: I0311 13:19:28.068815 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-gjqlz" Mar 11 13:19:29 crc kubenswrapper[4816]: I0311 13:19:29.986775 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-gjqlz"] Mar 11 13:19:29 crc kubenswrapper[4816]: I0311 13:19:29.997206 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-gjqlz"] Mar 11 13:19:30 crc kubenswrapper[4816]: I0311 13:19:30.174808 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3" path="/var/lib/kubelet/pods/3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3/volumes" Mar 11 13:19:30 crc kubenswrapper[4816]: I0311 13:19:30.176774 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-mjnc9"] Mar 11 13:19:30 crc kubenswrapper[4816]: E0311 13:19:30.178872 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3" containerName="storage" Mar 11 13:19:30 crc kubenswrapper[4816]: I0311 13:19:30.178917 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3" containerName="storage" Mar 11 13:19:30 crc kubenswrapper[4816]: I0311 13:19:30.179534 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3" containerName="storage" Mar 11 13:19:30 crc kubenswrapper[4816]: I0311 13:19:30.180746 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-mjnc9" Mar 11 13:19:30 crc kubenswrapper[4816]: I0311 13:19:30.184981 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Mar 11 13:19:30 crc kubenswrapper[4816]: I0311 13:19:30.185741 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Mar 11 13:19:30 crc kubenswrapper[4816]: I0311 13:19:30.187631 4816 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-zmgc9" Mar 11 13:19:30 crc kubenswrapper[4816]: I0311 13:19:30.187649 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Mar 11 13:19:30 crc kubenswrapper[4816]: I0311 13:19:30.193100 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-mjnc9"] Mar 11 13:19:30 crc kubenswrapper[4816]: I0311 13:19:30.278491 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d30e4b47-6db7-45ec-b6e8-22a9e619d462-node-mnt\") pod \"crc-storage-crc-mjnc9\" (UID: \"d30e4b47-6db7-45ec-b6e8-22a9e619d462\") " pod="crc-storage/crc-storage-crc-mjnc9" Mar 11 13:19:30 crc kubenswrapper[4816]: I0311 13:19:30.278577 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d30e4b47-6db7-45ec-b6e8-22a9e619d462-crc-storage\") pod \"crc-storage-crc-mjnc9\" (UID: \"d30e4b47-6db7-45ec-b6e8-22a9e619d462\") " pod="crc-storage/crc-storage-crc-mjnc9" Mar 11 13:19:30 crc kubenswrapper[4816]: I0311 13:19:30.278613 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5jdz\" (UniqueName: \"kubernetes.io/projected/d30e4b47-6db7-45ec-b6e8-22a9e619d462-kube-api-access-q5jdz\") pod \"crc-storage-crc-mjnc9\" (UID: 
\"d30e4b47-6db7-45ec-b6e8-22a9e619d462\") " pod="crc-storage/crc-storage-crc-mjnc9" Mar 11 13:19:30 crc kubenswrapper[4816]: I0311 13:19:30.380638 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d30e4b47-6db7-45ec-b6e8-22a9e619d462-node-mnt\") pod \"crc-storage-crc-mjnc9\" (UID: \"d30e4b47-6db7-45ec-b6e8-22a9e619d462\") " pod="crc-storage/crc-storage-crc-mjnc9" Mar 11 13:19:30 crc kubenswrapper[4816]: I0311 13:19:30.381404 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d30e4b47-6db7-45ec-b6e8-22a9e619d462-crc-storage\") pod \"crc-storage-crc-mjnc9\" (UID: \"d30e4b47-6db7-45ec-b6e8-22a9e619d462\") " pod="crc-storage/crc-storage-crc-mjnc9" Mar 11 13:19:30 crc kubenswrapper[4816]: I0311 13:19:30.381642 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5jdz\" (UniqueName: \"kubernetes.io/projected/d30e4b47-6db7-45ec-b6e8-22a9e619d462-kube-api-access-q5jdz\") pod \"crc-storage-crc-mjnc9\" (UID: \"d30e4b47-6db7-45ec-b6e8-22a9e619d462\") " pod="crc-storage/crc-storage-crc-mjnc9" Mar 11 13:19:30 crc kubenswrapper[4816]: I0311 13:19:30.381304 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d30e4b47-6db7-45ec-b6e8-22a9e619d462-node-mnt\") pod \"crc-storage-crc-mjnc9\" (UID: \"d30e4b47-6db7-45ec-b6e8-22a9e619d462\") " pod="crc-storage/crc-storage-crc-mjnc9" Mar 11 13:19:30 crc kubenswrapper[4816]: I0311 13:19:30.382720 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d30e4b47-6db7-45ec-b6e8-22a9e619d462-crc-storage\") pod \"crc-storage-crc-mjnc9\" (UID: \"d30e4b47-6db7-45ec-b6e8-22a9e619d462\") " pod="crc-storage/crc-storage-crc-mjnc9" Mar 11 13:19:30 crc kubenswrapper[4816]: I0311 13:19:30.419333 4816 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5jdz\" (UniqueName: \"kubernetes.io/projected/d30e4b47-6db7-45ec-b6e8-22a9e619d462-kube-api-access-q5jdz\") pod \"crc-storage-crc-mjnc9\" (UID: \"d30e4b47-6db7-45ec-b6e8-22a9e619d462\") " pod="crc-storage/crc-storage-crc-mjnc9" Mar 11 13:19:30 crc kubenswrapper[4816]: I0311 13:19:30.512433 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-mjnc9" Mar 11 13:19:30 crc kubenswrapper[4816]: I0311 13:19:30.803570 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-mjnc9"] Mar 11 13:19:31 crc kubenswrapper[4816]: I0311 13:19:31.098698 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-mjnc9" event={"ID":"d30e4b47-6db7-45ec-b6e8-22a9e619d462","Type":"ContainerStarted","Data":"a16f89b6d7d619cf99a106f26780c963377fc7bd5ef58748edc7d4f4741cc356"} Mar 11 13:19:32 crc kubenswrapper[4816]: I0311 13:19:32.108822 4816 generic.go:334] "Generic (PLEG): container finished" podID="d30e4b47-6db7-45ec-b6e8-22a9e619d462" containerID="d83f72f91a08d6b75ba22e2ebf0fc9900a5fbe8d91a5c626eec467d809c24f71" exitCode=0 Mar 11 13:19:32 crc kubenswrapper[4816]: I0311 13:19:32.108963 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-mjnc9" event={"ID":"d30e4b47-6db7-45ec-b6e8-22a9e619d462","Type":"ContainerDied","Data":"d83f72f91a08d6b75ba22e2ebf0fc9900a5fbe8d91a5c626eec467d809c24f71"} Mar 11 13:19:33 crc kubenswrapper[4816]: I0311 13:19:33.563324 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-mjnc9" Mar 11 13:19:33 crc kubenswrapper[4816]: I0311 13:19:33.633240 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d30e4b47-6db7-45ec-b6e8-22a9e619d462-crc-storage\") pod \"d30e4b47-6db7-45ec-b6e8-22a9e619d462\" (UID: \"d30e4b47-6db7-45ec-b6e8-22a9e619d462\") " Mar 11 13:19:33 crc kubenswrapper[4816]: I0311 13:19:33.633417 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5jdz\" (UniqueName: \"kubernetes.io/projected/d30e4b47-6db7-45ec-b6e8-22a9e619d462-kube-api-access-q5jdz\") pod \"d30e4b47-6db7-45ec-b6e8-22a9e619d462\" (UID: \"d30e4b47-6db7-45ec-b6e8-22a9e619d462\") " Mar 11 13:19:33 crc kubenswrapper[4816]: I0311 13:19:33.633499 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d30e4b47-6db7-45ec-b6e8-22a9e619d462-node-mnt\") pod \"d30e4b47-6db7-45ec-b6e8-22a9e619d462\" (UID: \"d30e4b47-6db7-45ec-b6e8-22a9e619d462\") " Mar 11 13:19:33 crc kubenswrapper[4816]: I0311 13:19:33.633679 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d30e4b47-6db7-45ec-b6e8-22a9e619d462-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "d30e4b47-6db7-45ec-b6e8-22a9e619d462" (UID: "d30e4b47-6db7-45ec-b6e8-22a9e619d462"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 13:19:33 crc kubenswrapper[4816]: I0311 13:19:33.633931 4816 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d30e4b47-6db7-45ec-b6e8-22a9e619d462-node-mnt\") on node \"crc\" DevicePath \"\"" Mar 11 13:19:33 crc kubenswrapper[4816]: I0311 13:19:33.640422 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d30e4b47-6db7-45ec-b6e8-22a9e619d462-kube-api-access-q5jdz" (OuterVolumeSpecName: "kube-api-access-q5jdz") pod "d30e4b47-6db7-45ec-b6e8-22a9e619d462" (UID: "d30e4b47-6db7-45ec-b6e8-22a9e619d462"). InnerVolumeSpecName "kube-api-access-q5jdz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 13:19:33 crc kubenswrapper[4816]: I0311 13:19:33.658578 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d30e4b47-6db7-45ec-b6e8-22a9e619d462-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "d30e4b47-6db7-45ec-b6e8-22a9e619d462" (UID: "d30e4b47-6db7-45ec-b6e8-22a9e619d462"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 13:19:33 crc kubenswrapper[4816]: I0311 13:19:33.735372 4816 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d30e4b47-6db7-45ec-b6e8-22a9e619d462-crc-storage\") on node \"crc\" DevicePath \"\"" Mar 11 13:19:33 crc kubenswrapper[4816]: I0311 13:19:33.735428 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5jdz\" (UniqueName: \"kubernetes.io/projected/d30e4b47-6db7-45ec-b6e8-22a9e619d462-kube-api-access-q5jdz\") on node \"crc\" DevicePath \"\"" Mar 11 13:19:34 crc kubenswrapper[4816]: I0311 13:19:34.142022 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-mjnc9" Mar 11 13:19:34 crc kubenswrapper[4816]: I0311 13:19:34.176143 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-mjnc9" event={"ID":"d30e4b47-6db7-45ec-b6e8-22a9e619d462","Type":"ContainerDied","Data":"a16f89b6d7d619cf99a106f26780c963377fc7bd5ef58748edc7d4f4741cc356"} Mar 11 13:19:34 crc kubenswrapper[4816]: I0311 13:19:34.176199 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a16f89b6d7d619cf99a106f26780c963377fc7bd5ef58748edc7d4f4741cc356" Mar 11 13:19:39 crc kubenswrapper[4816]: I0311 13:19:39.515140 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 13:19:39 crc kubenswrapper[4816]: I0311 13:19:39.515943 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 13:19:45 crc kubenswrapper[4816]: I0311 13:19:45.425497 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dlh8d"] Mar 11 13:19:45 crc kubenswrapper[4816]: E0311 13:19:45.433703 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d30e4b47-6db7-45ec-b6e8-22a9e619d462" containerName="storage" Mar 11 13:19:45 crc kubenswrapper[4816]: I0311 13:19:45.433735 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="d30e4b47-6db7-45ec-b6e8-22a9e619d462" containerName="storage" Mar 11 13:19:45 crc kubenswrapper[4816]: I0311 13:19:45.433989 4816 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="d30e4b47-6db7-45ec-b6e8-22a9e619d462" containerName="storage" Mar 11 13:19:45 crc kubenswrapper[4816]: I0311 13:19:45.435509 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dlh8d" Mar 11 13:19:45 crc kubenswrapper[4816]: I0311 13:19:45.447825 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dlh8d"] Mar 11 13:19:45 crc kubenswrapper[4816]: I0311 13:19:45.568021 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c131fc25-0347-4783-bb0e-51d87ef555ea-utilities\") pod \"redhat-marketplace-dlh8d\" (UID: \"c131fc25-0347-4783-bb0e-51d87ef555ea\") " pod="openshift-marketplace/redhat-marketplace-dlh8d" Mar 11 13:19:45 crc kubenswrapper[4816]: I0311 13:19:45.568167 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6knc\" (UniqueName: \"kubernetes.io/projected/c131fc25-0347-4783-bb0e-51d87ef555ea-kube-api-access-s6knc\") pod \"redhat-marketplace-dlh8d\" (UID: \"c131fc25-0347-4783-bb0e-51d87ef555ea\") " pod="openshift-marketplace/redhat-marketplace-dlh8d" Mar 11 13:19:45 crc kubenswrapper[4816]: I0311 13:19:45.568209 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c131fc25-0347-4783-bb0e-51d87ef555ea-catalog-content\") pod \"redhat-marketplace-dlh8d\" (UID: \"c131fc25-0347-4783-bb0e-51d87ef555ea\") " pod="openshift-marketplace/redhat-marketplace-dlh8d" Mar 11 13:19:45 crc kubenswrapper[4816]: I0311 13:19:45.669592 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6knc\" (UniqueName: \"kubernetes.io/projected/c131fc25-0347-4783-bb0e-51d87ef555ea-kube-api-access-s6knc\") pod \"redhat-marketplace-dlh8d\" 
(UID: \"c131fc25-0347-4783-bb0e-51d87ef555ea\") " pod="openshift-marketplace/redhat-marketplace-dlh8d" Mar 11 13:19:45 crc kubenswrapper[4816]: I0311 13:19:45.669679 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c131fc25-0347-4783-bb0e-51d87ef555ea-catalog-content\") pod \"redhat-marketplace-dlh8d\" (UID: \"c131fc25-0347-4783-bb0e-51d87ef555ea\") " pod="openshift-marketplace/redhat-marketplace-dlh8d" Mar 11 13:19:45 crc kubenswrapper[4816]: I0311 13:19:45.669771 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c131fc25-0347-4783-bb0e-51d87ef555ea-utilities\") pod \"redhat-marketplace-dlh8d\" (UID: \"c131fc25-0347-4783-bb0e-51d87ef555ea\") " pod="openshift-marketplace/redhat-marketplace-dlh8d" Mar 11 13:19:45 crc kubenswrapper[4816]: I0311 13:19:45.670435 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c131fc25-0347-4783-bb0e-51d87ef555ea-catalog-content\") pod \"redhat-marketplace-dlh8d\" (UID: \"c131fc25-0347-4783-bb0e-51d87ef555ea\") " pod="openshift-marketplace/redhat-marketplace-dlh8d" Mar 11 13:19:45 crc kubenswrapper[4816]: I0311 13:19:45.670509 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c131fc25-0347-4783-bb0e-51d87ef555ea-utilities\") pod \"redhat-marketplace-dlh8d\" (UID: \"c131fc25-0347-4783-bb0e-51d87ef555ea\") " pod="openshift-marketplace/redhat-marketplace-dlh8d" Mar 11 13:19:45 crc kubenswrapper[4816]: I0311 13:19:45.696089 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6knc\" (UniqueName: \"kubernetes.io/projected/c131fc25-0347-4783-bb0e-51d87ef555ea-kube-api-access-s6knc\") pod \"redhat-marketplace-dlh8d\" (UID: \"c131fc25-0347-4783-bb0e-51d87ef555ea\") " 
pod="openshift-marketplace/redhat-marketplace-dlh8d" Mar 11 13:19:45 crc kubenswrapper[4816]: I0311 13:19:45.814049 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dlh8d" Mar 11 13:19:46 crc kubenswrapper[4816]: I0311 13:19:46.071224 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dlh8d"] Mar 11 13:19:46 crc kubenswrapper[4816]: I0311 13:19:46.247573 4816 generic.go:334] "Generic (PLEG): container finished" podID="c131fc25-0347-4783-bb0e-51d87ef555ea" containerID="8dada37ecd1c3c09c55e467ba07b0c56cbf5c669183b649851bded5da283a741" exitCode=0 Mar 11 13:19:46 crc kubenswrapper[4816]: I0311 13:19:46.247622 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dlh8d" event={"ID":"c131fc25-0347-4783-bb0e-51d87ef555ea","Type":"ContainerDied","Data":"8dada37ecd1c3c09c55e467ba07b0c56cbf5c669183b649851bded5da283a741"} Mar 11 13:19:46 crc kubenswrapper[4816]: I0311 13:19:46.247677 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dlh8d" event={"ID":"c131fc25-0347-4783-bb0e-51d87ef555ea","Type":"ContainerStarted","Data":"5fd9a757141accae25718c44ebcf77b36ba1f105d8089b8809bf7ab1041950c1"} Mar 11 13:19:48 crc kubenswrapper[4816]: I0311 13:19:48.268316 4816 generic.go:334] "Generic (PLEG): container finished" podID="c131fc25-0347-4783-bb0e-51d87ef555ea" containerID="b8e1e49f1243bdfe5a560ce329e08c61db4b0da9915a68ba0d6eb2188a2d3599" exitCode=0 Mar 11 13:19:48 crc kubenswrapper[4816]: I0311 13:19:48.268402 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dlh8d" event={"ID":"c131fc25-0347-4783-bb0e-51d87ef555ea","Type":"ContainerDied","Data":"b8e1e49f1243bdfe5a560ce329e08c61db4b0da9915a68ba0d6eb2188a2d3599"} Mar 11 13:19:49 crc kubenswrapper[4816]: I0311 13:19:49.280130 4816 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-dlh8d" event={"ID":"c131fc25-0347-4783-bb0e-51d87ef555ea","Type":"ContainerStarted","Data":"7fe8c0215bd2b054a118731083714c88ac6f6daea94e7b466940b02b163a8021"} Mar 11 13:19:49 crc kubenswrapper[4816]: I0311 13:19:49.305002 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dlh8d" podStartSLOduration=1.695181204 podStartE2EDuration="4.304975043s" podCreationTimestamp="2026-03-11 13:19:45 +0000 UTC" firstStartedPulling="2026-03-11 13:19:46.249130781 +0000 UTC m=+4872.840394748" lastFinishedPulling="2026-03-11 13:19:48.85892457 +0000 UTC m=+4875.450188587" observedRunningTime="2026-03-11 13:19:49.30313067 +0000 UTC m=+4875.894394667" watchObservedRunningTime="2026-03-11 13:19:49.304975043 +0000 UTC m=+4875.896239030" Mar 11 13:19:50 crc kubenswrapper[4816]: I0311 13:19:50.471238 4816 scope.go:117] "RemoveContainer" containerID="0ee4f053b0c8963adb31e4e6ffaf9c7c100dafccbfa493c26f5254141c13917c" Mar 11 13:19:55 crc kubenswrapper[4816]: I0311 13:19:55.814597 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dlh8d" Mar 11 13:19:55 crc kubenswrapper[4816]: I0311 13:19:55.816791 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dlh8d" Mar 11 13:19:55 crc kubenswrapper[4816]: I0311 13:19:55.876308 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dlh8d" Mar 11 13:19:56 crc kubenswrapper[4816]: I0311 13:19:56.391388 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dlh8d" Mar 11 13:19:56 crc kubenswrapper[4816]: I0311 13:19:56.454226 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dlh8d"] Mar 11 13:19:58 crc 
kubenswrapper[4816]: I0311 13:19:58.355762 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dlh8d" podUID="c131fc25-0347-4783-bb0e-51d87ef555ea" containerName="registry-server" containerID="cri-o://7fe8c0215bd2b054a118731083714c88ac6f6daea94e7b466940b02b163a8021" gracePeriod=2 Mar 11 13:19:59 crc kubenswrapper[4816]: I0311 13:19:59.275180 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dlh8d" Mar 11 13:19:59 crc kubenswrapper[4816]: I0311 13:19:59.367625 4816 generic.go:334] "Generic (PLEG): container finished" podID="c131fc25-0347-4783-bb0e-51d87ef555ea" containerID="7fe8c0215bd2b054a118731083714c88ac6f6daea94e7b466940b02b163a8021" exitCode=0 Mar 11 13:19:59 crc kubenswrapper[4816]: I0311 13:19:59.367677 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dlh8d" Mar 11 13:19:59 crc kubenswrapper[4816]: I0311 13:19:59.367720 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dlh8d" event={"ID":"c131fc25-0347-4783-bb0e-51d87ef555ea","Type":"ContainerDied","Data":"7fe8c0215bd2b054a118731083714c88ac6f6daea94e7b466940b02b163a8021"} Mar 11 13:19:59 crc kubenswrapper[4816]: I0311 13:19:59.368518 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dlh8d" event={"ID":"c131fc25-0347-4783-bb0e-51d87ef555ea","Type":"ContainerDied","Data":"5fd9a757141accae25718c44ebcf77b36ba1f105d8089b8809bf7ab1041950c1"} Mar 11 13:19:59 crc kubenswrapper[4816]: I0311 13:19:59.368565 4816 scope.go:117] "RemoveContainer" containerID="7fe8c0215bd2b054a118731083714c88ac6f6daea94e7b466940b02b163a8021" Mar 11 13:19:59 crc kubenswrapper[4816]: I0311 13:19:59.393591 4816 scope.go:117] "RemoveContainer" containerID="b8e1e49f1243bdfe5a560ce329e08c61db4b0da9915a68ba0d6eb2188a2d3599" 
Mar 11 13:19:59 crc kubenswrapper[4816]: I0311 13:19:59.396991 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c131fc25-0347-4783-bb0e-51d87ef555ea-utilities\") pod \"c131fc25-0347-4783-bb0e-51d87ef555ea\" (UID: \"c131fc25-0347-4783-bb0e-51d87ef555ea\") " Mar 11 13:19:59 crc kubenswrapper[4816]: I0311 13:19:59.397105 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6knc\" (UniqueName: \"kubernetes.io/projected/c131fc25-0347-4783-bb0e-51d87ef555ea-kube-api-access-s6knc\") pod \"c131fc25-0347-4783-bb0e-51d87ef555ea\" (UID: \"c131fc25-0347-4783-bb0e-51d87ef555ea\") " Mar 11 13:19:59 crc kubenswrapper[4816]: I0311 13:19:59.397131 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c131fc25-0347-4783-bb0e-51d87ef555ea-catalog-content\") pod \"c131fc25-0347-4783-bb0e-51d87ef555ea\" (UID: \"c131fc25-0347-4783-bb0e-51d87ef555ea\") " Mar 11 13:19:59 crc kubenswrapper[4816]: I0311 13:19:59.398314 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c131fc25-0347-4783-bb0e-51d87ef555ea-utilities" (OuterVolumeSpecName: "utilities") pod "c131fc25-0347-4783-bb0e-51d87ef555ea" (UID: "c131fc25-0347-4783-bb0e-51d87ef555ea"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 13:19:59 crc kubenswrapper[4816]: I0311 13:19:59.404771 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c131fc25-0347-4783-bb0e-51d87ef555ea-kube-api-access-s6knc" (OuterVolumeSpecName: "kube-api-access-s6knc") pod "c131fc25-0347-4783-bb0e-51d87ef555ea" (UID: "c131fc25-0347-4783-bb0e-51d87ef555ea"). InnerVolumeSpecName "kube-api-access-s6knc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 13:19:59 crc kubenswrapper[4816]: I0311 13:19:59.424045 4816 scope.go:117] "RemoveContainer" containerID="8dada37ecd1c3c09c55e467ba07b0c56cbf5c669183b649851bded5da283a741" Mar 11 13:19:59 crc kubenswrapper[4816]: I0311 13:19:59.437295 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c131fc25-0347-4783-bb0e-51d87ef555ea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c131fc25-0347-4783-bb0e-51d87ef555ea" (UID: "c131fc25-0347-4783-bb0e-51d87ef555ea"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 13:19:59 crc kubenswrapper[4816]: I0311 13:19:59.461025 4816 scope.go:117] "RemoveContainer" containerID="7fe8c0215bd2b054a118731083714c88ac6f6daea94e7b466940b02b163a8021" Mar 11 13:19:59 crc kubenswrapper[4816]: E0311 13:19:59.461827 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fe8c0215bd2b054a118731083714c88ac6f6daea94e7b466940b02b163a8021\": container with ID starting with 7fe8c0215bd2b054a118731083714c88ac6f6daea94e7b466940b02b163a8021 not found: ID does not exist" containerID="7fe8c0215bd2b054a118731083714c88ac6f6daea94e7b466940b02b163a8021" Mar 11 13:19:59 crc kubenswrapper[4816]: I0311 13:19:59.461880 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fe8c0215bd2b054a118731083714c88ac6f6daea94e7b466940b02b163a8021"} err="failed to get container status \"7fe8c0215bd2b054a118731083714c88ac6f6daea94e7b466940b02b163a8021\": rpc error: code = NotFound desc = could not find container \"7fe8c0215bd2b054a118731083714c88ac6f6daea94e7b466940b02b163a8021\": container with ID starting with 7fe8c0215bd2b054a118731083714c88ac6f6daea94e7b466940b02b163a8021 not found: ID does not exist" Mar 11 13:19:59 crc kubenswrapper[4816]: I0311 13:19:59.461907 4816 scope.go:117] 
"RemoveContainer" containerID="b8e1e49f1243bdfe5a560ce329e08c61db4b0da9915a68ba0d6eb2188a2d3599" Mar 11 13:19:59 crc kubenswrapper[4816]: E0311 13:19:59.462388 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8e1e49f1243bdfe5a560ce329e08c61db4b0da9915a68ba0d6eb2188a2d3599\": container with ID starting with b8e1e49f1243bdfe5a560ce329e08c61db4b0da9915a68ba0d6eb2188a2d3599 not found: ID does not exist" containerID="b8e1e49f1243bdfe5a560ce329e08c61db4b0da9915a68ba0d6eb2188a2d3599" Mar 11 13:19:59 crc kubenswrapper[4816]: I0311 13:19:59.462587 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8e1e49f1243bdfe5a560ce329e08c61db4b0da9915a68ba0d6eb2188a2d3599"} err="failed to get container status \"b8e1e49f1243bdfe5a560ce329e08c61db4b0da9915a68ba0d6eb2188a2d3599\": rpc error: code = NotFound desc = could not find container \"b8e1e49f1243bdfe5a560ce329e08c61db4b0da9915a68ba0d6eb2188a2d3599\": container with ID starting with b8e1e49f1243bdfe5a560ce329e08c61db4b0da9915a68ba0d6eb2188a2d3599 not found: ID does not exist" Mar 11 13:19:59 crc kubenswrapper[4816]: I0311 13:19:59.462623 4816 scope.go:117] "RemoveContainer" containerID="8dada37ecd1c3c09c55e467ba07b0c56cbf5c669183b649851bded5da283a741" Mar 11 13:19:59 crc kubenswrapper[4816]: E0311 13:19:59.463019 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8dada37ecd1c3c09c55e467ba07b0c56cbf5c669183b649851bded5da283a741\": container with ID starting with 8dada37ecd1c3c09c55e467ba07b0c56cbf5c669183b649851bded5da283a741 not found: ID does not exist" containerID="8dada37ecd1c3c09c55e467ba07b0c56cbf5c669183b649851bded5da283a741" Mar 11 13:19:59 crc kubenswrapper[4816]: I0311 13:19:59.463073 4816 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8dada37ecd1c3c09c55e467ba07b0c56cbf5c669183b649851bded5da283a741"} err="failed to get container status \"8dada37ecd1c3c09c55e467ba07b0c56cbf5c669183b649851bded5da283a741\": rpc error: code = NotFound desc = could not find container \"8dada37ecd1c3c09c55e467ba07b0c56cbf5c669183b649851bded5da283a741\": container with ID starting with 8dada37ecd1c3c09c55e467ba07b0c56cbf5c669183b649851bded5da283a741 not found: ID does not exist" Mar 11 13:19:59 crc kubenswrapper[4816]: I0311 13:19:59.498863 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c131fc25-0347-4783-bb0e-51d87ef555ea-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 13:19:59 crc kubenswrapper[4816]: I0311 13:19:59.498894 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6knc\" (UniqueName: \"kubernetes.io/projected/c131fc25-0347-4783-bb0e-51d87ef555ea-kube-api-access-s6knc\") on node \"crc\" DevicePath \"\"" Mar 11 13:19:59 crc kubenswrapper[4816]: I0311 13:19:59.498905 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c131fc25-0347-4783-bb0e-51d87ef555ea-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 13:19:59 crc kubenswrapper[4816]: I0311 13:19:59.705351 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dlh8d"] Mar 11 13:19:59 crc kubenswrapper[4816]: I0311 13:19:59.711634 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dlh8d"] Mar 11 13:20:00 crc kubenswrapper[4816]: I0311 13:20:00.147391 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c131fc25-0347-4783-bb0e-51d87ef555ea" path="/var/lib/kubelet/pods/c131fc25-0347-4783-bb0e-51d87ef555ea/volumes" Mar 11 13:20:00 crc kubenswrapper[4816]: I0311 13:20:00.148581 4816 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29553920-h6hh5"] Mar 11 13:20:00 crc kubenswrapper[4816]: E0311 13:20:00.148979 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c131fc25-0347-4783-bb0e-51d87ef555ea" containerName="extract-utilities" Mar 11 13:20:00 crc kubenswrapper[4816]: I0311 13:20:00.149009 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="c131fc25-0347-4783-bb0e-51d87ef555ea" containerName="extract-utilities" Mar 11 13:20:00 crc kubenswrapper[4816]: E0311 13:20:00.149041 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c131fc25-0347-4783-bb0e-51d87ef555ea" containerName="registry-server" Mar 11 13:20:00 crc kubenswrapper[4816]: I0311 13:20:00.149054 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="c131fc25-0347-4783-bb0e-51d87ef555ea" containerName="registry-server" Mar 11 13:20:00 crc kubenswrapper[4816]: E0311 13:20:00.149078 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c131fc25-0347-4783-bb0e-51d87ef555ea" containerName="extract-content" Mar 11 13:20:00 crc kubenswrapper[4816]: I0311 13:20:00.149090 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="c131fc25-0347-4783-bb0e-51d87ef555ea" containerName="extract-content" Mar 11 13:20:00 crc kubenswrapper[4816]: I0311 13:20:00.149740 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="c131fc25-0347-4783-bb0e-51d87ef555ea" containerName="registry-server" Mar 11 13:20:00 crc kubenswrapper[4816]: I0311 13:20:00.150432 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553920-h6hh5"] Mar 11 13:20:00 crc kubenswrapper[4816]: I0311 13:20:00.150552 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553920-h6hh5" Mar 11 13:20:00 crc kubenswrapper[4816]: I0311 13:20:00.152858 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 13:20:00 crc kubenswrapper[4816]: I0311 13:20:00.153079 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 13:20:00 crc kubenswrapper[4816]: I0311 13:20:00.155272 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 13:20:00 crc kubenswrapper[4816]: I0311 13:20:00.312340 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psxg9\" (UniqueName: \"kubernetes.io/projected/193e1468-2f5b-4e66-94f3-a7fc184c7e01-kube-api-access-psxg9\") pod \"auto-csr-approver-29553920-h6hh5\" (UID: \"193e1468-2f5b-4e66-94f3-a7fc184c7e01\") " pod="openshift-infra/auto-csr-approver-29553920-h6hh5" Mar 11 13:20:00 crc kubenswrapper[4816]: I0311 13:20:00.414128 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psxg9\" (UniqueName: \"kubernetes.io/projected/193e1468-2f5b-4e66-94f3-a7fc184c7e01-kube-api-access-psxg9\") pod \"auto-csr-approver-29553920-h6hh5\" (UID: \"193e1468-2f5b-4e66-94f3-a7fc184c7e01\") " pod="openshift-infra/auto-csr-approver-29553920-h6hh5" Mar 11 13:20:00 crc kubenswrapper[4816]: I0311 13:20:00.432611 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psxg9\" (UniqueName: \"kubernetes.io/projected/193e1468-2f5b-4e66-94f3-a7fc184c7e01-kube-api-access-psxg9\") pod \"auto-csr-approver-29553920-h6hh5\" (UID: \"193e1468-2f5b-4e66-94f3-a7fc184c7e01\") " pod="openshift-infra/auto-csr-approver-29553920-h6hh5" Mar 11 13:20:00 crc kubenswrapper[4816]: I0311 13:20:00.471722 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553920-h6hh5" Mar 11 13:20:00 crc kubenswrapper[4816]: I0311 13:20:00.765476 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553920-h6hh5"] Mar 11 13:20:00 crc kubenswrapper[4816]: W0311 13:20:00.766529 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod193e1468_2f5b_4e66_94f3_a7fc184c7e01.slice/crio-21a46c47a0bed8bd38e19f517c3c8cd2030d7463b8e1e69fa397c74be15882ee WatchSource:0}: Error finding container 21a46c47a0bed8bd38e19f517c3c8cd2030d7463b8e1e69fa397c74be15882ee: Status 404 returned error can't find the container with id 21a46c47a0bed8bd38e19f517c3c8cd2030d7463b8e1e69fa397c74be15882ee Mar 11 13:20:01 crc kubenswrapper[4816]: I0311 13:20:01.384698 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553920-h6hh5" event={"ID":"193e1468-2f5b-4e66-94f3-a7fc184c7e01","Type":"ContainerStarted","Data":"21a46c47a0bed8bd38e19f517c3c8cd2030d7463b8e1e69fa397c74be15882ee"} Mar 11 13:20:03 crc kubenswrapper[4816]: I0311 13:20:03.403311 4816 generic.go:334] "Generic (PLEG): container finished" podID="193e1468-2f5b-4e66-94f3-a7fc184c7e01" containerID="4a3cade9d3e8a7bb5a9e71032c96de36c1178b4f7d16ddbd543510f67b6be155" exitCode=0 Mar 11 13:20:03 crc kubenswrapper[4816]: I0311 13:20:03.403442 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553920-h6hh5" event={"ID":"193e1468-2f5b-4e66-94f3-a7fc184c7e01","Type":"ContainerDied","Data":"4a3cade9d3e8a7bb5a9e71032c96de36c1178b4f7d16ddbd543510f67b6be155"} Mar 11 13:20:04 crc kubenswrapper[4816]: I0311 13:20:04.762569 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553920-h6hh5" Mar 11 13:20:04 crc kubenswrapper[4816]: I0311 13:20:04.887126 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psxg9\" (UniqueName: \"kubernetes.io/projected/193e1468-2f5b-4e66-94f3-a7fc184c7e01-kube-api-access-psxg9\") pod \"193e1468-2f5b-4e66-94f3-a7fc184c7e01\" (UID: \"193e1468-2f5b-4e66-94f3-a7fc184c7e01\") " Mar 11 13:20:04 crc kubenswrapper[4816]: I0311 13:20:04.896349 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/193e1468-2f5b-4e66-94f3-a7fc184c7e01-kube-api-access-psxg9" (OuterVolumeSpecName: "kube-api-access-psxg9") pod "193e1468-2f5b-4e66-94f3-a7fc184c7e01" (UID: "193e1468-2f5b-4e66-94f3-a7fc184c7e01"). InnerVolumeSpecName "kube-api-access-psxg9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 13:20:04 crc kubenswrapper[4816]: I0311 13:20:04.989176 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psxg9\" (UniqueName: \"kubernetes.io/projected/193e1468-2f5b-4e66-94f3-a7fc184c7e01-kube-api-access-psxg9\") on node \"crc\" DevicePath \"\"" Mar 11 13:20:05 crc kubenswrapper[4816]: I0311 13:20:05.427411 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553920-h6hh5" event={"ID":"193e1468-2f5b-4e66-94f3-a7fc184c7e01","Type":"ContainerDied","Data":"21a46c47a0bed8bd38e19f517c3c8cd2030d7463b8e1e69fa397c74be15882ee"} Mar 11 13:20:05 crc kubenswrapper[4816]: I0311 13:20:05.427507 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21a46c47a0bed8bd38e19f517c3c8cd2030d7463b8e1e69fa397c74be15882ee" Mar 11 13:20:05 crc kubenswrapper[4816]: I0311 13:20:05.427520 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553920-h6hh5" Mar 11 13:20:05 crc kubenswrapper[4816]: I0311 13:20:05.856022 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553914-vtpr2"] Mar 11 13:20:05 crc kubenswrapper[4816]: I0311 13:20:05.862592 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553914-vtpr2"] Mar 11 13:20:06 crc kubenswrapper[4816]: I0311 13:20:06.146919 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32b79556-cf6a-450f-9214-70d0854dc630" path="/var/lib/kubelet/pods/32b79556-cf6a-450f-9214-70d0854dc630/volumes" Mar 11 13:20:09 crc kubenswrapper[4816]: I0311 13:20:09.514932 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 13:20:09 crc kubenswrapper[4816]: I0311 13:20:09.515676 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 13:20:09 crc kubenswrapper[4816]: I0311 13:20:09.515767 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 13:20:09 crc kubenswrapper[4816]: I0311 13:20:09.516874 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ddd6136328dc7ec62752abe3735d43f3f986aeada7e2653f4b4a88d5e086c6c4"} pod="openshift-machine-config-operator/machine-config-daemon-b4v82" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 13:20:09 crc kubenswrapper[4816]: I0311 13:20:09.516986 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" containerID="cri-o://ddd6136328dc7ec62752abe3735d43f3f986aeada7e2653f4b4a88d5e086c6c4" gracePeriod=600 Mar 11 13:20:10 crc kubenswrapper[4816]: I0311 13:20:10.473286 4816 generic.go:334] "Generic (PLEG): container finished" podID="7fdff21c-644f-4443-a268-f98c91ea120a" containerID="ddd6136328dc7ec62752abe3735d43f3f986aeada7e2653f4b4a88d5e086c6c4" exitCode=0 Mar 11 13:20:10 crc kubenswrapper[4816]: I0311 13:20:10.473326 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerDied","Data":"ddd6136328dc7ec62752abe3735d43f3f986aeada7e2653f4b4a88d5e086c6c4"} Mar 11 13:20:10 crc kubenswrapper[4816]: I0311 13:20:10.473356 4816 scope.go:117] "RemoveContainer" containerID="feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e" Mar 11 13:20:11 crc kubenswrapper[4816]: I0311 13:20:11.483608 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerStarted","Data":"1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911"} Mar 11 13:20:50 crc kubenswrapper[4816]: I0311 13:20:50.541886 4816 scope.go:117] "RemoveContainer" containerID="8e7758cfa0d68340bf0bfe400a0bcdda434a161dca369cd6a56c8194d33e640d" Mar 11 13:22:00 crc kubenswrapper[4816]: I0311 13:22:00.170896 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553922-l2chb"] Mar 11 13:22:00 crc kubenswrapper[4816]: E0311 
13:22:00.173441 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="193e1468-2f5b-4e66-94f3-a7fc184c7e01" containerName="oc" Mar 11 13:22:00 crc kubenswrapper[4816]: I0311 13:22:00.173474 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="193e1468-2f5b-4e66-94f3-a7fc184c7e01" containerName="oc" Mar 11 13:22:00 crc kubenswrapper[4816]: I0311 13:22:00.173776 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="193e1468-2f5b-4e66-94f3-a7fc184c7e01" containerName="oc" Mar 11 13:22:00 crc kubenswrapper[4816]: I0311 13:22:00.174541 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553922-l2chb" Mar 11 13:22:00 crc kubenswrapper[4816]: I0311 13:22:00.177082 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 13:22:00 crc kubenswrapper[4816]: I0311 13:22:00.178041 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 13:22:00 crc kubenswrapper[4816]: I0311 13:22:00.179411 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 13:22:00 crc kubenswrapper[4816]: I0311 13:22:00.185034 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553922-l2chb"] Mar 11 13:22:00 crc kubenswrapper[4816]: I0311 13:22:00.343479 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkx74\" (UniqueName: \"kubernetes.io/projected/a09c6fad-26f7-4ea2-84fc-5d2efb86fd02-kube-api-access-dkx74\") pod \"auto-csr-approver-29553922-l2chb\" (UID: \"a09c6fad-26f7-4ea2-84fc-5d2efb86fd02\") " pod="openshift-infra/auto-csr-approver-29553922-l2chb" Mar 11 13:22:00 crc kubenswrapper[4816]: I0311 13:22:00.444889 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-dkx74\" (UniqueName: \"kubernetes.io/projected/a09c6fad-26f7-4ea2-84fc-5d2efb86fd02-kube-api-access-dkx74\") pod \"auto-csr-approver-29553922-l2chb\" (UID: \"a09c6fad-26f7-4ea2-84fc-5d2efb86fd02\") " pod="openshift-infra/auto-csr-approver-29553922-l2chb" Mar 11 13:22:00 crc kubenswrapper[4816]: I0311 13:22:00.475359 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkx74\" (UniqueName: \"kubernetes.io/projected/a09c6fad-26f7-4ea2-84fc-5d2efb86fd02-kube-api-access-dkx74\") pod \"auto-csr-approver-29553922-l2chb\" (UID: \"a09c6fad-26f7-4ea2-84fc-5d2efb86fd02\") " pod="openshift-infra/auto-csr-approver-29553922-l2chb" Mar 11 13:22:00 crc kubenswrapper[4816]: I0311 13:22:00.504342 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553922-l2chb" Mar 11 13:22:00 crc kubenswrapper[4816]: I0311 13:22:00.777581 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553922-l2chb"] Mar 11 13:22:00 crc kubenswrapper[4816]: I0311 13:22:00.790414 4816 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 13:22:01 crc kubenswrapper[4816]: I0311 13:22:01.481755 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553922-l2chb" event={"ID":"a09c6fad-26f7-4ea2-84fc-5d2efb86fd02","Type":"ContainerStarted","Data":"9cef72375d217d58a4f49a8becffa8fc99ece346d358c56e97b42d8d61f1a2d3"} Mar 11 13:22:02 crc kubenswrapper[4816]: I0311 13:22:02.494914 4816 generic.go:334] "Generic (PLEG): container finished" podID="a09c6fad-26f7-4ea2-84fc-5d2efb86fd02" containerID="5610d0163da9f92dcf1f4addb326b68bb7bee62775e25ffcf227b46aacd6327b" exitCode=0 Mar 11 13:22:02 crc kubenswrapper[4816]: I0311 13:22:02.495029 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553922-l2chb" 
event={"ID":"a09c6fad-26f7-4ea2-84fc-5d2efb86fd02","Type":"ContainerDied","Data":"5610d0163da9f92dcf1f4addb326b68bb7bee62775e25ffcf227b46aacd6327b"} Mar 11 13:22:03 crc kubenswrapper[4816]: I0311 13:22:03.999370 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553922-l2chb" Mar 11 13:22:04 crc kubenswrapper[4816]: I0311 13:22:04.096620 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkx74\" (UniqueName: \"kubernetes.io/projected/a09c6fad-26f7-4ea2-84fc-5d2efb86fd02-kube-api-access-dkx74\") pod \"a09c6fad-26f7-4ea2-84fc-5d2efb86fd02\" (UID: \"a09c6fad-26f7-4ea2-84fc-5d2efb86fd02\") " Mar 11 13:22:04 crc kubenswrapper[4816]: I0311 13:22:04.104797 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a09c6fad-26f7-4ea2-84fc-5d2efb86fd02-kube-api-access-dkx74" (OuterVolumeSpecName: "kube-api-access-dkx74") pod "a09c6fad-26f7-4ea2-84fc-5d2efb86fd02" (UID: "a09c6fad-26f7-4ea2-84fc-5d2efb86fd02"). InnerVolumeSpecName "kube-api-access-dkx74". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 13:22:04 crc kubenswrapper[4816]: I0311 13:22:04.198794 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkx74\" (UniqueName: \"kubernetes.io/projected/a09c6fad-26f7-4ea2-84fc-5d2efb86fd02-kube-api-access-dkx74\") on node \"crc\" DevicePath \"\"" Mar 11 13:22:04 crc kubenswrapper[4816]: I0311 13:22:04.521740 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553922-l2chb" event={"ID":"a09c6fad-26f7-4ea2-84fc-5d2efb86fd02","Type":"ContainerDied","Data":"9cef72375d217d58a4f49a8becffa8fc99ece346d358c56e97b42d8d61f1a2d3"} Mar 11 13:22:04 crc kubenswrapper[4816]: I0311 13:22:04.521781 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9cef72375d217d58a4f49a8becffa8fc99ece346d358c56e97b42d8d61f1a2d3" Mar 11 13:22:04 crc kubenswrapper[4816]: I0311 13:22:04.521810 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553922-l2chb" Mar 11 13:22:05 crc kubenswrapper[4816]: I0311 13:22:05.081013 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553916-n8r8p"] Mar 11 13:22:05 crc kubenswrapper[4816]: I0311 13:22:05.088236 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553916-n8r8p"] Mar 11 13:22:06 crc kubenswrapper[4816]: I0311 13:22:06.146454 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae7a49f0-ca01-4ad5-a353-5ac125523d95" path="/var/lib/kubelet/pods/ae7a49f0-ca01-4ad5-a353-5ac125523d95/volumes" Mar 11 13:22:20 crc kubenswrapper[4816]: I0311 13:22:20.008214 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rbcl7/must-gather-k8xh5"] Mar 11 13:22:20 crc kubenswrapper[4816]: E0311 13:22:20.009123 4816 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a09c6fad-26f7-4ea2-84fc-5d2efb86fd02" containerName="oc" Mar 11 13:22:20 crc kubenswrapper[4816]: I0311 13:22:20.009139 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="a09c6fad-26f7-4ea2-84fc-5d2efb86fd02" containerName="oc" Mar 11 13:22:20 crc kubenswrapper[4816]: I0311 13:22:20.009400 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="a09c6fad-26f7-4ea2-84fc-5d2efb86fd02" containerName="oc" Mar 11 13:22:20 crc kubenswrapper[4816]: I0311 13:22:20.010314 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rbcl7/must-gather-k8xh5" Mar 11 13:22:20 crc kubenswrapper[4816]: I0311 13:22:20.012217 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-rbcl7"/"kube-root-ca.crt" Mar 11 13:22:20 crc kubenswrapper[4816]: I0311 13:22:20.014459 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-rbcl7"/"openshift-service-ca.crt" Mar 11 13:22:20 crc kubenswrapper[4816]: I0311 13:22:20.027115 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rbcl7/must-gather-k8xh5"] Mar 11 13:22:20 crc kubenswrapper[4816]: I0311 13:22:20.048356 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/240c7704-66e9-4d5b-9b4f-cf8a80365c26-must-gather-output\") pod \"must-gather-k8xh5\" (UID: \"240c7704-66e9-4d5b-9b4f-cf8a80365c26\") " pod="openshift-must-gather-rbcl7/must-gather-k8xh5" Mar 11 13:22:20 crc kubenswrapper[4816]: I0311 13:22:20.048407 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4wc7\" (UniqueName: \"kubernetes.io/projected/240c7704-66e9-4d5b-9b4f-cf8a80365c26-kube-api-access-g4wc7\") pod \"must-gather-k8xh5\" (UID: \"240c7704-66e9-4d5b-9b4f-cf8a80365c26\") " pod="openshift-must-gather-rbcl7/must-gather-k8xh5" 
Mar 11 13:22:20 crc kubenswrapper[4816]: I0311 13:22:20.150034 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/240c7704-66e9-4d5b-9b4f-cf8a80365c26-must-gather-output\") pod \"must-gather-k8xh5\" (UID: \"240c7704-66e9-4d5b-9b4f-cf8a80365c26\") " pod="openshift-must-gather-rbcl7/must-gather-k8xh5" Mar 11 13:22:20 crc kubenswrapper[4816]: I0311 13:22:20.150090 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4wc7\" (UniqueName: \"kubernetes.io/projected/240c7704-66e9-4d5b-9b4f-cf8a80365c26-kube-api-access-g4wc7\") pod \"must-gather-k8xh5\" (UID: \"240c7704-66e9-4d5b-9b4f-cf8a80365c26\") " pod="openshift-must-gather-rbcl7/must-gather-k8xh5" Mar 11 13:22:20 crc kubenswrapper[4816]: I0311 13:22:20.150650 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/240c7704-66e9-4d5b-9b4f-cf8a80365c26-must-gather-output\") pod \"must-gather-k8xh5\" (UID: \"240c7704-66e9-4d5b-9b4f-cf8a80365c26\") " pod="openshift-must-gather-rbcl7/must-gather-k8xh5" Mar 11 13:22:20 crc kubenswrapper[4816]: I0311 13:22:20.179320 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4wc7\" (UniqueName: \"kubernetes.io/projected/240c7704-66e9-4d5b-9b4f-cf8a80365c26-kube-api-access-g4wc7\") pod \"must-gather-k8xh5\" (UID: \"240c7704-66e9-4d5b-9b4f-cf8a80365c26\") " pod="openshift-must-gather-rbcl7/must-gather-k8xh5" Mar 11 13:22:20 crc kubenswrapper[4816]: I0311 13:22:20.329341 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rbcl7/must-gather-k8xh5" Mar 11 13:22:20 crc kubenswrapper[4816]: I0311 13:22:20.754626 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rbcl7/must-gather-k8xh5"] Mar 11 13:22:21 crc kubenswrapper[4816]: I0311 13:22:21.661212 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rbcl7/must-gather-k8xh5" event={"ID":"240c7704-66e9-4d5b-9b4f-cf8a80365c26","Type":"ContainerStarted","Data":"3114da44a220bd5bf16e3c17711dc438c13c8e76a81c57bfcca181e77302faaa"} Mar 11 13:22:27 crc kubenswrapper[4816]: I0311 13:22:27.729156 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rbcl7/must-gather-k8xh5" event={"ID":"240c7704-66e9-4d5b-9b4f-cf8a80365c26","Type":"ContainerStarted","Data":"54930770a8edcb6e0930ad6af1934aac20a12f0e3525f98e64859acac70909c5"} Mar 11 13:22:27 crc kubenswrapper[4816]: I0311 13:22:27.729737 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rbcl7/must-gather-k8xh5" event={"ID":"240c7704-66e9-4d5b-9b4f-cf8a80365c26","Type":"ContainerStarted","Data":"275f5d5dbbc9a09455fcf3424925e09fcf25e8e9a31a9d4c2991fb94c2921996"} Mar 11 13:22:27 crc kubenswrapper[4816]: I0311 13:22:27.763774 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-rbcl7/must-gather-k8xh5" podStartSLOduration=2.943973745 podStartE2EDuration="8.763752964s" podCreationTimestamp="2026-03-11 13:22:19 +0000 UTC" firstStartedPulling="2026-03-11 13:22:20.767890818 +0000 UTC m=+5027.359154785" lastFinishedPulling="2026-03-11 13:22:26.587670037 +0000 UTC m=+5033.178934004" observedRunningTime="2026-03-11 13:22:27.753533161 +0000 UTC m=+5034.344797148" watchObservedRunningTime="2026-03-11 13:22:27.763752964 +0000 UTC m=+5034.355016951" Mar 11 13:22:39 crc kubenswrapper[4816]: I0311 13:22:39.515469 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 13:22:39 crc kubenswrapper[4816]: I0311 13:22:39.515972 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 13:22:50 crc kubenswrapper[4816]: I0311 13:22:50.666308 4816 scope.go:117] "RemoveContainer" containerID="e4b94bbef2f14a1e765d933fe579ccf92b49db99e68b93650802fa89e27f09ad" Mar 11 13:23:09 crc kubenswrapper[4816]: I0311 13:23:09.515009 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 13:23:09 crc kubenswrapper[4816]: I0311 13:23:09.515498 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 13:23:32 crc kubenswrapper[4816]: I0311 13:23:32.636276 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66d56f6ff4-fjkn4_72237264-5d09-40bd-ba83-f30b76790cb6/manager/0.log" Mar 11 13:23:32 crc kubenswrapper[4816]: I0311 13:23:32.833928 4816 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp_b89bbc79-4a51-434d-916c-bf02869be9cb/util/0.log" Mar 11 13:23:32 crc kubenswrapper[4816]: I0311 13:23:32.990766 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp_b89bbc79-4a51-434d-916c-bf02869be9cb/util/0.log" Mar 11 13:23:33 crc kubenswrapper[4816]: I0311 13:23:33.068491 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp_b89bbc79-4a51-434d-916c-bf02869be9cb/pull/0.log" Mar 11 13:23:33 crc kubenswrapper[4816]: I0311 13:23:33.196714 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp_b89bbc79-4a51-434d-916c-bf02869be9cb/pull/0.log" Mar 11 13:23:33 crc kubenswrapper[4816]: I0311 13:23:33.356301 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp_b89bbc79-4a51-434d-916c-bf02869be9cb/util/0.log" Mar 11 13:23:33 crc kubenswrapper[4816]: I0311 13:23:33.378539 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp_b89bbc79-4a51-434d-916c-bf02869be9cb/pull/0.log" Mar 11 13:23:33 crc kubenswrapper[4816]: I0311 13:23:33.659068 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp_b89bbc79-4a51-434d-916c-bf02869be9cb/extract/0.log" Mar 11 13:23:33 crc kubenswrapper[4816]: I0311 13:23:33.914936 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5964f64c48-px2wm_c28c6622-633e-4e76-9c9a-eb732531fa1a/manager/0.log" Mar 11 13:23:34 crc 
kubenswrapper[4816]: I0311 13:23:34.014880 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-77b6666d85-66ctj_b941b0f1-4a8f-4517-af46-cc77892fe3d9/manager/0.log" Mar 11 13:23:34 crc kubenswrapper[4816]: I0311 13:23:34.198792 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d9d6b584d-8v46x_9e0c8832-9c20-44a9-933c-4a7fff032367/manager/0.log" Mar 11 13:23:34 crc kubenswrapper[4816]: I0311 13:23:34.705075 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6bbb499bbc-874hd_f37fb9b3-7b07-4188-b9ea-facfa5e945f0/manager/0.log" Mar 11 13:23:34 crc kubenswrapper[4816]: I0311 13:23:34.729704 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5995f4446f-hzd9q_a605e964-6e3c-4639-95d5-908f5d0ab7ef/manager/0.log" Mar 11 13:23:34 crc kubenswrapper[4816]: I0311 13:23:34.819385 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-984cd4dcf-g8cg2_6311ca5f-6f4c-4768-ae5e-75128be7f589/manager/0.log" Mar 11 13:23:35 crc kubenswrapper[4816]: I0311 13:23:35.264902 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-684f77d66d-zczdq_73e00d02-6599-4cab-a32b-8fe96b82951a/manager/0.log" Mar 11 13:23:35 crc kubenswrapper[4816]: I0311 13:23:35.493131 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-68f45f9d9f-bl9hm_bcfe1f90-2b5f-43b7-b798-0bad62ec53b2/manager/0.log" Mar 11 13:23:35 crc kubenswrapper[4816]: I0311 13:23:35.519521 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-658d4cdd5-wnsst_5d318732-8194-49eb-a2a3-c5b13ce843a7/manager/0.log" Mar 11 
13:23:35 crc kubenswrapper[4816]: I0311 13:23:35.838114 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-776c5696bf-h2vmc_4d4c74ff-52a2-4426-bd06-daa6e9b1a832/manager/0.log" Mar 11 13:23:36 crc kubenswrapper[4816]: I0311 13:23:36.336700 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5f4f55cb5c-56fsw_b16cacfc-8fc3-444d-a2d7-6ffeaf8362d5/manager/0.log" Mar 11 13:23:36 crc kubenswrapper[4816]: I0311 13:23:36.495580 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-569cc54c5-rxhkb_d1702062-37ba-43c0-becb-005e11f457a0/manager/0.log" Mar 11 13:23:36 crc kubenswrapper[4816]: I0311 13:23:36.597132 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l_78a7aebd-70a2-4608-a669-aea496cb6186/manager/0.log" Mar 11 13:23:36 crc kubenswrapper[4816]: I0311 13:23:36.903773 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-65b9994cf8-zz7rl_0347df32-1ff0-463e-b073-077df8f41595/operator/0.log" Mar 11 13:23:37 crc kubenswrapper[4816]: I0311 13:23:37.197541 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-zsrdm_4ed28d20-6f1f-4bb8-853d-284003a6b922/registry-server/0.log" Mar 11 13:23:37 crc kubenswrapper[4816]: I0311 13:23:37.457563 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bbc5b68f9-rr62t_6bbceab2-fe2b-4693-867d-aa2a51261611/manager/0.log" Mar 11 13:23:37 crc kubenswrapper[4816]: I0311 13:23:37.549339 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-574d45c66c-h7kgb_e04ad395-8120-4c57-8575-611fa438e8fb/manager/0.log" 
Mar 11 13:23:37 crc kubenswrapper[4816]: I0311 13:23:37.643552 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-dnqpf_8e810ef6-d3f5-4133-bce2-234df32b3d10/operator/0.log" Mar 11 13:23:37 crc kubenswrapper[4816]: I0311 13:23:37.845408 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-677c674df7-426qz_d7932403-615f-44e4-b195-4a83c19787ba/manager/0.log" Mar 11 13:23:37 crc kubenswrapper[4816]: I0311 13:23:37.890560 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7795b46f77-pt8n6_5f4b0b09-5704-432a-9cd4-82a296f3c467/manager/0.log" Mar 11 13:23:38 crc kubenswrapper[4816]: I0311 13:23:38.065278 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6cd66dbd4b-7ldx8_0ddf91ff-6d91-4213-8032-05f80408063d/manager/0.log" Mar 11 13:23:38 crc kubenswrapper[4816]: I0311 13:23:38.073368 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-k2rnj_282f8f05-9a84-4bb4-a122-ba8806324ca3/manager/0.log" Mar 11 13:23:38 crc kubenswrapper[4816]: I0311 13:23:38.248287 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6dd88c6f67-kx9nz_4126be7d-7ca8-4e68-94d4-ea21644fbd85/manager/0.log" Mar 11 13:23:39 crc kubenswrapper[4816]: I0311 13:23:39.514519 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 13:23:39 crc kubenswrapper[4816]: I0311 13:23:39.514834 4816 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 13:23:39 crc kubenswrapper[4816]: I0311 13:23:39.514876 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 13:23:39 crc kubenswrapper[4816]: I0311 13:23:39.515445 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911"} pod="openshift-machine-config-operator/machine-config-daemon-b4v82" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 13:23:39 crc kubenswrapper[4816]: I0311 13:23:39.515490 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" containerID="cri-o://1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911" gracePeriod=600 Mar 11 13:23:39 crc kubenswrapper[4816]: E0311 13:23:39.634999 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:23:40 crc kubenswrapper[4816]: I0311 13:23:40.276239 4816 generic.go:334] "Generic (PLEG): container finished" podID="7fdff21c-644f-4443-a268-f98c91ea120a" 
containerID="1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911" exitCode=0 Mar 11 13:23:40 crc kubenswrapper[4816]: I0311 13:23:40.276281 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerDied","Data":"1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911"} Mar 11 13:23:40 crc kubenswrapper[4816]: I0311 13:23:40.276341 4816 scope.go:117] "RemoveContainer" containerID="ddd6136328dc7ec62752abe3735d43f3f986aeada7e2653f4b4a88d5e086c6c4" Mar 11 13:23:40 crc kubenswrapper[4816]: I0311 13:23:40.276862 4816 scope.go:117] "RemoveContainer" containerID="1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911" Mar 11 13:23:40 crc kubenswrapper[4816]: E0311 13:23:40.277068 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:23:42 crc kubenswrapper[4816]: I0311 13:23:42.277659 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-677bd678f7-rb228_a8133b64-eb11-43ad-bf6e-a278af0ff466/manager/0.log" Mar 11 13:23:55 crc kubenswrapper[4816]: I0311 13:23:55.131371 4816 scope.go:117] "RemoveContainer" containerID="1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911" Mar 11 13:23:55 crc kubenswrapper[4816]: E0311 13:23:55.132111 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:23:59 crc kubenswrapper[4816]: I0311 13:23:59.487004 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-ksjm4_db49f265-44d3-468b-8e2f-2246b02b57be/control-plane-machine-set-operator/0.log" Mar 11 13:23:59 crc kubenswrapper[4816]: I0311 13:23:59.576063 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-t5t6b_cf7eaa86-2d32-4321-9016-e785320de3e2/kube-rbac-proxy/0.log" Mar 11 13:23:59 crc kubenswrapper[4816]: I0311 13:23:59.630348 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-t5t6b_cf7eaa86-2d32-4321-9016-e785320de3e2/machine-api-operator/0.log" Mar 11 13:24:00 crc kubenswrapper[4816]: I0311 13:24:00.149471 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553924-rvhwv"] Mar 11 13:24:00 crc kubenswrapper[4816]: I0311 13:24:00.150340 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553924-rvhwv" Mar 11 13:24:00 crc kubenswrapper[4816]: I0311 13:24:00.153040 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 13:24:00 crc kubenswrapper[4816]: I0311 13:24:00.153939 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 13:24:00 crc kubenswrapper[4816]: I0311 13:24:00.157220 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 13:24:00 crc kubenswrapper[4816]: I0311 13:24:00.170527 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553924-rvhwv"] Mar 11 13:24:00 crc kubenswrapper[4816]: I0311 13:24:00.319277 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbrmw\" (UniqueName: \"kubernetes.io/projected/c5e898cc-ff3f-4b4e-8fd4-4a85d3934314-kube-api-access-lbrmw\") pod \"auto-csr-approver-29553924-rvhwv\" (UID: \"c5e898cc-ff3f-4b4e-8fd4-4a85d3934314\") " pod="openshift-infra/auto-csr-approver-29553924-rvhwv" Mar 11 13:24:00 crc kubenswrapper[4816]: I0311 13:24:00.420665 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbrmw\" (UniqueName: \"kubernetes.io/projected/c5e898cc-ff3f-4b4e-8fd4-4a85d3934314-kube-api-access-lbrmw\") pod \"auto-csr-approver-29553924-rvhwv\" (UID: \"c5e898cc-ff3f-4b4e-8fd4-4a85d3934314\") " pod="openshift-infra/auto-csr-approver-29553924-rvhwv" Mar 11 13:24:00 crc kubenswrapper[4816]: I0311 13:24:00.439458 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbrmw\" (UniqueName: \"kubernetes.io/projected/c5e898cc-ff3f-4b4e-8fd4-4a85d3934314-kube-api-access-lbrmw\") pod \"auto-csr-approver-29553924-rvhwv\" (UID: \"c5e898cc-ff3f-4b4e-8fd4-4a85d3934314\") " 
pod="openshift-infra/auto-csr-approver-29553924-rvhwv" Mar 11 13:24:00 crc kubenswrapper[4816]: I0311 13:24:00.473651 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553924-rvhwv" Mar 11 13:24:00 crc kubenswrapper[4816]: I0311 13:24:00.902594 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553924-rvhwv"] Mar 11 13:24:01 crc kubenswrapper[4816]: I0311 13:24:01.436013 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553924-rvhwv" event={"ID":"c5e898cc-ff3f-4b4e-8fd4-4a85d3934314","Type":"ContainerStarted","Data":"ce200279374d5c25974cc9bda3280802f87c35f6ee351a2d31cadbe9da998a81"} Mar 11 13:24:03 crc kubenswrapper[4816]: I0311 13:24:03.458616 4816 generic.go:334] "Generic (PLEG): container finished" podID="c5e898cc-ff3f-4b4e-8fd4-4a85d3934314" containerID="289319f8c74a3f6941e1372e90484c85b50d9f435ddf8b7c0a56ed2e3b71fb7c" exitCode=0 Mar 11 13:24:03 crc kubenswrapper[4816]: I0311 13:24:03.458818 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553924-rvhwv" event={"ID":"c5e898cc-ff3f-4b4e-8fd4-4a85d3934314","Type":"ContainerDied","Data":"289319f8c74a3f6941e1372e90484c85b50d9f435ddf8b7c0a56ed2e3b71fb7c"} Mar 11 13:24:04 crc kubenswrapper[4816]: I0311 13:24:04.854107 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553924-rvhwv" Mar 11 13:24:04 crc kubenswrapper[4816]: I0311 13:24:04.998485 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbrmw\" (UniqueName: \"kubernetes.io/projected/c5e898cc-ff3f-4b4e-8fd4-4a85d3934314-kube-api-access-lbrmw\") pod \"c5e898cc-ff3f-4b4e-8fd4-4a85d3934314\" (UID: \"c5e898cc-ff3f-4b4e-8fd4-4a85d3934314\") " Mar 11 13:24:05 crc kubenswrapper[4816]: I0311 13:24:05.431190 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5e898cc-ff3f-4b4e-8fd4-4a85d3934314-kube-api-access-lbrmw" (OuterVolumeSpecName: "kube-api-access-lbrmw") pod "c5e898cc-ff3f-4b4e-8fd4-4a85d3934314" (UID: "c5e898cc-ff3f-4b4e-8fd4-4a85d3934314"). InnerVolumeSpecName "kube-api-access-lbrmw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 13:24:05 crc kubenswrapper[4816]: I0311 13:24:05.484889 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553924-rvhwv" event={"ID":"c5e898cc-ff3f-4b4e-8fd4-4a85d3934314","Type":"ContainerDied","Data":"ce200279374d5c25974cc9bda3280802f87c35f6ee351a2d31cadbe9da998a81"} Mar 11 13:24:05 crc kubenswrapper[4816]: I0311 13:24:05.484947 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce200279374d5c25974cc9bda3280802f87c35f6ee351a2d31cadbe9da998a81" Mar 11 13:24:05 crc kubenswrapper[4816]: I0311 13:24:05.485011 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553924-rvhwv" Mar 11 13:24:05 crc kubenswrapper[4816]: I0311 13:24:05.514535 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbrmw\" (UniqueName: \"kubernetes.io/projected/c5e898cc-ff3f-4b4e-8fd4-4a85d3934314-kube-api-access-lbrmw\") on node \"crc\" DevicePath \"\"" Mar 11 13:24:05 crc kubenswrapper[4816]: I0311 13:24:05.937961 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553918-7rzs7"] Mar 11 13:24:05 crc kubenswrapper[4816]: I0311 13:24:05.944655 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553918-7rzs7"] Mar 11 13:24:06 crc kubenswrapper[4816]: I0311 13:24:06.142630 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d95b7b2b-acc3-47bd-b762-29e39ca68f93" path="/var/lib/kubelet/pods/d95b7b2b-acc3-47bd-b762-29e39ca68f93/volumes" Mar 11 13:24:10 crc kubenswrapper[4816]: I0311 13:24:10.131059 4816 scope.go:117] "RemoveContainer" containerID="1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911" Mar 11 13:24:10 crc kubenswrapper[4816]: E0311 13:24:10.131652 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:24:14 crc kubenswrapper[4816]: I0311 13:24:14.708899 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-62cp5_e50b3f6b-4679-4337-a9cf-478aa2fb5800/cert-manager-controller/0.log" Mar 11 13:24:14 crc kubenswrapper[4816]: I0311 13:24:14.905105 4816 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-fgzw7_f7a5fee8-e8c0-47ae-b730-cf5c1d7133c8/cert-manager-cainjector/0.log" Mar 11 13:24:14 crc kubenswrapper[4816]: I0311 13:24:14.909805 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-2jk7k_0a41e6b9-3b80-4eed-a8db-65aa010f449d/cert-manager-webhook/0.log" Mar 11 13:24:22 crc kubenswrapper[4816]: I0311 13:24:22.131381 4816 scope.go:117] "RemoveContainer" containerID="1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911" Mar 11 13:24:22 crc kubenswrapper[4816]: E0311 13:24:22.132148 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:24:29 crc kubenswrapper[4816]: I0311 13:24:29.075288 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-px2gk_a822f6ee-e723-4f64-b4f6-c948dc948359/nmstate-console-plugin/0.log" Mar 11 13:24:29 crc kubenswrapper[4816]: I0311 13:24:29.252900 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-47rs2_fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9/nmstate-handler/0.log" Mar 11 13:24:29 crc kubenswrapper[4816]: I0311 13:24:29.304637 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-2snpd_7fb0dcd0-9411-49d6-a997-79d2099b2462/kube-rbac-proxy/0.log" Mar 11 13:24:29 crc kubenswrapper[4816]: I0311 13:24:29.347771 4816 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-2snpd_7fb0dcd0-9411-49d6-a997-79d2099b2462/nmstate-metrics/0.log" Mar 11 13:24:29 crc kubenswrapper[4816]: I0311 13:24:29.943478 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-g59xq_c1f09ebe-c0e1-415c-9ea9-42fc42240e94/nmstate-operator/0.log" Mar 11 13:24:29 crc kubenswrapper[4816]: I0311 13:24:29.992016 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-xq48v_1b664fad-a0fa-4442-bed2-3316eafbb78c/nmstate-webhook/0.log" Mar 11 13:24:37 crc kubenswrapper[4816]: I0311 13:24:37.129904 4816 scope.go:117] "RemoveContainer" containerID="1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911" Mar 11 13:24:37 crc kubenswrapper[4816]: E0311 13:24:37.130578 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:24:50 crc kubenswrapper[4816]: I0311 13:24:50.130826 4816 scope.go:117] "RemoveContainer" containerID="1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911" Mar 11 13:24:50 crc kubenswrapper[4816]: E0311 13:24:50.131800 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:24:50 crc kubenswrapper[4816]: I0311 
13:24:50.771797 4816 scope.go:117] "RemoveContainer" containerID="740acfe6fc04d23ba8749fd0de9541e5bd0ee02db427a2bd65a7b93925e05ec4" Mar 11 13:25:00 crc kubenswrapper[4816]: I0311 13:25:00.365798 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-srnjf_2af0656a-169d-42fe-8efb-5258bc56af56/kube-rbac-proxy/0.log" Mar 11 13:25:00 crc kubenswrapper[4816]: I0311 13:25:00.586699 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bjfwg_00616041-f382-4b2a-a7ef-b75a14621ce1/cp-frr-files/0.log" Mar 11 13:25:00 crc kubenswrapper[4816]: I0311 13:25:00.760556 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-srnjf_2af0656a-169d-42fe-8efb-5258bc56af56/controller/0.log" Mar 11 13:25:00 crc kubenswrapper[4816]: I0311 13:25:00.802488 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bjfwg_00616041-f382-4b2a-a7ef-b75a14621ce1/cp-frr-files/0.log" Mar 11 13:25:00 crc kubenswrapper[4816]: I0311 13:25:00.817843 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bjfwg_00616041-f382-4b2a-a7ef-b75a14621ce1/cp-reloader/0.log" Mar 11 13:25:00 crc kubenswrapper[4816]: I0311 13:25:00.832672 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bjfwg_00616041-f382-4b2a-a7ef-b75a14621ce1/cp-metrics/0.log" Mar 11 13:25:00 crc kubenswrapper[4816]: I0311 13:25:00.964260 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bjfwg_00616041-f382-4b2a-a7ef-b75a14621ce1/cp-reloader/0.log" Mar 11 13:25:01 crc kubenswrapper[4816]: I0311 13:25:01.116911 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bjfwg_00616041-f382-4b2a-a7ef-b75a14621ce1/cp-reloader/0.log" Mar 11 13:25:01 crc kubenswrapper[4816]: I0311 13:25:01.117100 4816 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-bjfwg_00616041-f382-4b2a-a7ef-b75a14621ce1/cp-frr-files/0.log" Mar 11 13:25:01 crc kubenswrapper[4816]: I0311 13:25:01.164395 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bjfwg_00616041-f382-4b2a-a7ef-b75a14621ce1/cp-metrics/0.log" Mar 11 13:25:01 crc kubenswrapper[4816]: I0311 13:25:01.176607 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bjfwg_00616041-f382-4b2a-a7ef-b75a14621ce1/cp-metrics/0.log" Mar 11 13:25:01 crc kubenswrapper[4816]: I0311 13:25:01.328718 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bjfwg_00616041-f382-4b2a-a7ef-b75a14621ce1/cp-frr-files/0.log" Mar 11 13:25:01 crc kubenswrapper[4816]: I0311 13:25:01.346831 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bjfwg_00616041-f382-4b2a-a7ef-b75a14621ce1/cp-metrics/0.log" Mar 11 13:25:01 crc kubenswrapper[4816]: I0311 13:25:01.367787 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bjfwg_00616041-f382-4b2a-a7ef-b75a14621ce1/cp-reloader/0.log" Mar 11 13:25:01 crc kubenswrapper[4816]: I0311 13:25:01.395217 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bjfwg_00616041-f382-4b2a-a7ef-b75a14621ce1/controller/0.log" Mar 11 13:25:01 crc kubenswrapper[4816]: I0311 13:25:01.537749 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bjfwg_00616041-f382-4b2a-a7ef-b75a14621ce1/frr-metrics/0.log" Mar 11 13:25:01 crc kubenswrapper[4816]: I0311 13:25:01.576956 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bjfwg_00616041-f382-4b2a-a7ef-b75a14621ce1/kube-rbac-proxy/0.log" Mar 11 13:25:01 crc kubenswrapper[4816]: I0311 13:25:01.596264 4816 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-bjfwg_00616041-f382-4b2a-a7ef-b75a14621ce1/kube-rbac-proxy-frr/0.log" Mar 11 13:25:01 crc kubenswrapper[4816]: I0311 13:25:01.707378 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bjfwg_00616041-f382-4b2a-a7ef-b75a14621ce1/reloader/0.log" Mar 11 13:25:01 crc kubenswrapper[4816]: I0311 13:25:01.853380 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-h8scg_6512814f-09cf-4b97-a1d6-ec99bcbf1525/frr-k8s-webhook-server/0.log" Mar 11 13:25:02 crc kubenswrapper[4816]: I0311 13:25:02.030546 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5679b59769-8stwg_7f7c9c4d-3a3f-4524-8964-8a99f24c2786/manager/0.log" Mar 11 13:25:02 crc kubenswrapper[4816]: I0311 13:25:02.127598 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-96bb59846-7z5mz_72342d10-d8c0-4f04-9554-e57c84d77653/webhook-server/0.log" Mar 11 13:25:02 crc kubenswrapper[4816]: I0311 13:25:02.130382 4816 scope.go:117] "RemoveContainer" containerID="1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911" Mar 11 13:25:02 crc kubenswrapper[4816]: E0311 13:25:02.130629 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:25:02 crc kubenswrapper[4816]: I0311 13:25:02.271074 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-wqwrt_43ec0f0d-8425-4dc4-9aa2-f1f85a26548c/kube-rbac-proxy/0.log" Mar 11 13:25:02 crc kubenswrapper[4816]: 
I0311 13:25:02.712101 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-wqwrt_43ec0f0d-8425-4dc4-9aa2-f1f85a26548c/speaker/0.log" Mar 11 13:25:03 crc kubenswrapper[4816]: I0311 13:25:03.004554 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bjfwg_00616041-f382-4b2a-a7ef-b75a14621ce1/frr/0.log" Mar 11 13:25:13 crc kubenswrapper[4816]: I0311 13:25:13.129919 4816 scope.go:117] "RemoveContainer" containerID="1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911" Mar 11 13:25:13 crc kubenswrapper[4816]: E0311 13:25:13.130682 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:25:17 crc kubenswrapper[4816]: I0311 13:25:17.319340 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp_2a03942f-8b0e-4041-8843-ad5e6cedc6b0/util/0.log" Mar 11 13:25:17 crc kubenswrapper[4816]: I0311 13:25:17.515059 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp_2a03942f-8b0e-4041-8843-ad5e6cedc6b0/pull/0.log" Mar 11 13:25:17 crc kubenswrapper[4816]: I0311 13:25:17.547105 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp_2a03942f-8b0e-4041-8843-ad5e6cedc6b0/util/0.log" Mar 11 13:25:17 crc kubenswrapper[4816]: I0311 13:25:17.564211 4816 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp_2a03942f-8b0e-4041-8843-ad5e6cedc6b0/pull/0.log" Mar 11 13:25:17 crc kubenswrapper[4816]: I0311 13:25:17.724212 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp_2a03942f-8b0e-4041-8843-ad5e6cedc6b0/extract/0.log" Mar 11 13:25:17 crc kubenswrapper[4816]: I0311 13:25:17.725034 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp_2a03942f-8b0e-4041-8843-ad5e6cedc6b0/pull/0.log" Mar 11 13:25:17 crc kubenswrapper[4816]: I0311 13:25:17.743216 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp_2a03942f-8b0e-4041-8843-ad5e6cedc6b0/util/0.log" Mar 11 13:25:17 crc kubenswrapper[4816]: I0311 13:25:17.871460 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8_5dff60f3-3acf-4dfb-9098-917736f61c0c/util/0.log" Mar 11 13:25:18 crc kubenswrapper[4816]: I0311 13:25:18.028615 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8_5dff60f3-3acf-4dfb-9098-917736f61c0c/util/0.log" Mar 11 13:25:18 crc kubenswrapper[4816]: I0311 13:25:18.032241 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8_5dff60f3-3acf-4dfb-9098-917736f61c0c/pull/0.log" Mar 11 13:25:18 crc kubenswrapper[4816]: I0311 13:25:18.055451 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8_5dff60f3-3acf-4dfb-9098-917736f61c0c/pull/0.log" Mar 11 
13:25:18 crc kubenswrapper[4816]: I0311 13:25:18.521880 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8_5dff60f3-3acf-4dfb-9098-917736f61c0c/pull/0.log" Mar 11 13:25:18 crc kubenswrapper[4816]: I0311 13:25:18.570181 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8_5dff60f3-3acf-4dfb-9098-917736f61c0c/extract/0.log" Mar 11 13:25:18 crc kubenswrapper[4816]: I0311 13:25:18.591340 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8_5dff60f3-3acf-4dfb-9098-917736f61c0c/util/0.log" Mar 11 13:25:18 crc kubenswrapper[4816]: I0311 13:25:18.710462 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2_f9c6bbc7-62af-4c3a-ac05-1897b9f00080/util/0.log" Mar 11 13:25:18 crc kubenswrapper[4816]: I0311 13:25:18.886457 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2_f9c6bbc7-62af-4c3a-ac05-1897b9f00080/pull/0.log" Mar 11 13:25:18 crc kubenswrapper[4816]: I0311 13:25:18.939094 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2_f9c6bbc7-62af-4c3a-ac05-1897b9f00080/util/0.log" Mar 11 13:25:18 crc kubenswrapper[4816]: I0311 13:25:18.957148 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2_f9c6bbc7-62af-4c3a-ac05-1897b9f00080/pull/0.log" Mar 11 13:25:19 crc kubenswrapper[4816]: I0311 13:25:19.115030 4816 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2_f9c6bbc7-62af-4c3a-ac05-1897b9f00080/util/0.log" Mar 11 13:25:19 crc kubenswrapper[4816]: I0311 13:25:19.124105 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2_f9c6bbc7-62af-4c3a-ac05-1897b9f00080/pull/0.log" Mar 11 13:25:19 crc kubenswrapper[4816]: I0311 13:25:19.137676 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2_f9c6bbc7-62af-4c3a-ac05-1897b9f00080/extract/0.log" Mar 11 13:25:19 crc kubenswrapper[4816]: I0311 13:25:19.273435 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-82vz9_2b81c3bf-499d-48bd-869b-671fefa1ba81/extract-utilities/0.log" Mar 11 13:25:19 crc kubenswrapper[4816]: I0311 13:25:19.469357 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-82vz9_2b81c3bf-499d-48bd-869b-671fefa1ba81/extract-content/0.log" Mar 11 13:25:19 crc kubenswrapper[4816]: I0311 13:25:19.493023 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-82vz9_2b81c3bf-499d-48bd-869b-671fefa1ba81/extract-utilities/0.log" Mar 11 13:25:19 crc kubenswrapper[4816]: I0311 13:25:19.503870 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-82vz9_2b81c3bf-499d-48bd-869b-671fefa1ba81/extract-content/0.log" Mar 11 13:25:19 crc kubenswrapper[4816]: I0311 13:25:19.938641 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-82vz9_2b81c3bf-499d-48bd-869b-671fefa1ba81/extract-content/0.log" Mar 11 13:25:19 crc kubenswrapper[4816]: I0311 13:25:19.993011 4816 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-82vz9_2b81c3bf-499d-48bd-869b-671fefa1ba81/extract-utilities/0.log" Mar 11 13:25:20 crc kubenswrapper[4816]: I0311 13:25:20.161613 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2l2xd_a13e0873-9c6c-46d7-b0bf-4ef50c40a918/extract-utilities/0.log" Mar 11 13:25:20 crc kubenswrapper[4816]: I0311 13:25:20.323108 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-82vz9_2b81c3bf-499d-48bd-869b-671fefa1ba81/registry-server/0.log" Mar 11 13:25:20 crc kubenswrapper[4816]: I0311 13:25:20.353174 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2l2xd_a13e0873-9c6c-46d7-b0bf-4ef50c40a918/extract-content/0.log" Mar 11 13:25:20 crc kubenswrapper[4816]: I0311 13:25:20.395498 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2l2xd_a13e0873-9c6c-46d7-b0bf-4ef50c40a918/extract-content/0.log" Mar 11 13:25:20 crc kubenswrapper[4816]: I0311 13:25:20.397609 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2l2xd_a13e0873-9c6c-46d7-b0bf-4ef50c40a918/extract-utilities/0.log" Mar 11 13:25:20 crc kubenswrapper[4816]: I0311 13:25:20.554960 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2l2xd_a13e0873-9c6c-46d7-b0bf-4ef50c40a918/extract-utilities/0.log" Mar 11 13:25:20 crc kubenswrapper[4816]: I0311 13:25:20.643484 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2l2xd_a13e0873-9c6c-46d7-b0bf-4ef50c40a918/extract-content/0.log" Mar 11 13:25:20 crc kubenswrapper[4816]: I0311 13:25:20.744729 4816 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-2l2xd_a13e0873-9c6c-46d7-b0bf-4ef50c40a918/registry-server/0.log" Mar 11 13:25:20 crc kubenswrapper[4816]: I0311 13:25:20.782780 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-m586v_e86ee6f4-c5ee-40dd-8e60-977add936dc1/marketplace-operator/0.log" Mar 11 13:25:20 crc kubenswrapper[4816]: I0311 13:25:20.854553 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9f9jq_991327ed-0ad5-4161-a218-598e50bbafe9/extract-utilities/0.log" Mar 11 13:25:21 crc kubenswrapper[4816]: I0311 13:25:21.043029 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9f9jq_991327ed-0ad5-4161-a218-598e50bbafe9/extract-content/0.log" Mar 11 13:25:21 crc kubenswrapper[4816]: I0311 13:25:21.052018 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9f9jq_991327ed-0ad5-4161-a218-598e50bbafe9/extract-utilities/0.log" Mar 11 13:25:21 crc kubenswrapper[4816]: I0311 13:25:21.068528 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9f9jq_991327ed-0ad5-4161-a218-598e50bbafe9/extract-content/0.log" Mar 11 13:25:21 crc kubenswrapper[4816]: I0311 13:25:21.368813 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9f9jq_991327ed-0ad5-4161-a218-598e50bbafe9/extract-utilities/0.log" Mar 11 13:25:21 crc kubenswrapper[4816]: I0311 13:25:21.383870 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4czr8_08bf2596-9393-42d3-9b76-461be3ee0c22/extract-utilities/0.log" Mar 11 13:25:21 crc kubenswrapper[4816]: I0311 13:25:21.425623 4816 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-9f9jq_991327ed-0ad5-4161-a218-598e50bbafe9/extract-content/0.log" Mar 11 13:25:21 crc kubenswrapper[4816]: I0311 13:25:21.479752 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9f9jq_991327ed-0ad5-4161-a218-598e50bbafe9/registry-server/0.log" Mar 11 13:25:21 crc kubenswrapper[4816]: I0311 13:25:21.589871 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4czr8_08bf2596-9393-42d3-9b76-461be3ee0c22/extract-content/0.log" Mar 11 13:25:21 crc kubenswrapper[4816]: I0311 13:25:21.590872 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4czr8_08bf2596-9393-42d3-9b76-461be3ee0c22/extract-utilities/0.log" Mar 11 13:25:21 crc kubenswrapper[4816]: I0311 13:25:21.614136 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4czr8_08bf2596-9393-42d3-9b76-461be3ee0c22/extract-content/0.log" Mar 11 13:25:21 crc kubenswrapper[4816]: I0311 13:25:21.764524 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4czr8_08bf2596-9393-42d3-9b76-461be3ee0c22/extract-content/0.log" Mar 11 13:25:21 crc kubenswrapper[4816]: I0311 13:25:21.784543 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4czr8_08bf2596-9393-42d3-9b76-461be3ee0c22/extract-utilities/0.log" Mar 11 13:25:22 crc kubenswrapper[4816]: I0311 13:25:22.420230 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4czr8_08bf2596-9393-42d3-9b76-461be3ee0c22/registry-server/0.log" Mar 11 13:25:27 crc kubenswrapper[4816]: I0311 13:25:27.130327 4816 scope.go:117] "RemoveContainer" containerID="1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911" Mar 11 13:25:27 crc kubenswrapper[4816]: E0311 13:25:27.130873 
4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:25:38 crc kubenswrapper[4816]: I0311 13:25:38.131159 4816 scope.go:117] "RemoveContainer" containerID="1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911" Mar 11 13:25:38 crc kubenswrapper[4816]: E0311 13:25:38.132076 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:25:50 crc kubenswrapper[4816]: I0311 13:25:50.131557 4816 scope.go:117] "RemoveContainer" containerID="1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911" Mar 11 13:25:50 crc kubenswrapper[4816]: E0311 13:25:50.132558 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:25:50 crc kubenswrapper[4816]: I0311 13:25:50.853753 4816 scope.go:117] "RemoveContainer" containerID="3c47893cabbfc635edaea2ea48266ffc815a61e1e094878326c38fe6119ee1b9" Mar 11 13:26:00 crc kubenswrapper[4816]: I0311 
13:26:00.159019 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553926-z2j68"] Mar 11 13:26:00 crc kubenswrapper[4816]: E0311 13:26:00.160093 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5e898cc-ff3f-4b4e-8fd4-4a85d3934314" containerName="oc" Mar 11 13:26:00 crc kubenswrapper[4816]: I0311 13:26:00.160115 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5e898cc-ff3f-4b4e-8fd4-4a85d3934314" containerName="oc" Mar 11 13:26:00 crc kubenswrapper[4816]: I0311 13:26:00.160785 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5e898cc-ff3f-4b4e-8fd4-4a85d3934314" containerName="oc" Mar 11 13:26:00 crc kubenswrapper[4816]: I0311 13:26:00.161486 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553926-z2j68" Mar 11 13:26:00 crc kubenswrapper[4816]: I0311 13:26:00.165350 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 13:26:00 crc kubenswrapper[4816]: I0311 13:26:00.165564 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 13:26:00 crc kubenswrapper[4816]: I0311 13:26:00.165970 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 13:26:00 crc kubenswrapper[4816]: I0311 13:26:00.183753 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn46t\" (UniqueName: \"kubernetes.io/projected/bb680b80-e315-429b-abf6-ff316b5086d2-kube-api-access-gn46t\") pod \"auto-csr-approver-29553926-z2j68\" (UID: \"bb680b80-e315-429b-abf6-ff316b5086d2\") " pod="openshift-infra/auto-csr-approver-29553926-z2j68" Mar 11 13:26:00 crc kubenswrapper[4816]: I0311 13:26:00.184042 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-infra/auto-csr-approver-29553926-z2j68"] Mar 11 13:26:00 crc kubenswrapper[4816]: I0311 13:26:00.285348 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gn46t\" (UniqueName: \"kubernetes.io/projected/bb680b80-e315-429b-abf6-ff316b5086d2-kube-api-access-gn46t\") pod \"auto-csr-approver-29553926-z2j68\" (UID: \"bb680b80-e315-429b-abf6-ff316b5086d2\") " pod="openshift-infra/auto-csr-approver-29553926-z2j68" Mar 11 13:26:00 crc kubenswrapper[4816]: I0311 13:26:00.309598 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn46t\" (UniqueName: \"kubernetes.io/projected/bb680b80-e315-429b-abf6-ff316b5086d2-kube-api-access-gn46t\") pod \"auto-csr-approver-29553926-z2j68\" (UID: \"bb680b80-e315-429b-abf6-ff316b5086d2\") " pod="openshift-infra/auto-csr-approver-29553926-z2j68" Mar 11 13:26:00 crc kubenswrapper[4816]: I0311 13:26:00.499636 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553926-z2j68" Mar 11 13:26:00 crc kubenswrapper[4816]: I0311 13:26:00.788077 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553926-z2j68"] Mar 11 13:26:01 crc kubenswrapper[4816]: I0311 13:26:01.415529 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553926-z2j68" event={"ID":"bb680b80-e315-429b-abf6-ff316b5086d2","Type":"ContainerStarted","Data":"65541b39ae0514d6122a7c1a032d1898b30e5c799b8a8100a8cab098ffcc87ce"} Mar 11 13:26:03 crc kubenswrapper[4816]: I0311 13:26:03.436074 4816 generic.go:334] "Generic (PLEG): container finished" podID="bb680b80-e315-429b-abf6-ff316b5086d2" containerID="2b01fcc246d768dc9f3a808039fe09e1a0fa481d4f86fec7e3628f8562f3e719" exitCode=0 Mar 11 13:26:03 crc kubenswrapper[4816]: I0311 13:26:03.436267 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29553926-z2j68" event={"ID":"bb680b80-e315-429b-abf6-ff316b5086d2","Type":"ContainerDied","Data":"2b01fcc246d768dc9f3a808039fe09e1a0fa481d4f86fec7e3628f8562f3e719"} Mar 11 13:26:04 crc kubenswrapper[4816]: I0311 13:26:04.807672 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553926-z2j68" Mar 11 13:26:04 crc kubenswrapper[4816]: I0311 13:26:04.857760 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gn46t\" (UniqueName: \"kubernetes.io/projected/bb680b80-e315-429b-abf6-ff316b5086d2-kube-api-access-gn46t\") pod \"bb680b80-e315-429b-abf6-ff316b5086d2\" (UID: \"bb680b80-e315-429b-abf6-ff316b5086d2\") " Mar 11 13:26:04 crc kubenswrapper[4816]: I0311 13:26:04.863220 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb680b80-e315-429b-abf6-ff316b5086d2-kube-api-access-gn46t" (OuterVolumeSpecName: "kube-api-access-gn46t") pod "bb680b80-e315-429b-abf6-ff316b5086d2" (UID: "bb680b80-e315-429b-abf6-ff316b5086d2"). InnerVolumeSpecName "kube-api-access-gn46t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 13:26:04 crc kubenswrapper[4816]: I0311 13:26:04.959944 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gn46t\" (UniqueName: \"kubernetes.io/projected/bb680b80-e315-429b-abf6-ff316b5086d2-kube-api-access-gn46t\") on node \"crc\" DevicePath \"\"" Mar 11 13:26:05 crc kubenswrapper[4816]: I0311 13:26:05.131497 4816 scope.go:117] "RemoveContainer" containerID="1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911" Mar 11 13:26:05 crc kubenswrapper[4816]: E0311 13:26:05.131886 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:26:05 crc kubenswrapper[4816]: I0311 13:26:05.455381 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553926-z2j68" event={"ID":"bb680b80-e315-429b-abf6-ff316b5086d2","Type":"ContainerDied","Data":"65541b39ae0514d6122a7c1a032d1898b30e5c799b8a8100a8cab098ffcc87ce"} Mar 11 13:26:05 crc kubenswrapper[4816]: I0311 13:26:05.455433 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553926-z2j68" Mar 11 13:26:05 crc kubenswrapper[4816]: I0311 13:26:05.455444 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65541b39ae0514d6122a7c1a032d1898b30e5c799b8a8100a8cab098ffcc87ce" Mar 11 13:26:05 crc kubenswrapper[4816]: I0311 13:26:05.884003 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553920-h6hh5"] Mar 11 13:26:05 crc kubenswrapper[4816]: I0311 13:26:05.895121 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553920-h6hh5"] Mar 11 13:26:06 crc kubenswrapper[4816]: I0311 13:26:06.148244 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="193e1468-2f5b-4e66-94f3-a7fc184c7e01" path="/var/lib/kubelet/pods/193e1468-2f5b-4e66-94f3-a7fc184c7e01/volumes" Mar 11 13:26:09 crc kubenswrapper[4816]: I0311 13:26:09.774669 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-h5lmg"] Mar 11 13:26:09 crc kubenswrapper[4816]: E0311 13:26:09.775564 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb680b80-e315-429b-abf6-ff316b5086d2" containerName="oc" Mar 11 13:26:09 crc kubenswrapper[4816]: I0311 13:26:09.775592 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb680b80-e315-429b-abf6-ff316b5086d2" containerName="oc" Mar 11 13:26:09 crc kubenswrapper[4816]: I0311 13:26:09.775915 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb680b80-e315-429b-abf6-ff316b5086d2" containerName="oc" Mar 11 13:26:09 crc kubenswrapper[4816]: I0311 13:26:09.777944 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h5lmg" Mar 11 13:26:09 crc kubenswrapper[4816]: I0311 13:26:09.809665 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h5lmg"] Mar 11 13:26:09 crc kubenswrapper[4816]: I0311 13:26:09.944063 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5ab741b-37be-41a8-ac90-39c44e1c3cce-utilities\") pod \"certified-operators-h5lmg\" (UID: \"f5ab741b-37be-41a8-ac90-39c44e1c3cce\") " pod="openshift-marketplace/certified-operators-h5lmg" Mar 11 13:26:09 crc kubenswrapper[4816]: I0311 13:26:09.944141 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksqf8\" (UniqueName: \"kubernetes.io/projected/f5ab741b-37be-41a8-ac90-39c44e1c3cce-kube-api-access-ksqf8\") pod \"certified-operators-h5lmg\" (UID: \"f5ab741b-37be-41a8-ac90-39c44e1c3cce\") " pod="openshift-marketplace/certified-operators-h5lmg" Mar 11 13:26:09 crc kubenswrapper[4816]: I0311 13:26:09.944233 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5ab741b-37be-41a8-ac90-39c44e1c3cce-catalog-content\") pod \"certified-operators-h5lmg\" (UID: \"f5ab741b-37be-41a8-ac90-39c44e1c3cce\") " pod="openshift-marketplace/certified-operators-h5lmg" Mar 11 13:26:10 crc kubenswrapper[4816]: I0311 13:26:10.045208 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5ab741b-37be-41a8-ac90-39c44e1c3cce-catalog-content\") pod \"certified-operators-h5lmg\" (UID: \"f5ab741b-37be-41a8-ac90-39c44e1c3cce\") " pod="openshift-marketplace/certified-operators-h5lmg" Mar 11 13:26:10 crc kubenswrapper[4816]: I0311 13:26:10.045324 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5ab741b-37be-41a8-ac90-39c44e1c3cce-utilities\") pod \"certified-operators-h5lmg\" (UID: \"f5ab741b-37be-41a8-ac90-39c44e1c3cce\") " pod="openshift-marketplace/certified-operators-h5lmg" Mar 11 13:26:10 crc kubenswrapper[4816]: I0311 13:26:10.045349 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksqf8\" (UniqueName: \"kubernetes.io/projected/f5ab741b-37be-41a8-ac90-39c44e1c3cce-kube-api-access-ksqf8\") pod \"certified-operators-h5lmg\" (UID: \"f5ab741b-37be-41a8-ac90-39c44e1c3cce\") " pod="openshift-marketplace/certified-operators-h5lmg" Mar 11 13:26:10 crc kubenswrapper[4816]: I0311 13:26:10.045853 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5ab741b-37be-41a8-ac90-39c44e1c3cce-catalog-content\") pod \"certified-operators-h5lmg\" (UID: \"f5ab741b-37be-41a8-ac90-39c44e1c3cce\") " pod="openshift-marketplace/certified-operators-h5lmg" Mar 11 13:26:10 crc kubenswrapper[4816]: I0311 13:26:10.045907 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5ab741b-37be-41a8-ac90-39c44e1c3cce-utilities\") pod \"certified-operators-h5lmg\" (UID: \"f5ab741b-37be-41a8-ac90-39c44e1c3cce\") " pod="openshift-marketplace/certified-operators-h5lmg" Mar 11 13:26:10 crc kubenswrapper[4816]: I0311 13:26:10.084365 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksqf8\" (UniqueName: \"kubernetes.io/projected/f5ab741b-37be-41a8-ac90-39c44e1c3cce-kube-api-access-ksqf8\") pod \"certified-operators-h5lmg\" (UID: \"f5ab741b-37be-41a8-ac90-39c44e1c3cce\") " pod="openshift-marketplace/certified-operators-h5lmg" Mar 11 13:26:10 crc kubenswrapper[4816]: I0311 13:26:10.111055 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h5lmg" Mar 11 13:26:10 crc kubenswrapper[4816]: I0311 13:26:10.660358 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h5lmg"] Mar 11 13:26:11 crc kubenswrapper[4816]: I0311 13:26:11.507481 4816 generic.go:334] "Generic (PLEG): container finished" podID="f5ab741b-37be-41a8-ac90-39c44e1c3cce" containerID="d7ecaa0f879577606667ec6d103914507b4b06cfec60b28fc41f33027ad79006" exitCode=0 Mar 11 13:26:11 crc kubenswrapper[4816]: I0311 13:26:11.507631 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h5lmg" event={"ID":"f5ab741b-37be-41a8-ac90-39c44e1c3cce","Type":"ContainerDied","Data":"d7ecaa0f879577606667ec6d103914507b4b06cfec60b28fc41f33027ad79006"} Mar 11 13:26:11 crc kubenswrapper[4816]: I0311 13:26:11.507897 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h5lmg" event={"ID":"f5ab741b-37be-41a8-ac90-39c44e1c3cce","Type":"ContainerStarted","Data":"88705377053d43154d3881339c36623191d31a836df133a4f330805ec483a271"} Mar 11 13:26:12 crc kubenswrapper[4816]: I0311 13:26:12.523546 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h5lmg" event={"ID":"f5ab741b-37be-41a8-ac90-39c44e1c3cce","Type":"ContainerStarted","Data":"4ee85a7d424d8c2112523f9f25b5234e252e3cb4c06332ee42ac3e00fbed9378"} Mar 11 13:26:13 crc kubenswrapper[4816]: I0311 13:26:13.538657 4816 generic.go:334] "Generic (PLEG): container finished" podID="f5ab741b-37be-41a8-ac90-39c44e1c3cce" containerID="4ee85a7d424d8c2112523f9f25b5234e252e3cb4c06332ee42ac3e00fbed9378" exitCode=0 Mar 11 13:26:13 crc kubenswrapper[4816]: I0311 13:26:13.538714 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h5lmg" 
event={"ID":"f5ab741b-37be-41a8-ac90-39c44e1c3cce","Type":"ContainerDied","Data":"4ee85a7d424d8c2112523f9f25b5234e252e3cb4c06332ee42ac3e00fbed9378"} Mar 11 13:26:15 crc kubenswrapper[4816]: I0311 13:26:15.566937 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h5lmg" event={"ID":"f5ab741b-37be-41a8-ac90-39c44e1c3cce","Type":"ContainerStarted","Data":"1c78198eaa5ef01dd82d28c6d1f5ea83a3d3649257a14cd5bfe7c73ebd2c7efd"} Mar 11 13:26:15 crc kubenswrapper[4816]: I0311 13:26:15.592101 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-h5lmg" podStartSLOduration=3.562510782 podStartE2EDuration="6.592085642s" podCreationTimestamp="2026-03-11 13:26:09 +0000 UTC" firstStartedPulling="2026-03-11 13:26:11.512345727 +0000 UTC m=+5258.103609724" lastFinishedPulling="2026-03-11 13:26:14.541920607 +0000 UTC m=+5261.133184584" observedRunningTime="2026-03-11 13:26:15.587636507 +0000 UTC m=+5262.178900474" watchObservedRunningTime="2026-03-11 13:26:15.592085642 +0000 UTC m=+5262.183349609" Mar 11 13:26:17 crc kubenswrapper[4816]: I0311 13:26:17.131712 4816 scope.go:117] "RemoveContainer" containerID="1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911" Mar 11 13:26:17 crc kubenswrapper[4816]: E0311 13:26:17.132576 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:26:20 crc kubenswrapper[4816]: I0311 13:26:20.111218 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-h5lmg" Mar 11 13:26:20 crc 
kubenswrapper[4816]: I0311 13:26:20.111669 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-h5lmg" Mar 11 13:26:20 crc kubenswrapper[4816]: I0311 13:26:20.191460 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-h5lmg" Mar 11 13:26:20 crc kubenswrapper[4816]: I0311 13:26:20.668137 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-h5lmg" Mar 11 13:26:20 crc kubenswrapper[4816]: I0311 13:26:20.743696 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h5lmg"] Mar 11 13:26:22 crc kubenswrapper[4816]: I0311 13:26:22.653987 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-h5lmg" podUID="f5ab741b-37be-41a8-ac90-39c44e1c3cce" containerName="registry-server" containerID="cri-o://1c78198eaa5ef01dd82d28c6d1f5ea83a3d3649257a14cd5bfe7c73ebd2c7efd" gracePeriod=2 Mar 11 13:26:23 crc kubenswrapper[4816]: I0311 13:26:23.097831 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h5lmg" Mar 11 13:26:23 crc kubenswrapper[4816]: I0311 13:26:23.235123 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5ab741b-37be-41a8-ac90-39c44e1c3cce-utilities\") pod \"f5ab741b-37be-41a8-ac90-39c44e1c3cce\" (UID: \"f5ab741b-37be-41a8-ac90-39c44e1c3cce\") " Mar 11 13:26:23 crc kubenswrapper[4816]: I0311 13:26:23.235327 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5ab741b-37be-41a8-ac90-39c44e1c3cce-catalog-content\") pod \"f5ab741b-37be-41a8-ac90-39c44e1c3cce\" (UID: \"f5ab741b-37be-41a8-ac90-39c44e1c3cce\") " Mar 11 13:26:23 crc kubenswrapper[4816]: I0311 13:26:23.235385 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksqf8\" (UniqueName: \"kubernetes.io/projected/f5ab741b-37be-41a8-ac90-39c44e1c3cce-kube-api-access-ksqf8\") pod \"f5ab741b-37be-41a8-ac90-39c44e1c3cce\" (UID: \"f5ab741b-37be-41a8-ac90-39c44e1c3cce\") " Mar 11 13:26:23 crc kubenswrapper[4816]: I0311 13:26:23.236306 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5ab741b-37be-41a8-ac90-39c44e1c3cce-utilities" (OuterVolumeSpecName: "utilities") pod "f5ab741b-37be-41a8-ac90-39c44e1c3cce" (UID: "f5ab741b-37be-41a8-ac90-39c44e1c3cce"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 13:26:23 crc kubenswrapper[4816]: I0311 13:26:23.241479 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5ab741b-37be-41a8-ac90-39c44e1c3cce-kube-api-access-ksqf8" (OuterVolumeSpecName: "kube-api-access-ksqf8") pod "f5ab741b-37be-41a8-ac90-39c44e1c3cce" (UID: "f5ab741b-37be-41a8-ac90-39c44e1c3cce"). InnerVolumeSpecName "kube-api-access-ksqf8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 13:26:23 crc kubenswrapper[4816]: I0311 13:26:23.337238 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5ab741b-37be-41a8-ac90-39c44e1c3cce-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 13:26:23 crc kubenswrapper[4816]: I0311 13:26:23.337629 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksqf8\" (UniqueName: \"kubernetes.io/projected/f5ab741b-37be-41a8-ac90-39c44e1c3cce-kube-api-access-ksqf8\") on node \"crc\" DevicePath \"\"" Mar 11 13:26:23 crc kubenswrapper[4816]: I0311 13:26:23.658606 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5ab741b-37be-41a8-ac90-39c44e1c3cce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f5ab741b-37be-41a8-ac90-39c44e1c3cce" (UID: "f5ab741b-37be-41a8-ac90-39c44e1c3cce"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 13:26:23 crc kubenswrapper[4816]: I0311 13:26:23.667431 4816 generic.go:334] "Generic (PLEG): container finished" podID="f5ab741b-37be-41a8-ac90-39c44e1c3cce" containerID="1c78198eaa5ef01dd82d28c6d1f5ea83a3d3649257a14cd5bfe7c73ebd2c7efd" exitCode=0 Mar 11 13:26:23 crc kubenswrapper[4816]: I0311 13:26:23.667515 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h5lmg" Mar 11 13:26:23 crc kubenswrapper[4816]: I0311 13:26:23.667526 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h5lmg" event={"ID":"f5ab741b-37be-41a8-ac90-39c44e1c3cce","Type":"ContainerDied","Data":"1c78198eaa5ef01dd82d28c6d1f5ea83a3d3649257a14cd5bfe7c73ebd2c7efd"} Mar 11 13:26:23 crc kubenswrapper[4816]: I0311 13:26:23.667969 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h5lmg" event={"ID":"f5ab741b-37be-41a8-ac90-39c44e1c3cce","Type":"ContainerDied","Data":"88705377053d43154d3881339c36623191d31a836df133a4f330805ec483a271"} Mar 11 13:26:23 crc kubenswrapper[4816]: I0311 13:26:23.668015 4816 scope.go:117] "RemoveContainer" containerID="1c78198eaa5ef01dd82d28c6d1f5ea83a3d3649257a14cd5bfe7c73ebd2c7efd" Mar 11 13:26:23 crc kubenswrapper[4816]: I0311 13:26:23.699791 4816 scope.go:117] "RemoveContainer" containerID="4ee85a7d424d8c2112523f9f25b5234e252e3cb4c06332ee42ac3e00fbed9378" Mar 11 13:26:23 crc kubenswrapper[4816]: I0311 13:26:23.731871 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h5lmg"] Mar 11 13:26:23 crc kubenswrapper[4816]: I0311 13:26:23.742305 4816 scope.go:117] "RemoveContainer" containerID="d7ecaa0f879577606667ec6d103914507b4b06cfec60b28fc41f33027ad79006" Mar 11 13:26:23 crc kubenswrapper[4816]: I0311 13:26:23.742929 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5ab741b-37be-41a8-ac90-39c44e1c3cce-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 13:26:23 crc kubenswrapper[4816]: I0311 13:26:23.749458 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-h5lmg"] Mar 11 13:26:23 crc kubenswrapper[4816]: I0311 13:26:23.774316 4816 scope.go:117] "RemoveContainer" 
containerID="1c78198eaa5ef01dd82d28c6d1f5ea83a3d3649257a14cd5bfe7c73ebd2c7efd" Mar 11 13:26:23 crc kubenswrapper[4816]: E0311 13:26:23.775929 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c78198eaa5ef01dd82d28c6d1f5ea83a3d3649257a14cd5bfe7c73ebd2c7efd\": container with ID starting with 1c78198eaa5ef01dd82d28c6d1f5ea83a3d3649257a14cd5bfe7c73ebd2c7efd not found: ID does not exist" containerID="1c78198eaa5ef01dd82d28c6d1f5ea83a3d3649257a14cd5bfe7c73ebd2c7efd" Mar 11 13:26:23 crc kubenswrapper[4816]: I0311 13:26:23.776016 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c78198eaa5ef01dd82d28c6d1f5ea83a3d3649257a14cd5bfe7c73ebd2c7efd"} err="failed to get container status \"1c78198eaa5ef01dd82d28c6d1f5ea83a3d3649257a14cd5bfe7c73ebd2c7efd\": rpc error: code = NotFound desc = could not find container \"1c78198eaa5ef01dd82d28c6d1f5ea83a3d3649257a14cd5bfe7c73ebd2c7efd\": container with ID starting with 1c78198eaa5ef01dd82d28c6d1f5ea83a3d3649257a14cd5bfe7c73ebd2c7efd not found: ID does not exist" Mar 11 13:26:23 crc kubenswrapper[4816]: I0311 13:26:23.776062 4816 scope.go:117] "RemoveContainer" containerID="4ee85a7d424d8c2112523f9f25b5234e252e3cb4c06332ee42ac3e00fbed9378" Mar 11 13:26:23 crc kubenswrapper[4816]: E0311 13:26:23.776659 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ee85a7d424d8c2112523f9f25b5234e252e3cb4c06332ee42ac3e00fbed9378\": container with ID starting with 4ee85a7d424d8c2112523f9f25b5234e252e3cb4c06332ee42ac3e00fbed9378 not found: ID does not exist" containerID="4ee85a7d424d8c2112523f9f25b5234e252e3cb4c06332ee42ac3e00fbed9378" Mar 11 13:26:23 crc kubenswrapper[4816]: I0311 13:26:23.776720 4816 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4ee85a7d424d8c2112523f9f25b5234e252e3cb4c06332ee42ac3e00fbed9378"} err="failed to get container status \"4ee85a7d424d8c2112523f9f25b5234e252e3cb4c06332ee42ac3e00fbed9378\": rpc error: code = NotFound desc = could not find container \"4ee85a7d424d8c2112523f9f25b5234e252e3cb4c06332ee42ac3e00fbed9378\": container with ID starting with 4ee85a7d424d8c2112523f9f25b5234e252e3cb4c06332ee42ac3e00fbed9378 not found: ID does not exist" Mar 11 13:26:23 crc kubenswrapper[4816]: I0311 13:26:23.776771 4816 scope.go:117] "RemoveContainer" containerID="d7ecaa0f879577606667ec6d103914507b4b06cfec60b28fc41f33027ad79006" Mar 11 13:26:23 crc kubenswrapper[4816]: E0311 13:26:23.777308 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7ecaa0f879577606667ec6d103914507b4b06cfec60b28fc41f33027ad79006\": container with ID starting with d7ecaa0f879577606667ec6d103914507b4b06cfec60b28fc41f33027ad79006 not found: ID does not exist" containerID="d7ecaa0f879577606667ec6d103914507b4b06cfec60b28fc41f33027ad79006" Mar 11 13:26:23 crc kubenswrapper[4816]: I0311 13:26:23.777409 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7ecaa0f879577606667ec6d103914507b4b06cfec60b28fc41f33027ad79006"} err="failed to get container status \"d7ecaa0f879577606667ec6d103914507b4b06cfec60b28fc41f33027ad79006\": rpc error: code = NotFound desc = could not find container \"d7ecaa0f879577606667ec6d103914507b4b06cfec60b28fc41f33027ad79006\": container with ID starting with d7ecaa0f879577606667ec6d103914507b4b06cfec60b28fc41f33027ad79006 not found: ID does not exist" Mar 11 13:26:24 crc kubenswrapper[4816]: I0311 13:26:24.149780 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5ab741b-37be-41a8-ac90-39c44e1c3cce" path="/var/lib/kubelet/pods/f5ab741b-37be-41a8-ac90-39c44e1c3cce/volumes" Mar 11 13:26:30 crc kubenswrapper[4816]: I0311 
13:26:30.130773 4816 scope.go:117] "RemoveContainer" containerID="1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911" Mar 11 13:26:30 crc kubenswrapper[4816]: E0311 13:26:30.131424 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:26:36 crc kubenswrapper[4816]: I0311 13:26:36.702908 4816 generic.go:334] "Generic (PLEG): container finished" podID="240c7704-66e9-4d5b-9b4f-cf8a80365c26" containerID="275f5d5dbbc9a09455fcf3424925e09fcf25e8e9a31a9d4c2991fb94c2921996" exitCode=0 Mar 11 13:26:36 crc kubenswrapper[4816]: I0311 13:26:36.703079 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rbcl7/must-gather-k8xh5" event={"ID":"240c7704-66e9-4d5b-9b4f-cf8a80365c26","Type":"ContainerDied","Data":"275f5d5dbbc9a09455fcf3424925e09fcf25e8e9a31a9d4c2991fb94c2921996"} Mar 11 13:26:36 crc kubenswrapper[4816]: I0311 13:26:36.704282 4816 scope.go:117] "RemoveContainer" containerID="275f5d5dbbc9a09455fcf3424925e09fcf25e8e9a31a9d4c2991fb94c2921996" Mar 11 13:26:37 crc kubenswrapper[4816]: I0311 13:26:37.421077 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rbcl7_must-gather-k8xh5_240c7704-66e9-4d5b-9b4f-cf8a80365c26/gather/0.log" Mar 11 13:26:39 crc kubenswrapper[4816]: I0311 13:26:39.265108 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jq5z6"] Mar 11 13:26:39 crc kubenswrapper[4816]: E0311 13:26:39.266343 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5ab741b-37be-41a8-ac90-39c44e1c3cce" containerName="registry-server" Mar 11 13:26:39 crc 
kubenswrapper[4816]: I0311 13:26:39.266360 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5ab741b-37be-41a8-ac90-39c44e1c3cce" containerName="registry-server" Mar 11 13:26:39 crc kubenswrapper[4816]: E0311 13:26:39.266372 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5ab741b-37be-41a8-ac90-39c44e1c3cce" containerName="extract-content" Mar 11 13:26:39 crc kubenswrapper[4816]: I0311 13:26:39.266378 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5ab741b-37be-41a8-ac90-39c44e1c3cce" containerName="extract-content" Mar 11 13:26:39 crc kubenswrapper[4816]: E0311 13:26:39.266406 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5ab741b-37be-41a8-ac90-39c44e1c3cce" containerName="extract-utilities" Mar 11 13:26:39 crc kubenswrapper[4816]: I0311 13:26:39.266413 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5ab741b-37be-41a8-ac90-39c44e1c3cce" containerName="extract-utilities" Mar 11 13:26:39 crc kubenswrapper[4816]: I0311 13:26:39.266575 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5ab741b-37be-41a8-ac90-39c44e1c3cce" containerName="registry-server" Mar 11 13:26:39 crc kubenswrapper[4816]: I0311 13:26:39.267872 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jq5z6" Mar 11 13:26:39 crc kubenswrapper[4816]: I0311 13:26:39.279586 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jq5z6"] Mar 11 13:26:39 crc kubenswrapper[4816]: I0311 13:26:39.425375 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfv5l\" (UniqueName: \"kubernetes.io/projected/34d81fa0-710a-4fdd-b98b-bd88b80a7343-kube-api-access-bfv5l\") pod \"redhat-operators-jq5z6\" (UID: \"34d81fa0-710a-4fdd-b98b-bd88b80a7343\") " pod="openshift-marketplace/redhat-operators-jq5z6" Mar 11 13:26:39 crc kubenswrapper[4816]: I0311 13:26:39.425489 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34d81fa0-710a-4fdd-b98b-bd88b80a7343-catalog-content\") pod \"redhat-operators-jq5z6\" (UID: \"34d81fa0-710a-4fdd-b98b-bd88b80a7343\") " pod="openshift-marketplace/redhat-operators-jq5z6" Mar 11 13:26:39 crc kubenswrapper[4816]: I0311 13:26:39.425803 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34d81fa0-710a-4fdd-b98b-bd88b80a7343-utilities\") pod \"redhat-operators-jq5z6\" (UID: \"34d81fa0-710a-4fdd-b98b-bd88b80a7343\") " pod="openshift-marketplace/redhat-operators-jq5z6" Mar 11 13:26:39 crc kubenswrapper[4816]: I0311 13:26:39.527217 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfv5l\" (UniqueName: \"kubernetes.io/projected/34d81fa0-710a-4fdd-b98b-bd88b80a7343-kube-api-access-bfv5l\") pod \"redhat-operators-jq5z6\" (UID: \"34d81fa0-710a-4fdd-b98b-bd88b80a7343\") " pod="openshift-marketplace/redhat-operators-jq5z6" Mar 11 13:26:39 crc kubenswrapper[4816]: I0311 13:26:39.527309 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34d81fa0-710a-4fdd-b98b-bd88b80a7343-catalog-content\") pod \"redhat-operators-jq5z6\" (UID: \"34d81fa0-710a-4fdd-b98b-bd88b80a7343\") " pod="openshift-marketplace/redhat-operators-jq5z6" Mar 11 13:26:39 crc kubenswrapper[4816]: I0311 13:26:39.527377 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34d81fa0-710a-4fdd-b98b-bd88b80a7343-utilities\") pod \"redhat-operators-jq5z6\" (UID: \"34d81fa0-710a-4fdd-b98b-bd88b80a7343\") " pod="openshift-marketplace/redhat-operators-jq5z6" Mar 11 13:26:39 crc kubenswrapper[4816]: I0311 13:26:39.527794 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34d81fa0-710a-4fdd-b98b-bd88b80a7343-utilities\") pod \"redhat-operators-jq5z6\" (UID: \"34d81fa0-710a-4fdd-b98b-bd88b80a7343\") " pod="openshift-marketplace/redhat-operators-jq5z6" Mar 11 13:26:39 crc kubenswrapper[4816]: I0311 13:26:39.527875 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34d81fa0-710a-4fdd-b98b-bd88b80a7343-catalog-content\") pod \"redhat-operators-jq5z6\" (UID: \"34d81fa0-710a-4fdd-b98b-bd88b80a7343\") " pod="openshift-marketplace/redhat-operators-jq5z6" Mar 11 13:26:39 crc kubenswrapper[4816]: I0311 13:26:39.563054 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfv5l\" (UniqueName: \"kubernetes.io/projected/34d81fa0-710a-4fdd-b98b-bd88b80a7343-kube-api-access-bfv5l\") pod \"redhat-operators-jq5z6\" (UID: \"34d81fa0-710a-4fdd-b98b-bd88b80a7343\") " pod="openshift-marketplace/redhat-operators-jq5z6" Mar 11 13:26:39 crc kubenswrapper[4816]: I0311 13:26:39.592660 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jq5z6" Mar 11 13:26:39 crc kubenswrapper[4816]: I0311 13:26:39.869859 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jq5z6"] Mar 11 13:26:40 crc kubenswrapper[4816]: I0311 13:26:40.742433 4816 generic.go:334] "Generic (PLEG): container finished" podID="34d81fa0-710a-4fdd-b98b-bd88b80a7343" containerID="5fa6520a596bfb3b937463bb486971345dd932a0c909ffeb49e0c2fe56ac3e92" exitCode=0 Mar 11 13:26:40 crc kubenswrapper[4816]: I0311 13:26:40.742545 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jq5z6" event={"ID":"34d81fa0-710a-4fdd-b98b-bd88b80a7343","Type":"ContainerDied","Data":"5fa6520a596bfb3b937463bb486971345dd932a0c909ffeb49e0c2fe56ac3e92"} Mar 11 13:26:40 crc kubenswrapper[4816]: I0311 13:26:40.742677 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jq5z6" event={"ID":"34d81fa0-710a-4fdd-b98b-bd88b80a7343","Type":"ContainerStarted","Data":"b5442a18316c3fb9c79391159649faad4dab8cdd84c8cce704995af04a204fca"} Mar 11 13:26:41 crc kubenswrapper[4816]: I0311 13:26:41.754872 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jq5z6" event={"ID":"34d81fa0-710a-4fdd-b98b-bd88b80a7343","Type":"ContainerStarted","Data":"22ca4a09876d0901416bc035c5e08aae69a625ba3ccffe89562409bcb4433573"} Mar 11 13:26:42 crc kubenswrapper[4816]: I0311 13:26:42.768296 4816 generic.go:334] "Generic (PLEG): container finished" podID="34d81fa0-710a-4fdd-b98b-bd88b80a7343" containerID="22ca4a09876d0901416bc035c5e08aae69a625ba3ccffe89562409bcb4433573" exitCode=0 Mar 11 13:26:42 crc kubenswrapper[4816]: I0311 13:26:42.768593 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jq5z6" 
event={"ID":"34d81fa0-710a-4fdd-b98b-bd88b80a7343","Type":"ContainerDied","Data":"22ca4a09876d0901416bc035c5e08aae69a625ba3ccffe89562409bcb4433573"} Mar 11 13:26:43 crc kubenswrapper[4816]: I0311 13:26:43.785506 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jq5z6" event={"ID":"34d81fa0-710a-4fdd-b98b-bd88b80a7343","Type":"ContainerStarted","Data":"5fe5120c5eb97d31c12a15c7394050040b0f13cf69682c7d15d1dfaccbafc5a3"} Mar 11 13:26:43 crc kubenswrapper[4816]: I0311 13:26:43.815988 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jq5z6" podStartSLOduration=2.306411344 podStartE2EDuration="4.815965128s" podCreationTimestamp="2026-03-11 13:26:39 +0000 UTC" firstStartedPulling="2026-03-11 13:26:40.74404604 +0000 UTC m=+5287.335310007" lastFinishedPulling="2026-03-11 13:26:43.253599814 +0000 UTC m=+5289.844863791" observedRunningTime="2026-03-11 13:26:43.810951907 +0000 UTC m=+5290.402215944" watchObservedRunningTime="2026-03-11 13:26:43.815965128 +0000 UTC m=+5290.407229115" Mar 11 13:26:44 crc kubenswrapper[4816]: I0311 13:26:44.135654 4816 scope.go:117] "RemoveContainer" containerID="1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911" Mar 11 13:26:44 crc kubenswrapper[4816]: E0311 13:26:44.136102 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:26:45 crc kubenswrapper[4816]: I0311 13:26:45.526617 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-rbcl7/must-gather-k8xh5"] Mar 11 13:26:45 crc kubenswrapper[4816]: I0311 
13:26:45.527604 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-rbcl7/must-gather-k8xh5" podUID="240c7704-66e9-4d5b-9b4f-cf8a80365c26" containerName="copy" containerID="cri-o://54930770a8edcb6e0930ad6af1934aac20a12f0e3525f98e64859acac70909c5" gracePeriod=2
Mar 11 13:26:45 crc kubenswrapper[4816]: I0311 13:26:45.535315 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-rbcl7/must-gather-k8xh5"]
Mar 11 13:26:45 crc kubenswrapper[4816]: I0311 13:26:45.813861 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rbcl7_must-gather-k8xh5_240c7704-66e9-4d5b-9b4f-cf8a80365c26/copy/0.log"
Mar 11 13:26:45 crc kubenswrapper[4816]: I0311 13:26:45.814549 4816 generic.go:334] "Generic (PLEG): container finished" podID="240c7704-66e9-4d5b-9b4f-cf8a80365c26" containerID="54930770a8edcb6e0930ad6af1934aac20a12f0e3525f98e64859acac70909c5" exitCode=143
Mar 11 13:26:45 crc kubenswrapper[4816]: I0311 13:26:45.972853 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rbcl7_must-gather-k8xh5_240c7704-66e9-4d5b-9b4f-cf8a80365c26/copy/0.log"
Mar 11 13:26:45 crc kubenswrapper[4816]: I0311 13:26:45.973390 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rbcl7/must-gather-k8xh5"
Mar 11 13:26:46 crc kubenswrapper[4816]: I0311 13:26:46.122318 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/240c7704-66e9-4d5b-9b4f-cf8a80365c26-must-gather-output\") pod \"240c7704-66e9-4d5b-9b4f-cf8a80365c26\" (UID: \"240c7704-66e9-4d5b-9b4f-cf8a80365c26\") "
Mar 11 13:26:46 crc kubenswrapper[4816]: I0311 13:26:46.122404 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4wc7\" (UniqueName: \"kubernetes.io/projected/240c7704-66e9-4d5b-9b4f-cf8a80365c26-kube-api-access-g4wc7\") pod \"240c7704-66e9-4d5b-9b4f-cf8a80365c26\" (UID: \"240c7704-66e9-4d5b-9b4f-cf8a80365c26\") "
Mar 11 13:26:46 crc kubenswrapper[4816]: I0311 13:26:46.143217 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/240c7704-66e9-4d5b-9b4f-cf8a80365c26-kube-api-access-g4wc7" (OuterVolumeSpecName: "kube-api-access-g4wc7") pod "240c7704-66e9-4d5b-9b4f-cf8a80365c26" (UID: "240c7704-66e9-4d5b-9b4f-cf8a80365c26"). InnerVolumeSpecName "kube-api-access-g4wc7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 13:26:46 crc kubenswrapper[4816]: I0311 13:26:46.224620 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4wc7\" (UniqueName: \"kubernetes.io/projected/240c7704-66e9-4d5b-9b4f-cf8a80365c26-kube-api-access-g4wc7\") on node \"crc\" DevicePath \"\""
Mar 11 13:26:46 crc kubenswrapper[4816]: I0311 13:26:46.231158 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/240c7704-66e9-4d5b-9b4f-cf8a80365c26-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "240c7704-66e9-4d5b-9b4f-cf8a80365c26" (UID: "240c7704-66e9-4d5b-9b4f-cf8a80365c26"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 13:26:46 crc kubenswrapper[4816]: I0311 13:26:46.326076 4816 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/240c7704-66e9-4d5b-9b4f-cf8a80365c26-must-gather-output\") on node \"crc\" DevicePath \"\""
Mar 11 13:26:46 crc kubenswrapper[4816]: I0311 13:26:46.825055 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rbcl7_must-gather-k8xh5_240c7704-66e9-4d5b-9b4f-cf8a80365c26/copy/0.log"
Mar 11 13:26:46 crc kubenswrapper[4816]: I0311 13:26:46.826418 4816 scope.go:117] "RemoveContainer" containerID="54930770a8edcb6e0930ad6af1934aac20a12f0e3525f98e64859acac70909c5"
Mar 11 13:26:46 crc kubenswrapper[4816]: I0311 13:26:46.826553 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rbcl7/must-gather-k8xh5"
Mar 11 13:26:46 crc kubenswrapper[4816]: I0311 13:26:46.843954 4816 scope.go:117] "RemoveContainer" containerID="275f5d5dbbc9a09455fcf3424925e09fcf25e8e9a31a9d4c2991fb94c2921996"
Mar 11 13:26:48 crc kubenswrapper[4816]: I0311 13:26:48.141001 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="240c7704-66e9-4d5b-9b4f-cf8a80365c26" path="/var/lib/kubelet/pods/240c7704-66e9-4d5b-9b4f-cf8a80365c26/volumes"
Mar 11 13:26:49 crc kubenswrapper[4816]: I0311 13:26:49.592935 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jq5z6"
Mar 11 13:26:49 crc kubenswrapper[4816]: I0311 13:26:49.593491 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jq5z6"
Mar 11 13:26:50 crc kubenswrapper[4816]: I0311 13:26:50.640771 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jq5z6" podUID="34d81fa0-710a-4fdd-b98b-bd88b80a7343" containerName="registry-server" probeResult="failure" output=<
Mar 11 13:26:50 crc kubenswrapper[4816]: timeout: failed to connect service ":50051" within 1s
Mar 11 13:26:50 crc kubenswrapper[4816]: >
Mar 11 13:26:51 crc kubenswrapper[4816]: I0311 13:26:51.319384 4816 scope.go:117] "RemoveContainer" containerID="4a3cade9d3e8a7bb5a9e71032c96de36c1178b4f7d16ddbd543510f67b6be155"
Mar 11 13:26:55 crc kubenswrapper[4816]: I0311 13:26:55.130799 4816 scope.go:117] "RemoveContainer" containerID="1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911"
Mar 11 13:26:55 crc kubenswrapper[4816]: E0311 13:26:55.131704 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a"
Mar 11 13:26:59 crc kubenswrapper[4816]: I0311 13:26:59.656580 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jq5z6"
Mar 11 13:26:59 crc kubenswrapper[4816]: I0311 13:26:59.731376 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jq5z6"
Mar 11 13:26:59 crc kubenswrapper[4816]: I0311 13:26:59.900100 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jq5z6"]
Mar 11 13:27:00 crc kubenswrapper[4816]: I0311 13:27:00.954200 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jq5z6" podUID="34d81fa0-710a-4fdd-b98b-bd88b80a7343" containerName="registry-server" containerID="cri-o://5fe5120c5eb97d31c12a15c7394050040b0f13cf69682c7d15d1dfaccbafc5a3" gracePeriod=2
Mar 11 13:27:01 crc kubenswrapper[4816]: I0311 13:27:01.385288 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jq5z6"
Mar 11 13:27:01 crc kubenswrapper[4816]: I0311 13:27:01.560437 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34d81fa0-710a-4fdd-b98b-bd88b80a7343-catalog-content\") pod \"34d81fa0-710a-4fdd-b98b-bd88b80a7343\" (UID: \"34d81fa0-710a-4fdd-b98b-bd88b80a7343\") "
Mar 11 13:27:01 crc kubenswrapper[4816]: I0311 13:27:01.560527 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfv5l\" (UniqueName: \"kubernetes.io/projected/34d81fa0-710a-4fdd-b98b-bd88b80a7343-kube-api-access-bfv5l\") pod \"34d81fa0-710a-4fdd-b98b-bd88b80a7343\" (UID: \"34d81fa0-710a-4fdd-b98b-bd88b80a7343\") "
Mar 11 13:27:01 crc kubenswrapper[4816]: I0311 13:27:01.560661 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34d81fa0-710a-4fdd-b98b-bd88b80a7343-utilities\") pod \"34d81fa0-710a-4fdd-b98b-bd88b80a7343\" (UID: \"34d81fa0-710a-4fdd-b98b-bd88b80a7343\") "
Mar 11 13:27:01 crc kubenswrapper[4816]: I0311 13:27:01.561959 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34d81fa0-710a-4fdd-b98b-bd88b80a7343-utilities" (OuterVolumeSpecName: "utilities") pod "34d81fa0-710a-4fdd-b98b-bd88b80a7343" (UID: "34d81fa0-710a-4fdd-b98b-bd88b80a7343"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 13:27:01 crc kubenswrapper[4816]: I0311 13:27:01.569157 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34d81fa0-710a-4fdd-b98b-bd88b80a7343-kube-api-access-bfv5l" (OuterVolumeSpecName: "kube-api-access-bfv5l") pod "34d81fa0-710a-4fdd-b98b-bd88b80a7343" (UID: "34d81fa0-710a-4fdd-b98b-bd88b80a7343"). InnerVolumeSpecName "kube-api-access-bfv5l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 13:27:01 crc kubenswrapper[4816]: I0311 13:27:01.662097 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34d81fa0-710a-4fdd-b98b-bd88b80a7343-utilities\") on node \"crc\" DevicePath \"\""
Mar 11 13:27:01 crc kubenswrapper[4816]: I0311 13:27:01.662141 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfv5l\" (UniqueName: \"kubernetes.io/projected/34d81fa0-710a-4fdd-b98b-bd88b80a7343-kube-api-access-bfv5l\") on node \"crc\" DevicePath \"\""
Mar 11 13:27:01 crc kubenswrapper[4816]: I0311 13:27:01.727122 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34d81fa0-710a-4fdd-b98b-bd88b80a7343-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "34d81fa0-710a-4fdd-b98b-bd88b80a7343" (UID: "34d81fa0-710a-4fdd-b98b-bd88b80a7343"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 13:27:01 crc kubenswrapper[4816]: I0311 13:27:01.763754 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34d81fa0-710a-4fdd-b98b-bd88b80a7343-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 11 13:27:01 crc kubenswrapper[4816]: I0311 13:27:01.966669 4816 generic.go:334] "Generic (PLEG): container finished" podID="34d81fa0-710a-4fdd-b98b-bd88b80a7343" containerID="5fe5120c5eb97d31c12a15c7394050040b0f13cf69682c7d15d1dfaccbafc5a3" exitCode=0
Mar 11 13:27:01 crc kubenswrapper[4816]: I0311 13:27:01.966737 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jq5z6" event={"ID":"34d81fa0-710a-4fdd-b98b-bd88b80a7343","Type":"ContainerDied","Data":"5fe5120c5eb97d31c12a15c7394050040b0f13cf69682c7d15d1dfaccbafc5a3"}
Mar 11 13:27:01 crc kubenswrapper[4816]: I0311 13:27:01.966782 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jq5z6"
Mar 11 13:27:01 crc kubenswrapper[4816]: I0311 13:27:01.966806 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jq5z6" event={"ID":"34d81fa0-710a-4fdd-b98b-bd88b80a7343","Type":"ContainerDied","Data":"b5442a18316c3fb9c79391159649faad4dab8cdd84c8cce704995af04a204fca"}
Mar 11 13:27:01 crc kubenswrapper[4816]: I0311 13:27:01.966830 4816 scope.go:117] "RemoveContainer" containerID="5fe5120c5eb97d31c12a15c7394050040b0f13cf69682c7d15d1dfaccbafc5a3"
Mar 11 13:27:01 crc kubenswrapper[4816]: I0311 13:27:01.986659 4816 scope.go:117] "RemoveContainer" containerID="22ca4a09876d0901416bc035c5e08aae69a625ba3ccffe89562409bcb4433573"
Mar 11 13:27:02 crc kubenswrapper[4816]: I0311 13:27:02.018901 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jq5z6"]
Mar 11 13:27:02 crc kubenswrapper[4816]: I0311 13:27:02.021482 4816 scope.go:117] "RemoveContainer" containerID="5fa6520a596bfb3b937463bb486971345dd932a0c909ffeb49e0c2fe56ac3e92"
Mar 11 13:27:02 crc kubenswrapper[4816]: I0311 13:27:02.026765 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jq5z6"]
Mar 11 13:27:02 crc kubenswrapper[4816]: I0311 13:27:02.042610 4816 scope.go:117] "RemoveContainer" containerID="5fe5120c5eb97d31c12a15c7394050040b0f13cf69682c7d15d1dfaccbafc5a3"
Mar 11 13:27:02 crc kubenswrapper[4816]: E0311 13:27:02.042950 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fe5120c5eb97d31c12a15c7394050040b0f13cf69682c7d15d1dfaccbafc5a3\": container with ID starting with 5fe5120c5eb97d31c12a15c7394050040b0f13cf69682c7d15d1dfaccbafc5a3 not found: ID does not exist" containerID="5fe5120c5eb97d31c12a15c7394050040b0f13cf69682c7d15d1dfaccbafc5a3"
Mar 11 13:27:02 crc kubenswrapper[4816]: I0311 13:27:02.042980 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fe5120c5eb97d31c12a15c7394050040b0f13cf69682c7d15d1dfaccbafc5a3"} err="failed to get container status \"5fe5120c5eb97d31c12a15c7394050040b0f13cf69682c7d15d1dfaccbafc5a3\": rpc error: code = NotFound desc = could not find container \"5fe5120c5eb97d31c12a15c7394050040b0f13cf69682c7d15d1dfaccbafc5a3\": container with ID starting with 5fe5120c5eb97d31c12a15c7394050040b0f13cf69682c7d15d1dfaccbafc5a3 not found: ID does not exist"
Mar 11 13:27:02 crc kubenswrapper[4816]: I0311 13:27:02.043001 4816 scope.go:117] "RemoveContainer" containerID="22ca4a09876d0901416bc035c5e08aae69a625ba3ccffe89562409bcb4433573"
Mar 11 13:27:02 crc kubenswrapper[4816]: E0311 13:27:02.043366 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22ca4a09876d0901416bc035c5e08aae69a625ba3ccffe89562409bcb4433573\": container with ID starting with 22ca4a09876d0901416bc035c5e08aae69a625ba3ccffe89562409bcb4433573 not found: ID does not exist" containerID="22ca4a09876d0901416bc035c5e08aae69a625ba3ccffe89562409bcb4433573"
Mar 11 13:27:02 crc kubenswrapper[4816]: I0311 13:27:02.043388 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22ca4a09876d0901416bc035c5e08aae69a625ba3ccffe89562409bcb4433573"} err="failed to get container status \"22ca4a09876d0901416bc035c5e08aae69a625ba3ccffe89562409bcb4433573\": rpc error: code = NotFound desc = could not find container \"22ca4a09876d0901416bc035c5e08aae69a625ba3ccffe89562409bcb4433573\": container with ID starting with 22ca4a09876d0901416bc035c5e08aae69a625ba3ccffe89562409bcb4433573 not found: ID does not exist"
Mar 11 13:27:02 crc kubenswrapper[4816]: I0311 13:27:02.043401 4816 scope.go:117] "RemoveContainer" containerID="5fa6520a596bfb3b937463bb486971345dd932a0c909ffeb49e0c2fe56ac3e92"
Mar 11 13:27:02 crc kubenswrapper[4816]: E0311 13:27:02.043778 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fa6520a596bfb3b937463bb486971345dd932a0c909ffeb49e0c2fe56ac3e92\": container with ID starting with 5fa6520a596bfb3b937463bb486971345dd932a0c909ffeb49e0c2fe56ac3e92 not found: ID does not exist" containerID="5fa6520a596bfb3b937463bb486971345dd932a0c909ffeb49e0c2fe56ac3e92"
Mar 11 13:27:02 crc kubenswrapper[4816]: I0311 13:27:02.043799 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fa6520a596bfb3b937463bb486971345dd932a0c909ffeb49e0c2fe56ac3e92"} err="failed to get container status \"5fa6520a596bfb3b937463bb486971345dd932a0c909ffeb49e0c2fe56ac3e92\": rpc error: code = NotFound desc = could not find container \"5fa6520a596bfb3b937463bb486971345dd932a0c909ffeb49e0c2fe56ac3e92\": container with ID starting with 5fa6520a596bfb3b937463bb486971345dd932a0c909ffeb49e0c2fe56ac3e92 not found: ID does not exist"
Mar 11 13:27:02 crc kubenswrapper[4816]: I0311 13:27:02.141421 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34d81fa0-710a-4fdd-b98b-bd88b80a7343" path="/var/lib/kubelet/pods/34d81fa0-710a-4fdd-b98b-bd88b80a7343/volumes"
Mar 11 13:27:10 crc kubenswrapper[4816]: I0311 13:27:10.135418 4816 scope.go:117] "RemoveContainer" containerID="1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911"
Mar 11 13:27:10 crc kubenswrapper[4816]: E0311 13:27:10.136375 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a"
Mar 11 13:27:23 crc kubenswrapper[4816]: I0311 13:27:23.130946 4816 scope.go:117] "RemoveContainer" containerID="1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911"
Mar 11 13:27:23 crc kubenswrapper[4816]: E0311 13:27:23.131852 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a"
Mar 11 13:27:36 crc kubenswrapper[4816]: I0311 13:27:36.130433 4816 scope.go:117] "RemoveContainer" containerID="1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911"
Mar 11 13:27:36 crc kubenswrapper[4816]: E0311 13:27:36.131338 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a"
Mar 11 13:27:49 crc kubenswrapper[4816]: I0311 13:27:49.130379 4816 scope.go:117] "RemoveContainer" containerID="1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911"
Mar 11 13:27:49 crc kubenswrapper[4816]: E0311 13:27:49.131296 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a"
Mar 11 13:28:00 crc kubenswrapper[4816]: I0311 13:28:00.156854 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553928-j48rh"]
Mar 11 13:28:00 crc kubenswrapper[4816]: E0311 13:28:00.157628 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34d81fa0-710a-4fdd-b98b-bd88b80a7343" containerName="extract-content"
Mar 11 13:28:00 crc kubenswrapper[4816]: I0311 13:28:00.157640 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="34d81fa0-710a-4fdd-b98b-bd88b80a7343" containerName="extract-content"
Mar 11 13:28:00 crc kubenswrapper[4816]: E0311 13:28:00.157651 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="240c7704-66e9-4d5b-9b4f-cf8a80365c26" containerName="copy"
Mar 11 13:28:00 crc kubenswrapper[4816]: I0311 13:28:00.157657 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="240c7704-66e9-4d5b-9b4f-cf8a80365c26" containerName="copy"
Mar 11 13:28:00 crc kubenswrapper[4816]: E0311 13:28:00.157670 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34d81fa0-710a-4fdd-b98b-bd88b80a7343" containerName="registry-server"
Mar 11 13:28:00 crc kubenswrapper[4816]: I0311 13:28:00.157676 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="34d81fa0-710a-4fdd-b98b-bd88b80a7343" containerName="registry-server"
Mar 11 13:28:00 crc kubenswrapper[4816]: E0311 13:28:00.157685 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="240c7704-66e9-4d5b-9b4f-cf8a80365c26" containerName="gather"
Mar 11 13:28:00 crc kubenswrapper[4816]: I0311 13:28:00.157691 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="240c7704-66e9-4d5b-9b4f-cf8a80365c26" containerName="gather"
Mar 11 13:28:00 crc kubenswrapper[4816]: E0311 13:28:00.157702 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34d81fa0-710a-4fdd-b98b-bd88b80a7343" containerName="extract-utilities"
Mar 11 13:28:00 crc kubenswrapper[4816]: I0311 13:28:00.157707 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="34d81fa0-710a-4fdd-b98b-bd88b80a7343" containerName="extract-utilities"
Mar 11 13:28:00 crc kubenswrapper[4816]: I0311 13:28:00.157863 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="240c7704-66e9-4d5b-9b4f-cf8a80365c26" containerName="copy"
Mar 11 13:28:00 crc kubenswrapper[4816]: I0311 13:28:00.157879 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="240c7704-66e9-4d5b-9b4f-cf8a80365c26" containerName="gather"
Mar 11 13:28:00 crc kubenswrapper[4816]: I0311 13:28:00.157890 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="34d81fa0-710a-4fdd-b98b-bd88b80a7343" containerName="registry-server"
Mar 11 13:28:00 crc kubenswrapper[4816]: I0311 13:28:00.158297 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553928-j48rh"
Mar 11 13:28:00 crc kubenswrapper[4816]: I0311 13:28:00.161084 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 11 13:28:00 crc kubenswrapper[4816]: I0311 13:28:00.161301 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h"
Mar 11 13:28:00 crc kubenswrapper[4816]: I0311 13:28:00.161369 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 11 13:28:00 crc kubenswrapper[4816]: I0311 13:28:00.183079 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553928-j48rh"]
Mar 11 13:28:00 crc kubenswrapper[4816]: I0311 13:28:00.294440 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdhc6\" (UniqueName: \"kubernetes.io/projected/31fec07e-a834-4a80-9534-cfa4b1939ffc-kube-api-access-rdhc6\") pod \"auto-csr-approver-29553928-j48rh\" (UID: \"31fec07e-a834-4a80-9534-cfa4b1939ffc\") " pod="openshift-infra/auto-csr-approver-29553928-j48rh"
Mar 11 13:28:00 crc kubenswrapper[4816]: I0311 13:28:00.395762 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdhc6\" (UniqueName: \"kubernetes.io/projected/31fec07e-a834-4a80-9534-cfa4b1939ffc-kube-api-access-rdhc6\") pod \"auto-csr-approver-29553928-j48rh\" (UID: \"31fec07e-a834-4a80-9534-cfa4b1939ffc\") " pod="openshift-infra/auto-csr-approver-29553928-j48rh"
Mar 11 13:28:00 crc kubenswrapper[4816]: I0311 13:28:00.429286 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdhc6\" (UniqueName: \"kubernetes.io/projected/31fec07e-a834-4a80-9534-cfa4b1939ffc-kube-api-access-rdhc6\") pod \"auto-csr-approver-29553928-j48rh\" (UID: \"31fec07e-a834-4a80-9534-cfa4b1939ffc\") " pod="openshift-infra/auto-csr-approver-29553928-j48rh"
Mar 11 13:28:00 crc kubenswrapper[4816]: I0311 13:28:00.486872 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553928-j48rh"
Mar 11 13:28:00 crc kubenswrapper[4816]: I0311 13:28:00.816078 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553928-j48rh"]
Mar 11 13:28:00 crc kubenswrapper[4816]: I0311 13:28:00.824171 4816 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 11 13:28:01 crc kubenswrapper[4816]: I0311 13:28:01.534295 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553928-j48rh" event={"ID":"31fec07e-a834-4a80-9534-cfa4b1939ffc","Type":"ContainerStarted","Data":"e22f86625762b25fb3daad30bead92f87f9412d99c562ef147e5e5a37a8d3809"}
Mar 11 13:28:02 crc kubenswrapper[4816]: I0311 13:28:02.541229 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553928-j48rh" event={"ID":"31fec07e-a834-4a80-9534-cfa4b1939ffc","Type":"ContainerStarted","Data":"2d431be4a15d84bda7a012602744dbc22889b621168f3e90771a1c976b8807e5"}
Mar 11 13:28:02 crc kubenswrapper[4816]: I0311 13:28:02.556774 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553928-j48rh" podStartSLOduration=1.299542255 podStartE2EDuration="2.556756155s" podCreationTimestamp="2026-03-11 13:28:00 +0000 UTC" firstStartedPulling="2026-03-11 13:28:00.823795273 +0000 UTC m=+5367.415059230" lastFinishedPulling="2026-03-11 13:28:02.081009133 +0000 UTC m=+5368.672273130" observedRunningTime="2026-03-11 13:28:02.554692817 +0000 UTC m=+5369.145956784" watchObservedRunningTime="2026-03-11 13:28:02.556756155 +0000 UTC m=+5369.148020122"
Mar 11 13:28:03 crc kubenswrapper[4816]: I0311 13:28:03.551841 4816 generic.go:334] "Generic (PLEG): container finished" podID="31fec07e-a834-4a80-9534-cfa4b1939ffc" containerID="2d431be4a15d84bda7a012602744dbc22889b621168f3e90771a1c976b8807e5" exitCode=0
Mar 11 13:28:03 crc kubenswrapper[4816]: I0311 13:28:03.551910 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553928-j48rh" event={"ID":"31fec07e-a834-4a80-9534-cfa4b1939ffc","Type":"ContainerDied","Data":"2d431be4a15d84bda7a012602744dbc22889b621168f3e90771a1c976b8807e5"}
Mar 11 13:28:04 crc kubenswrapper[4816]: I0311 13:28:04.139711 4816 scope.go:117] "RemoveContainer" containerID="1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911"
Mar 11 13:28:04 crc kubenswrapper[4816]: E0311 13:28:04.140415 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a"
Mar 11 13:28:04 crc kubenswrapper[4816]: I0311 13:28:04.948814 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553928-j48rh"
Mar 11 13:28:05 crc kubenswrapper[4816]: I0311 13:28:05.071733 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdhc6\" (UniqueName: \"kubernetes.io/projected/31fec07e-a834-4a80-9534-cfa4b1939ffc-kube-api-access-rdhc6\") pod \"31fec07e-a834-4a80-9534-cfa4b1939ffc\" (UID: \"31fec07e-a834-4a80-9534-cfa4b1939ffc\") "
Mar 11 13:28:05 crc kubenswrapper[4816]: I0311 13:28:05.081140 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31fec07e-a834-4a80-9534-cfa4b1939ffc-kube-api-access-rdhc6" (OuterVolumeSpecName: "kube-api-access-rdhc6") pod "31fec07e-a834-4a80-9534-cfa4b1939ffc" (UID: "31fec07e-a834-4a80-9534-cfa4b1939ffc"). InnerVolumeSpecName "kube-api-access-rdhc6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 13:28:05 crc kubenswrapper[4816]: I0311 13:28:05.174155 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdhc6\" (UniqueName: \"kubernetes.io/projected/31fec07e-a834-4a80-9534-cfa4b1939ffc-kube-api-access-rdhc6\") on node \"crc\" DevicePath \"\""
Mar 11 13:28:05 crc kubenswrapper[4816]: I0311 13:28:05.574161 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553928-j48rh" event={"ID":"31fec07e-a834-4a80-9534-cfa4b1939ffc","Type":"ContainerDied","Data":"e22f86625762b25fb3daad30bead92f87f9412d99c562ef147e5e5a37a8d3809"}
Mar 11 13:28:05 crc kubenswrapper[4816]: I0311 13:28:05.574230 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e22f86625762b25fb3daad30bead92f87f9412d99c562ef147e5e5a37a8d3809"
Mar 11 13:28:05 crc kubenswrapper[4816]: I0311 13:28:05.574336 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553928-j48rh"
Mar 11 13:28:05 crc kubenswrapper[4816]: I0311 13:28:05.659706 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553922-l2chb"]
Mar 11 13:28:05 crc kubenswrapper[4816]: I0311 13:28:05.666233 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553922-l2chb"]
Mar 11 13:28:06 crc kubenswrapper[4816]: I0311 13:28:06.145568 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a09c6fad-26f7-4ea2-84fc-5d2efb86fd02" path="/var/lib/kubelet/pods/a09c6fad-26f7-4ea2-84fc-5d2efb86fd02/volumes"
Mar 11 13:28:18 crc kubenswrapper[4816]: I0311 13:28:18.131000 4816 scope.go:117] "RemoveContainer" containerID="1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911"
Mar 11 13:28:18 crc kubenswrapper[4816]: E0311 13:28:18.131568 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a"
Mar 11 13:28:33 crc kubenswrapper[4816]: I0311 13:28:33.130613 4816 scope.go:117] "RemoveContainer" containerID="1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911"
Mar 11 13:28:33 crc kubenswrapper[4816]: E0311 13:28:33.131762 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a"